
Workshop 5: Proposal for a regulation laying down rules to prevent and combat child sexual abuse

Rapporteur: Francesco Vecchi
 

The panel analysed the EU Commission's proposal for a Regulation to Prevent and Combat Child Sexual Abuse. First, the proposal takes a regulatory approach that differs from the positions of civil society and NGOs. Something must clearly be done to protect children, and there is still much to discuss about the detection order and its massive impact on cybersecurity, but these discussions also need to consider other solutions.

Indeed, 59% of the child sexual abuse content removed in the last year was hosted in EU Member States, which makes Europe the global leader in this domain, hosting two thirds of all cases. Self-generated child sexual abuse material is increasingly common, and this phenomenon affects not only girls aged 11-13 but also those aged 7-10. Private companies are able to detect child sexual abuse material because they can do so in line with the privacy directives and the communications code; clearly, this is not the case for political authorities.

Even though something must be done, it is also important to discuss how these measures will be implemented. The main goal of the current proposal is to protect children from sexual abuse and exploitation with an online element. However, there are several issues. First, detection currently operates under a temporary derogation, whose expiry will leave children unprotected. Second, the proposed mechanisms will produce an overload of false positives, which would cause a “cry wolf” situation. Third, the EU Centre is also in charge of producing a list of indicators, but there is no explanation of how it will cope with the two databases it will work with (EU and US), nor of how these will be merged, given possible overlaps. Finally, it is not clear how the EU Centre will work with the hotlines, which have played an important role in combating child sexual abuse so far and have actively removed a significant amount of content from the internet. All in all, the intentions are good, but their implementation raises doubts about whether the drafters of the proposal consulted tech experts, fundamental rights experts and child rights experts.

We must acknowledge that digital services and the Internet were created without considering children as users, even though children make up one third of Internet users. This is why the obligatory risk assessment and mitigation is so crucial. Indeed, the solution is not to exclude children from services: the right to access media is declared in Art. 25 of the Child Rights Regulation. What remains to be understood is how users' ages can be established, since most mitigation measures depend on this information. In this sense, regulations for private companies are desperately needed, especially to balance their power in detection mechanisms, given that law enforcement authorities may not have the resources to deal with false positives.

During the discussion, doubts were raised about the appropriateness of using technological solutions for a social problem, especially when these technologies can withdraw the right to private communication. Some participants argued that, even though privacy rights are important, they are qualified rights, while child rights are fundamental rights. Therefore, this argument cannot be the only criticism of these measures, also because detection measures are already used by law enforcement authorities. Nevertheless, the issue remains a social problem, and investments should be focused on prevention measures.

Finally, doubts and difficulties are not a reason not to act. Indeed, false positives will be assessed by humans and their quality will be checked by other humans. Surveillance, of course, remains a problematic topic. On the one hand, if criminals are aware of the surveillance placed on them, they will try to find ways to avoid it, and the people placed under surveillance are those most likely to be perpetrating these actions. On the other hand, this kind of surveillance is a slippery slope towards censorship measures, and while the EU is a democratic institution, the same cannot be said of all of its Member States, which raises concerns about the implementation of such measures. It was argued that once such an intrusive law forces certain companies to break encryption and allow access to and monitoring of messages, it may be used for other purposes. Therefore, the proposal should be modified to avoid intrusive measures.

Source: https://comment.eurodig.org/eurodig-2023-messages/workshops/workshop-5-proposal-for-a-regulation-laying-down-rules-to-prevent-and-combat-child-sexual-abuse/