
Before delving into the analysis, it is worth noting that this contribution is the first part of a broader research project.
In the next article, we will explore the conflict between automatic detection algorithms and end-to-end encryption (E2EE), analyzing how fundamental rights and ECtHR case law resist the introduction of “backdoors” or client-side scanning systems.
The evolution of cyberspace is radically transforming cybercrime law. After thirty years of observing digital crimes, it’s clear that we’re not just witnessing minor regulatory adjustments, but a genuine shift in the approach to intermediaries’ liability (ISPs, social networks, messaging apps).
While the judicial police once intervened “post factum,” using digital forensics to identify child pornography exchange nodes, the proposed Regulation COM/2022/209 (Child Sexual Abuse Regulation) shifts the focus upstream: it aims to institutionalize preventive monitoring that, until recently, was considered technically invasive and legally unacceptable.
From the perspective of the hierarchy of legal sources, the proposed Child Sexual Abuse Regulation (CSAR) presents itself as a lex specialis compared to the Digital Services Act (DSA). While the DSA is the general “framework” regulating digital services in Europe, the new CSAR legislation seeks to introduce much more specific and preventive vertical obligations. The European Commission justifies this move with the need to harmonize the single market (Art. 114 TFEU), but for those involved in courtrooms on a daily basis, it is clear that the objective is not economic, but public safety. This raises a conflict between the Union’s competences and the sovereignty of individual states in criminal matters.
We are at the twilight of the era of self-regulation. The proposed system goes beyond simple voluntary collaboration by providers and introduces so-called detection orders. It is no longer a matter of providers spontaneously reporting abuse; they face a legal obligation to implement scanning technologies. In this way, the provider’s nature changes: from a passive host of data, it becomes an active supervisory agent acting on behalf of public authorities. This marks the definitive shift from the “removal upon report” model to a structural surveillance system integrated directly into business processes and software code.
The heart of the political conflict concerns the derogation from the ePrivacy Directive. Currently, thanks to Regulation 2021/1232, providers may voluntarily scan metadata and content for child sexual abuse material. However, this is a temporary measure (extended until 2026) created in an emergency climate. The risk, typical of many emergency measures, is that the exception becomes the rule, crystallizing a permanent surveillance framework that progressively erodes the privacy of digital communications.
The proposal calls for the creation of the EU Centre, a new central body with powers that go far beyond technical support. This agency will manage hash databases (the “fingerprints” of known illicit files) and will be responsible for validating the Artificial Intelligence algorithms used to identify new content or suspicious behavior (grooming). Entrusting a technical body with the management of processes that impact the freedoms of millions of users is a huge challenge for European administrative law. It creates a supranational filter that operates even before the police or the judiciary, escaping the traditional canons of national judicial oversight.
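To make the mechanism concrete, hash-based detection reduces each file to a compact digest and checks it against a database of digests of known illicit material. The following is a deliberately simplified sketch in Python: the hash list is hypothetical, and we use a cryptographic hash (SHA-256) for readability, whereas production systems rely on perceptual hashing (e.g., PhotoDNA or PDQ), which also matches re-encoded or slightly altered copies of an image.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests of known illicit files, standing in
# for the centralized database the EU Centre would maintain.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_material(path: Path) -> bool:
    """Check a file's digest against the known-hash set.

    Caveat: a cryptographic hash only matches bit-identical files;
    real detection systems use perceptual hashes to survive
    compression, cropping, and minor edits.
    """
    return file_digest(path) in KNOWN_HASHES
```

Even this toy example exposes the legal tension at the core of the article: the matching step must have access to the content (or a derivative of it), which is precisely what end-to-end encryption is designed to prevent unless the scan is moved onto the user’s own device.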
The fight against Child Sexual Abuse Material (CSAM) is, by definition, cross-border: data jumps from one server to another within milliseconds, across different jurisdictions. Clearly, no single state can tackle the problem alone. However, investigative efficiency cannot be the sole yardstick in a constitutional state. The debate on subsidiarity forces us to ask whether the European Union may go so far as to breach the inviolability of correspondence and the confidentiality of private communications, rights that demand constitutional guarantees no “technical requirement” should be able to override.
The road to 2026 remains rocky. Negotiations between the Commission, Council, and Parliament reflect an attempt to find a compromise between “total security” and “digital civil rights.” The fear is that the providers’ “assisted voluntariness” could become indirect coercion: providers that fail to implement scanning systems risk administrative fines or very heavy civil liability. For a lawyer, this is a form of pressure that pushes private actors to become network controllers purely for economic risk management purposes.
The CSAR proposal is the final test for European digital democracy. We must understand whether we are capable of combating heinous crimes without succumbing to the temptation of mass surveillance. The Union’s maturity will be measured by its ability not to sacrifice individual rights on the altar of technology.
In the next article, we will analyze the compatibility of these scanning systems with Article 15 of the Italian Constitution and the case law of the ECtHR, which recently reaffirmed that encryption is an insurmountable bulwark of freedom.
