EU Shifts on Chat Control, Opting for Voluntary Scanning Amid Privacy Concerns
The European Union is moving away from mandatory chat content scanning, instead proposing a system under which service providers may voluntarily monitor user communications for child sexual abuse material (CSAM). This pivotal shift, agreed upon by representatives of member states in the European Council in late November 2025, follows more than three and a half years of debate since the EU Commission first presented its proposal in May 2022. The EU Parliament had already established its position in 2023, paving the way for the upcoming “trilogue” negotiations between the Council, Parliament, and Commission.
The proposed regulation, officially a “Regulation laying down rules to prevent and combat child sexual abuse,” has faced significant opposition from a broad coalition of stakeholders. Civil rights activists, IT security experts, messenger service operators, and child protection organizations have all voiced concerns that the initial draft would have severely compromised the privacy of online communication.
The original proposal would have obligated providers of communication services – including email providers, social networks, messaging apps, and cloud services – to proactively search user communications for CSAM and for evidence of “grooming,” defined as adults making targeted contact with minors with the intent of sexual abuse. A central EU body, closely affiliated with Europol, was to be established to maintain a database of known CSAM and provide the technology for screening. Providers would then have been required to report any discovered material and grooming attempts to this central authority.
“The initial proposal represented a perilous overreach, perhaps creating a system of mass surveillance under the guise of child protection,” stated a senior official familiar with the negotiations.
The move towards voluntary scanning represents a compromise aimed at balancing the need to protect children with fundamental rights to privacy and data protection. However, concerns remain about the effectiveness of a voluntary system and the potential for uneven implementation across different platforms.
The debate highlights the complex challenges of addressing online child exploitation while safeguarding civil liberties in the digital age. The upcoming trilogue negotiations will be crucial in determining the final shape of the regulation and its impact on the future of online privacy within the European Union.
