EU Parliament Kills CSAM Scanning Extension
On March 27, 2026, the European Parliament voted to let a temporary CSAM scanning regulation expire. 311 members voted against extending it. The law had exempted tech companies from EU privacy rules, allowing them to scan messages, uploads, and communications for child sexual abuse material. Without the extension, that legal cover disappeared on April 4.
This is a significant win for encrypted communications in Europe. It is also a complicated one. The vote pitted child safety advocates, law enforcement agencies, and Big Tech against digital rights groups and privacy-focused legislators. Understanding what actually happened, and what comes next, matters for anyone who cares about private messaging.
What the regulation allowed
The expired rule was a temporary derogation from the EU's ePrivacy Directive. First introduced in 2021 and extended in 2024, it gave platforms legal permission to voluntarily scan their services for known CSAM using hash-matching technology.
Hash matching works by creating digital fingerprints of known CSAM images stored in a secure database. When a user uploads or sends an image, the platform generates a hash and checks it against the database. A match triggers a report to law enforcement via the U.S.-based National Center for Missing and Exploited Children (NCMEC), which forwards tips to EU authorities as CyberTips.
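The flow described above can be sketched in a few lines. This is a deliberately simplified illustration: `KNOWN_HASHES`, `fingerprint`, and `check_upload` are hypothetical names, and production systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-compression, not the cryptographic hash used here for brevity.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
# (This example seeds it with the SHA-256 of empty input so the
# sketch is self-contained; real databases are curated by NCMEC.)
KNOWN_HASHES = {hashlib.sha256(b"").hexdigest()}

def fingerprint(image_bytes: bytes) -> str:
    """Generate a digital fingerprint for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes) -> bool:
    """Return True when the upload matches a known fingerprint,
    i.e. when the platform would file a report (a CyberTip)."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

Note that a cryptographic hash matches only byte-identical files; the perceptual hashes used in practice trade that exactness for robustness, which is precisely where false positives enter.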
Europol processed roughly 1.1 million CyberTips in 2025 sourced from this scanning. Catherine De Bolle, Europol's executive director, called the vote's impact a "serious reduction" in investigative leads and warned it would "severely impair the EU's security interests."
Why Parliament voted no
The opposition centered on a single argument: mass surveillance of private communications violates fundamental rights, regardless of the stated purpose.
Ella Jakubowska, head of policy at European Digital Rights (EDRi), put it directly: "This is actually just enabling big tech companies to scan all of our private messages, our most intimate details, all our private chats so it constitutes a really, really serious interference with our right to privacy."
She added the critical point: "It's not targeted against people that are suspected of child abuse. It's just targeting everyone, potentially all of the time."
Several factors drove the vote:
False positives. Hash-matching systems are not as accurate as proponents claim. Google flagged a father who sent photos of his child's medical condition to a doctor. His account was locked, his data was searched, and police investigated. He was cleared, but Google refused to restore his account. These are not edge cases. They are structural failures in automated scanning systems that treat every user as a suspect.
Scope creep. A scanning system built for one category of content can be expanded to others. Once the infrastructure exists to check every message against a database, the database becomes the variable. Governments can add political content, protest imagery, or copyrighted material. The EU's own proposed permanent regulation included provisions for detecting "grooming" via text analysis, which would require reading the actual content of messages, not just matching image hashes.
Encryption incompatibility. End-to-end encrypted services like Signal cannot perform server-side scanning because they never see message content in plaintext. The proposed permanent regulation included client-side scanning: checking content on the device before encryption. This is functionally equivalent to installing a backdoor. It does not matter that the message is encrypted in transit if the content was already read on your phone.
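The encryption-incompatibility point can be made concrete with a sketch of the client-side flow. Everything here is illustrative: `FLAGGED_HASHES`, `send_message`, and the reporting list are assumed names, and the toy hash comparison stands in for whatever matcher a mandate would require. The point the code makes is structural: the scan runs on the plaintext, so the subsequent encryption protects nothing from the scanner.

```python
import hashlib

# Hypothetical database pushed to every device under a mandate.
FLAGGED_HASHES = {hashlib.sha256(b"flagged-content").hexdigest()}
reports: list[str] = []  # stands in for a reporting channel to authorities

def send_message(plaintext: bytes, encrypt) -> bytes:
    # The scan happens on-device, on the plaintext, BEFORE encryption.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in FLAGGED_HASHES:
        reports.append(digest)  # reported even though the app is "end-to-end encrypted"
    # Only after the scan is the message encrypted for transit.
    return encrypt(plaintext)
```

Swapping in a stronger cipher changes nothing: whatever `encrypt` does, the matching decision was already made against the readable content, and the contents of `FLAGGED_HASHES` are controlled by whoever maintains the database.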
Who wanted the extension
The list is instructive.
On March 19, Google, Snapchat, Microsoft, TikTok, and Meta released a joint statement expressing "deep concern" about the regulation lapsing. They described their scanning tools as "highly effective" and claimed hash matching ensures "high-precision detection while adhering to privacy principles."
This is worth examining. These are the same companies that collect vast amounts of user data for advertising. Scanning user communications is operationally consistent with their existing business model. The cost of scanning infrastructure is marginal when you already process every message for ad targeting, content moderation, and algorithmic ranking.
German Chancellor Friedrich Merz, several European commissioners, and Europol also pushed for extension. Law enforcement's position is predictable: any tool that generates investigative leads is one they want to keep. The question Parliament answered is whether generating 1.1 million tips per year justifies scanning the communications of 450 million EU residents.
The "Chat Control" fight is not over
The expired temporary regulation was always a stopgap. Since November 2023, Parliament and national governments have been negotiating a permanent framework. Those negotiations have stalled repeatedly because of deep disagreements.
The proposed permanent regulation, commonly called "Chat Control," goes further than the expired temporary rule. It would mandate scanning rather than allowing voluntary participation. It includes provisions for client-side scanning to cover encrypted services. And it proposes text analysis for "grooming detection," which means reading message content, not just matching image hashes.
Patrick Breyer, a Pirate Party MEP and leading opponent of Chat Control, has outlined four requirements for any permanent framework: no indiscriminate scanning of all messages, explicit protection for end-to-end encryption, preservation of anonymous communication, and no age-gating that would exclude young people from platforms like WhatsApp.
None of these requirements have been accepted by the Council. The fight continues.
What this means for encrypted messaging
For users of end-to-end encrypted services, the immediate impact is positive. Without the legal framework for scanning, platforms have less cover to implement client-side scanning in the EU. Signal, which has repeatedly stated it would rather exit the EU market than compromise its encryption, faces less immediate pressure.
But the structural threat remains. The Commission's permanent regulation proposal is still on the table. National governments, particularly France and Spain, continue pushing for mandatory scanning. The argument that "something must be done" about CSAM creates persistent political pressure to pass some form of scanning mandate.
The risk is not that encryption will be broken mathematically. The risk is that device manufacturers and app developers will be legally required to scan content before it is encrypted. Apple briefly implemented exactly this for iCloud Photos in 2021 before reversing course after backlash. The technology exists. The political will is the variable.
Practical steps
Use Signal or Briar for sensitive conversations. Both are open-source, end-to-end encrypted by default, and have resisted scanning mandates. Signal operates its own infrastructure. Briar works peer-to-peer over Tor, so there is no server to compel. See our encrypted messaging comparison for detailed breakdowns.
Run GrapheneOS if you are on Android. It removes Google Play Services, which is the most likely vector for client-side scanning on Android devices. Without Play Services, no Google-mandated scanning code runs on your device.
Watch the permanent regulation negotiations. The temporary rule is dead, but the permanent proposal is not. Follow EDRi's campaign tracker and Patrick Breyer's updates for the latest on Chat Control.
Understand the false equivalence. Opposing mass scanning is not opposing child safety. Targeted investigation of suspects with judicial oversight catches criminals. Scanning every person's messages catches false positives and normalizes infrastructure that can be repurposed for political surveillance. These are not the same approach, and conflating them is a lobbying strategy.
The EU Parliament made the right call on the temporary extension. Whether it holds against the permanent regulation will depend on sustained public pressure. The infrastructure for mass message scanning exists. The question is whether it gets a legal mandate.
Frequently Asked Questions
What did the EU Parliament vote on regarding CSAM scanning?
On March 27, 2026, 311 members of the European Parliament voted against extending a temporary regulation that exempted tech companies from EU privacy rules so they could scan their services for child sexual abuse material (CSAM). The regulation lapsed on April 4, 2026.
Does this mean CSAM scanning is banned in the EU?
Not exactly. The vote means the legal exemption that allowed voluntary scanning has expired. Tech companies can no longer scan private messages under this framework without violating EU privacy law. However, Parliament and national governments continue negotiating a permanent replacement regulation.
What is client-side scanning and why is it controversial?
Client-side scanning checks message content on your device before encryption. Supporters say it catches CSAM while preserving encryption. Critics point out it turns every phone into a surveillance endpoint. If the scanning database is modified or expanded, the same system could flag political speech, protest images, or any content a government decides to target.
Which messaging apps are affected by this vote?
The expired regulation applied to all interpersonal communication services operating in the EU. This includes WhatsApp, Messenger, Instagram DMs, Gmail, Snapchat, and similar platforms. End-to-end encrypted services like Signal were already unable to scan server-side, though client-side scanning proposals specifically targeted them.
How can I protect my private messages from scanning?
Use end-to-end encrypted messengers like Signal or Briar that cannot perform server-side scanning. On mobile, GrapheneOS removes Google Play Services that could facilitate client-side scanning on Android. Avoid platforms that have publicly supported scanning mandates.