On Wednesday, May 11, 2022, the EU Commission is expected to publish draft legislation on the so-called "chat control". The plan is to have all message content and images classified by AI directly on our devices. This so-called "client-side scanning" would be an attack on all confidential communication.
The draft provides for examining all communication content directly on our devices and extracting it in case of suspicion. This client-side scanning would not be the first excessive and misguided surveillance measure justified by the fight against child abuse.
Undoubtedly, victims of child abuse need far more and better support, but chat control is an excessive approach: it is easy to circumvent and it looks in the wrong place. With no realistic chance of achieving its stated goal, an unprecedented surveillance tool is to be introduced.
The proposed regulation would require every device to scan every message for images of child abuse and for contact attempts between perpetrators and children. If such content is detected in a message, it would be forwarded directly to a supervisory authority or the police.
Mass scanning not only attacks confidential communication at its very foundation, it would also be ineffective: criminals already use distribution channels that would not be affected by these scans, and they will easily evade scans in the future:
The perpetrators use public file hosters (German language link) instead of the messengers targeted by the Commission – not least because messengers are completely unsuitable for exchanging large collections of files. Before being uploaded to public hosters, the material is typically encrypted, rendering automated scans useless.
For this reason alone, the proposed mandatory monitoring will not prevent the further dissemination of abusive images.
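To illustrate why, here is a minimal sketch assuming a scanner that, like typical detection systems, compares file fingerprints (hashes) against a database of known material; the choice of Python and of the cryptography library's Fernet cipher is our own for illustration and not part of any proposed system:

```python
# Illustrative only: shows why encrypting material before upload defeats
# fingerprint-based scanning. The library choice (cryptography/Fernet) is an
# assumption made for this sketch, not part of any proposed system.
import hashlib
from cryptography.fernet import Fernet

original = b"placeholder bytes standing in for any file"
print("fingerprint of original file: ", hashlib.sha256(original).hexdigest())

key = Fernet.generate_key()
encrypted = Fernet(key).encrypt(original)
print("fingerprint of encrypted file:", hashlib.sha256(encrypted).hexdigest())

# The two digests share nothing: a scanner that only sees the uploaded,
# encrypted file cannot match it against a database of known material,
# and only the holder of the key can ever restore the original content.
```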
Not only journalists and whistleblowers depend on trustworthy communication – it is a fundamental right and an important cornerstone of IT security for us all. For communication to be truly trustworthy, two conditions must be met: the content must be protected from being read in transit, for example by end-to-end encryption, and the devices on both ends must remain under the sole control of their users.
Chat control suspends two fundamental rights: the secrecy of telecommunications and the guarantee of the confidentiality and integrity of information technology systems (German language link). Users lose control over what data they share and with whom. They lose basic trust in their own devices.
So far, it is not clear who is supposed to define and control the recognition algorithms and databases. Such an opaque system can and will be easily expanded once it has been introduced. It is already foreseeable that the content rights management industry will be just as interested in the system as anti-democratic governments. It is therefore all the more frightening to see the naïveté with which it is now to be introduced. Implementing on-device content scanning and reporting lays the groundwork for an all-encompassing censorship and monitoring infrastructure that can and will be abused by whoever controls it.
An "artificial intelligence" scanning for abusive content will also "falsely flag content as illegal (German language link). Even the smallest error rates would result in massive amounts of incorrectly "recognized" and exfiltrated messages: In Germany alone, well over half a billion messages are sent any given day (German language link). Even exceptionally "good" recognition rates would lead to the extraction of several thousand messages per day.
Of course, the likelihood of content being flagged and exfiltrated rises dramatically for intimate, completely legal and consensual images exchanged among adults and young people. Young adults can already look forward to having their age estimated by monitoring agencies. The nagging concern about whether our messages are being leaked, who is viewing them, and how safe they are from abuse will in turn affect us all.
At the same time, the system will create mountains of flagged but irrelevant material, keeping officers from doing important investigative work. Investigating authorities are already overburdened with the data they currently receive (German language link). Actual success stories from investigators fail to materialize, and material that has been found is not even deleted (German language link). Effectively eliminating these deficits would be the most important goal in the fight against child abuse. Instead, the Commission wants to rely on mass surveillance and the promise of salvation of "artificial intelligence".
Chat control should be rejected as a fundamentally misguided technology.