The EU Commission’s proposed Child Sexual Abuse Regulation (commonly known as “Chat Control”) aims to combat the circulation of CSAM (child sexual abuse material) on digital platforms by requiring service providers to implement some sort of detection mechanism that automatically scans users’ (media) messages for both known and potential CSAM and reports detected cases to authorities.
No matter how the EU Commission tries to sell it, whether as “client-side scanning,” “upload moderation,” or “AI detection,” Chat Control is still mass surveillance. And regardless of its technical implementation, mass surveillance is always a terrible idea, for a plethora of reasons.
An alternative and a solution to Chat Control
If messaging companies come to the conclusion that there is no other way, they will call on fellow communication services to join them in leaving the EU en masse.