Europe wants stringent laws that would require chat apps to scan users' messages in order to prevent child abuse. The European Commission has tabled new legislation aimed at curbing the risks posed by messaging and social media platforms.
If the bill passes in the coming weeks, platforms like WhatsApp and Telegram will be forced to detect, report and remove child sexual abuse material. Such measures would severely undermine the encryption standards these platforms have put in place. Requiring platforms to identify content based on its nature is a grey area that was hotly debated when Apple announced a similar scanning program for iPhone users.
“Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards,” the Commission said.
The European Commission wants to ensure the data is not passed on to law enforcement agencies unless deemed necessary. But the prospect of breaking encryption to make this work is undoubtedly going to be a concern for privacy experts.
In its pitch for the legislation, the Commission says the new rules will help rescue children from further abuse, prevent material from reappearing online, and bring offenders to justice.
The EU body also highlighted its reasons for pushing such a measure. It pointed out that 85 million pictures and videos depicting child sexual abuse were reported worldwide in 2021, and the Internet Watch Foundation noted a 64 per cent increase in reports of confirmed child sexual abuse last year.
The Commission believes its hand has been forced by platforms that have failed to adequately protect children. The EU called for “clear rules” with “robust conditions and safeguards” to effectively address the misuse of online services for the purposes of child sexual abuse.
Considering Apple was forced to withdraw its own scanning program after backlash, it remains to be seen whether the EU backs down in a similar manner or goes ahead with its plans.