Apple's new child protection image scanning sparks privacy concerns

Apple's new image-scanning program is designed to detect child abuse imagery on Apple devices, including images sent via iMessage. Whilst laudable in its aim to protect the vulnerable, it is arguably a step back from the company's previous stance of defending the privacy and security of encrypted messages.

Whilst Apple is working with authorities to address the scourge of child exploitation and abuse, observers are concerned that the company has created an infrastructure that is all too easy to repurpose for broader surveillance and censorship.

For decades, governments all over the world have sought access to, and control over, encrypted messages, refusing to take no for an answer when security analysts push back that such access is incompatible with strong encryption, and telling the industry instead to "figure it out". The program is initially limited to the United States, and the US government has not been shy about seeking access to encrypted communications. It frequently applies pressure to companies to make data easier to obtain, using warrants that require them to 'voluntarily' turn data over and pressing them to create "backdoors" into their systems. The US Constitution, through the Fourth Amendment, resists warrantless and baseless mass surveillance.

However, the concern is that when this program leaves the limits of US territory, that protection may not be replicated in every jurisdiction, and the capability could be abused by other governments for their own ends. Right now in China, the WeChat service already monitors images and files shared by users, and uses them to train censorship algorithms. When a message is sent from one WeChat user to another, it passes through a server managed by Tencent (WeChat's parent company) that checks whether the message contains blacklisted keywords before it is delivered to the recipient. Apple's CSAM detection builds the same kind of scanning capability directly into the client, where it could be repurposed for censorship and political persecution to varying degrees, depending on each government's motivations.
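The mechanism behind such blocklist scanning can be sketched in a few lines. The sketch below is a deliberate simplification, not Apple's actual implementation: Apple's system uses perceptual "NeuralHash" fingerprints matched through a cryptographic private set intersection protocol, and WeChat's checks run server-side, whereas this illustration uses a plain cryptographic hash and a hypothetical local blocklist.

```python
import hashlib

# Hypothetical blocklist of image fingerprints. In Apple's real system
# these would be perceptual NeuralHash values supplied by child-safety
# organisations; plain SHA-256 is used here purely for illustration.
BLOCKLIST = {
    hashlib.sha256(b"known-prohibited-image-bytes").hexdigest(),
}

def scan_before_send(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches the blocklist."""
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in BLOCKLIST
```

The crucial point the sketch makes concrete: the matching logic is agnostic about *what* is in the blocklist. Swap in a different set of fingerprints and the same client-side code flags political, religious, or journalistic material instead of CSAM.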

Despite Apple's good intentions, the company has potentially paved the way for paternalistic democracies to justify eroding citizen privacy on a mass scale, without challenge.

Author: Jeff Pullen, Information Security Lead, Kafico
