Should you be concerned about Apple’s new child abuse detection system?


Apple announced new features on Thursday aimed at identifying and reporting child sexual abuse material (CSAM) on its platforms. However, the tech giant isn’t receiving the kind of response it expected.

The new tools will arrive with the iOS 15, iPadOS 15, watchOS 8, and macOS Monterey software updates. They are designed to use algorithmic scanning to search for and identify CSAM on the devices of American users.

One of the tools will scan photos on a user’s device as they are uploaded to iCloud, matching them against a database of known child exploitation imagery maintained by the National Center for Missing and Exploited Children (NCMEC). The other will inspect images sent to and from child accounts in iMessage, to stop minors from sharing or receiving sexually explicit material.
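For readers curious how this kind of matching works in principle, the sketch below shows the general idea: compute a compact fingerprint (a perceptual-style hash) of each image, compare it against a set of known fingerprints, and only flag an account once enough images match. Everything here is hypothetical — the toy average hash, the distance cutoff, the threshold, and the “database” are stand-ins, and Apple’s actual system relies on its NeuralHash algorithm plus cryptographic protocols (private set intersection and threshold secret sharing) that this sketch omits entirely.

```python
# Minimal, hypothetical sketch of hash-based matching with an account threshold.
# Not Apple's implementation: the hash function, cutoff, threshold, and
# "known hashes" below are all invented for illustration.

from typing import List

def average_hash(pixels: List[List[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of known fingerprints (arbitrary 16-bit values).
KNOWN_HASHES = {0b1010110010110100, 0b0110100101101001}
MATCH_CUTOFF = 2       # max Hamming distance still counted as a match (hypothetical)
ACCOUNT_THRESHOLD = 3  # matched images required before flagging (hypothetical)

def count_matches(image_hashes: List[int]) -> int:
    """Count how many image hashes fall close enough to any known hash."""
    return sum(
        1 for h in image_hashes
        if any(hamming(h, known) <= MATCH_CUTOFF for known in KNOWN_HASHES)
    )

def should_flag(image_hashes: List[int]) -> bool:
    """Flag an account only once the number of matched images crosses the threshold."""
    return count_matches(image_hashes) >= ACCOUNT_THRESHOLD

if __name__ == "__main__":
    # A 4x4 grayscale image yields a 16-bit toy hash.
    sample = [[10, 200, 30, 220], [240, 15, 25, 210],
              [5, 230, 12, 190], [250, 20, 200, 8]]
    print(f"hash: {average_hash(sample):016b}")
    print("flag account:", should_flag([average_hash(sample)]))
```

The point of the per-account threshold in a scheme like this is that no single near-match triggers a report; only an accumulation of matches does, which is the privacy safeguard Apple has emphasized.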

However, security advocates suggest the new measures could erode user privacy.

Apple insists the features were designed with user privacy in mind, as the system only issues warnings once a specific threshold of matches is reached.

Privacy experts, who are firmly in favor of fighting child exploitation, warn that Apple’s tools may open a gateway to wider uses, potentially allowing governments to spy on their citizens.

“Even if you believe Apple won’t allow these tools to be misused there’s still a lot to be concerned about,” tweeted Matthew Green, a cryptography professor at Johns Hopkins University.

Privacy campaigners point to several scenarios that could easily become reality once Apple implements the new measures: governments could press to scan for politically dissident material, and anti-LGBT movements could use the features to crack down on sexual expression.

The Electronic Frontier Foundation (EFF), a leading nonprofit organization defending civil liberties in the digital world, addressed the issue in a blog post, stating that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

“We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content,” EFF added.

In response, Apple issued a statement clarifying that it will work strictly within the guidelines of the NCMEC and other child safety organizations, and that it will refuse any government demands to add non-CSAM images to the list.

To reassure users, the tech giant added that it has faced demands in the past to build and deploy government-mandated changes that degrade user privacy, has “steadfastly refused those demands,” and will continue to refuse them in the future.

However, Apple has not addressed concerns about the accuracy of the new tools in identifying CSAM. Scanning algorithms can produce false positives, as in Tumblr’s 2018 attempt to purge sexual content, which ended up flagging the company’s own examples of acceptable content.
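To see why false positives worry researchers, consider a back-of-the-envelope estimate of accidental matches. The calculation below assumes fingerprints behave like uniformly random bit strings, which real perceptual hashes do not — visually similar but unrelated images are exactly where they collide more often — and every number in it (hash length, cutoff, database size, photo count) is hypothetical.

```python
# Hypothetical back-of-the-envelope false-positive estimate.
# Assumes hashes are uniformly random bit strings; real perceptual hashes
# collide more often on visually similar images, so this is a lower bound.

from math import comb

HASH_BITS = 96           # hypothetical fingerprint length in bits
MATCH_CUTOFF = 2         # bits of difference still counted as a match
DATABASE_SIZE = 200_000  # hypothetical number of known fingerprints
PHOTOS_PER_USER = 10_000 # hypothetical size of one user's photo library

# Probability that one random hash lands within MATCH_CUTOFF bits of one known hash.
p_single = sum(comb(HASH_BITS, k) for k in range(MATCH_CUTOFF + 1)) / 2 ** HASH_BITS

# Expected accidental matches for one user's library against the whole database.
expected_false_matches = p_single * DATABASE_SIZE * PHOTOS_PER_USER

print(f"per-comparison collision probability: {p_single:.3e}")
print(f"expected accidental matches per user:  {expected_false_matches:.3e}")
```

Under these toy assumptions the expected number of accidental matches per user is vanishingly small, but the assumptions are exactly what critics question: perceptual hashes are designed to tolerate differences, and that tolerance is where real-world false positives come from.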

While it wouldn’t be fair to dismiss Apple’s effort to protect its users, the fact remains that tech giants have the power to intrude on our privacy at any moment.

