Earlier this year, Apple announced it would scan photos on your iPhone and iCloud for child sexual abuse material. A backlash ensued, and now it looks like the company has quietly shelved the rollout without telling anyone.
Apple first announced this feature in August 2021. It was designed to analyze your child’s messages, as well as your photos, Siri requests, and searches, all with the aim of detecting child sexual abuse material (CSAM) and preventing it from spreading online.
As Apple explained at the time, the feature wouldn’t store your photos anywhere. Instead, it would only compute image hashes (“fingerprints”) and compare them to the hashes of known CSAM. That wasn’t enough to stop the backlash, however, which came from individuals as well as companies like Meta (oh, the irony).
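To make the idea concrete, here is a minimal sketch of hash-based matching. Note the assumptions: Apple’s real system uses a perceptual hash called NeuralHash combined with a private set intersection protocol, not a cryptographic hash like SHA-256, and the digests and function names below are placeholders invented for illustration.

```python
import hashlib

# Hypothetical set of known-bad fingerprints (placeholder digest, not real CSAM data).
# This entry happens to be the SHA-256 digest of the bytes b"example".
KNOWN_HASHES = {
    "50d858e0985ecc7f60418aaf0cc5ab587f42c2570a884095a9e8ccacd0f6545c",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest standing in for the image's fingerprint.

    Apple's NeuralHash is perceptual (tolerant of resizing/re-encoding);
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-bad set."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known(b"example"))       # matches the placeholder digest above
print(matches_known(b"holiday photo")) # no match
```

The point of the design is that only fingerprints are compared, so the matching party never needs to see the photos themselves; only images whose fingerprints match the known set would be flagged.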
In the wake of the backlash, Apple said the system would only flag photos that match already-known CSAM. But then, in September, the company decided to postpone the feature. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said at the time. However, it seems the plans have changed again.
iOS 15.2 was released earlier this week, and the Communication Safety features for Messages were rolled out. However, as MacRumors noticed, Apple has decided to delay the rollout of CSAM detection. According to the same source, all mention of the feature has also disappeared from Apple’s website.
You would think Apple has decided to ditch the feature altogether, given all the criticism and concerns. However, it appears the launch has only been delayed further. An Apple spokesperson told The Verge that the plans ultimately haven’t changed, and a technical summary of the CSAM detection system is still available as a PDF. So it looks like your photos will eventually come under Apple’s watch, just not yet.
[via Gizmodo, The Verge]