Apple backs down and halts its controversial iPhone and iCloud image-scanning system against child sexual abuse material

After widespread criticism, Apple has decided to backtrack and delay the rollout of its image-scanning system for iPhones and iCloud aimed at fighting child exploitation. In a statement sent to the media, Apple says that "based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The system was announced in early August, when Apple explained its plan to protect minors from abusers who use digital tools: a scan that would identify whether a user was uploading child sexual abuse material (CSAM) or related illegal content to iCloud. The method drew criticism over the serious privacy questions it raised.

Apple's document describing the system lays out three measures. First, "machine learning to warn about sensitive content" while keeping private communications unreadable by Apple; second, for iCloud Photos, "new applications of cryptography will be used on iOS and iPadOS to help limit the spread" of CSAM; and finally, updates to Siri and Search to "intervene when users try to search for CSAM-related topics."
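The iCloud Photos measure works by matching image fingerprints against a database of known CSAM hashes rather than "looking at" pictures directly. Apple's actual system uses its NeuralHash algorithm combined with private set intersection, which is far more involved; the sketch below is only a toy illustration of the underlying idea, using a simple average hash and Hamming-distance matching on made-up pixel data.

```python
# Toy illustration of hash-based image matching. Apple's real system uses
# NeuralHash plus cryptographic private set intersection; this simple
# "average hash" stand-in only shows the general matching concept.

def average_hash(pixels):
    """Build a perceptual-style hash from flattened grayscale pixels.

    Each bit is 1 if the pixel is brighter than the image's mean, so small
    changes (compression, slight edits) flip few bits.
    """
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(a, b):
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, max_distance=2):
    """True if the hash is within max_distance bits of any known hash."""
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in blocklist)

# Two toy 4x4 grayscale images: a "known" image and a lightly altered copy.
known = [10, 200, 30, 180, 90, 220, 15, 170,
         60, 210, 25, 190, 80, 230, 5, 160]
altered = [12, 198, 33, 181, 88, 219, 14, 172,
           61, 208, 27, 188, 79, 231, 7, 158]

blocklist = {average_hash(known)}
print(matches_blocklist(average_hash(altered), blocklist))  # → True
```

The point of perceptual (rather than cryptographic) hashing is that near-duplicates still match, which is why critics like the EFF argued such a pipeline could be repurposed to flag other categories of content.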

Critics such as the Electronic Frontier Foundation (EFF) claimed that the company was "opening the door to wider abuses" and warned that it is almost impossible to build a local scanning system that works only on explicit images sent or received by children without putting the privacy of other images at risk.

The new system was scheduled to go into operation later this year. For now, Apple has not said when it will implement it or what changes to expect, although it has updated its support page to note the delay.

Image scanning to combat child sexual abuse material is already a widespread practice among technology companies, from Facebook to Google with Drive.