Digital security experts are concerned about this decision.
Apple will scan the iCloud accounts of its US customers for images of child sexual abuse, according to media reports.
Before an image is saved to iCloud, the technology will search for matches against already known child pornography photos.
The company has assured users that the privacy of iPhone and iPad owners will not be affected, since the technology does not scan the photos themselves, but only matches their digital fingerprints against a database provided to Apple by the National Center for Missing and Exploited Children (NCMEC).
Accounts will be manually flagged and reviewed only if the number of suspicious photos crosses a certain threshold. If moderators confirm the matches, the account will be disabled and the user's data handed over to law enforcement agencies.
Digital security experts have already raised concerns that the technology could be extended to scan phones for prohibited content or even political speech.
“Such a tool can help find child pornography on people's phones. But imagine what happens when it falls into the hands of an authoritarian government,” Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, wrote on Twitter.
“This is a very bad idea, because it will lead to mass surveillance of our phones and laptops,” said Ross Anderson, a professor at the University of Cambridge.