By Stephen Nellis (Reuters) – Apple Inc on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service to ensure the upload does not match known images of child sexual abuse.
Detection of enough matching uploads to guard against false positives will trigger a human review and a report of the user to law enforcement, Apple said. It said the system is designed to reduce false positives to one in one trillion.
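The article describes threshold-based matching of uploads against a database of known images. The sketch below is a rough illustration of that idea only; Apple's actual system relies on perceptual hashing and cryptographic protocols rather than exact file hashes, and the function names, SHA-256 fingerprint, and threshold value here are hypothetical placeholders.

```python
# Illustrative sketch only. The fingerprint function, database, and threshold
# below are hypothetical; real deployments use perceptual hashes and
# privacy-preserving matching, not plain SHA-256 comparisons.
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # hypothetical number of matches before human review


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint; real systems use perceptual hashes of image content."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploads: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many queued uploads match the known-image database."""
    return sum(1 for img in uploads if image_fingerprint(img) in known_hashes)


def should_flag_for_review(uploads: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Flag an account only once matches reach the threshold, limiting false positives."""
    return count_matches(uploads, known_hashes) >= MATCH_THRESHOLD
```

The threshold is the key design choice referenced in the article: requiring multiple independent matches before any human review keeps the chance of falsely flagging an account extremely low.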
Apple’s new system seeks to address requests from law enforcement to help stem child sexual abuse while also respecting privacy and security practices that are a core tenet of the company’s brand. But some privacy advocates said the system could open the door to monitoring political speech or other content on iPhones.
Most other major technology providers – including Alphabet Inc’s Google, Facebook Inc, and Microsoft Corp – are already checking images against a database of known child sexual abuse imagery.
“With so many people using Apple products, these new safety measures have the lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” John Clark, chief executive of the National Center for Missing & Exploited Children, said in a statement. “The reality is that privacy and child protection can co-exist.”