Apple Creates Feature to Detect Child Abuse Content

 


Apple has created a feature to detect child abuse material by scanning photos uploaded to iCloud. Offending content it finds will then be reported to law enforcement.
The system detects child sexual abuse material (CSAM) using a process called hashing, in which a picture is converted into a unique number that represents it.



Apple began testing the system this week, but only to a limited extent in the United States. Eventually the system will be part of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.





The feature works by matching a photo’s hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). The matching is performed on the user’s iPhone, not in the cloud.
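To make the idea concrete, the sketch below shows in simplified Swift how on-device hash matching of this kind could work. It is only an illustration: the empty known-hash set, the file URL parameter, and the use of SHA-256 are stand-ins, since Apple’s actual system relies on a proprietary perceptual hash and a cryptographic matching protocol that are not reproduced here.

    import Foundation
    import CryptoKit

    // Stand-in for the hash list supplied by NCMEC. In the real system the
    // entries come from a perceptual hashing algorithm, not SHA-256.
    let knownHashes: Set<String> = []

    // Convert a photo into a single number-like value (here, a hex digest).
    // SHA-256 only matches byte-identical files; a perceptual hash would
    // also match visually similar copies of the same picture.
    func imageHash(forFileAt url: URL) throws -> String {
        let data = try Data(contentsOf: url)
        let digest = SHA256.hash(data: data)
        return digest.map { String(format: "%02x", $0) }.joined()
    }

    // On-device check performed before a photo is uploaded to iCloud:
    // does its hash appear in the known-hash set?
    func matchesKnownHash(_ url: URL) -> Bool {
        guard let hash = try? imageHash(forFileAt: url) else { return false }
        return knownHashes.contains(hash)
    }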



If the system detects photos in an iCloud account that violate the rules, it will upload a file that allows Apple to decrypt and view the photos in that account. Human reviewers then manually check the photos to confirm whether there is a match.



Apple says it can only review photos that match content already in the database. A photo a parent takes of their child in the bath, for example, would not be flagged because it is not in the NCMEC database.



If the reviewer confirms a match with the NCMEC database, Apple will deactivate the user’s iCloud account and send a report to NCMEC, or to law enforcement if necessary.



An Apple spokesperson says that users can appeal to Apple if their iCloud account is inadvertently deactivated.



The system can only scan photos uploaded to iCloud. Photos or other pictures that remain on the device and have not been uploaded to iCloud are not detectable by the system.



Although the feature has not yet been released, security researchers have expressed concern that it could be used to track other kinds of photos, such as photos taken at political demonstrations.



Apple says its system is designed only to detect photos that are in the databases of NCMEC or other child safety organizations, and that the way its cryptography is built prevents it from being used for other purposes.



Apple also insists that the feature will not violate user privacy. The iPhone maker says it has shown the system to cryptographers to verify that it can detect illegal child exploitation material without compromising user privacy.



On the same occasion, Apple also unveiled a series of other features aimed at protecting children from predators. For one, Apple will use machine learning to blur pictures that may contain nude content on a child’s iPhone, and parents can choose to receive an alert if a child under 13 receives sexually explicit content in iMessage.
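As a rough illustration of the blurring idea only, and not Apple’s actual implementation (whose model and APIs are not public), the following Swift sketch blurs an image that a hypothetical classifier has flagged; imageMayContainNudity is a placeholder name introduced here.

    import CoreImage
    import CoreImage.CIFilterBuiltins

    // Placeholder for the on-device classifier; a real implementation would
    // run a machine-learning model here. Returning false keeps the sketch inert.
    func imageMayContainNudity(_ image: CIImage) -> Bool {
        return false
    }

    // Blur an incoming picture if it is flagged, mirroring the Messages
    // behavior described above.
    func blurIfSensitive(_ image: CIImage) -> CIImage {
        guard imageMayContainNudity(image) else { return image }
        let blur = CIFilter.gaussianBlur()
        blur.inputImage = image
        blur.radius = 30
        return blur.outputImage ?? image
    }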