
Apple will review your photos to detect images of child sexual abuse

Apple has created a new system that will analyze photos uploaded to the cloud for child sexual content. The technology will scan images before they are saved to iCloud in order to identify material depicting the sexual abuse of minors.

Although the new system is automatic, the moment it finds a match or an image suspected of depicting abuse, an alert will be triggered and a human reviewer will evaluate the image and, if necessary, report the case to the police.

Yet another attack on privacy

Last week we already discussed the privacy risks of QR code menus, and this news is not far behind. The concern it is generating is alarming, since the same technology could be used to search for any kind of information on our phones.

Apple's new measure will scan your photos to detect images of child abuse

Apple wants to start rolling out this new technology, called NeuralHash, at the end of 2021 on iOS, watchOS, macOS and iPadOS devices. In its release, the company explains that it will introduce “new crypto applications to help limit the spread of online child sexual abuse” while designing for user privacy, and that “detecting online child sexual abuse will help Apple provide valuable information to law enforcement about collections in iCloud Photos.”

How does it work?

NeuralHash compares photographs against a database of known child sexual abuse images maintained by the National Center for Missing and Exploited Children in the United States. The images are converted into numerical codes (hashes); if the hash of a scanned image matches one in the database, the alert is generated.
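
To picture what this kind of matching looks like, here is a minimal, hypothetical sketch in Swift. It is not Apple's actual NeuralHash, which derives its codes from a neural network and relies on cryptographic matching; the toy “average hash” below only illustrates the general idea of converting an image into a numerical code and comparing it against a set of known codes. All names and values are illustrative assumptions.

```swift
import Foundation

// Hypothetical sketch of hash-based image matching; NOT Apple's real NeuralHash.
// A toy "average hash" over an 8x8 grayscale thumbnail stands in for the
// neural-network-derived code the actual system uses.

/// Computes a 64-bit hash: each bit is 1 if that pixel is brighter than the mean.
func averageHash(of pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64, "expects an 8x8 grayscale thumbnail")
    let mean = pixels.reduce(0) { $0 + Int($1) } / pixels.count
    var hash: UInt64 = 0
    for (i, p) in pixels.enumerated() where Int(p) > mean {
        hash |= 1 << UInt64(i)
    }
    return hash
}

/// Number of differing bits between two hashes (Hamming distance).
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Database of known hashes (in reality, opaque codes supplied by NCMEC).
let knownHashes: Set<UInt64> = [0x0F0F_0F0F_F0F0_F0F0]

// A photo about to be uploaded, already reduced to an 8x8 grayscale thumbnail.
let thumbnail = [UInt8](repeating: 40, count: 32) + [UInt8](repeating: 200, count: 32)
let photoHash = averageHash(of: thumbnail)

// An exact or near match (small Hamming distance) would raise the alert
// that a human reviewer then evaluates.
let isMatch = knownHashes.contains { hammingDistance($0, photoHash) <= 4 }
print(isMatch ? "Potential match: flag for human review" : "No match")
```

The point of comparing hashes rather than the photos themselves is that the database never needs to contain the images on the device; only the numerical codes are checked.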

The statement also announces another measure for children's safety: the Messages application will be able to detect sexually explicit images and, if the user is a child, blur them, warning the child about the content of the photograph and notifying their parents as well.

Siri will also be “trained” to help with reports of child sexual abuse: if a user wants to report a case, Siri will guide them step by step through filing the report.

And what do you think of Apple's new measure? 

Does it really only seek to fight online child sexual abuse, or will the situation be used as a door to information theft?
