Apple takes steps to combat child sexual abuse


Apple has announced measures aimed at protecting minors from sexual abuse and stopping the circulation of child sexual abuse material on its services and devices. The measures will arrive with iOS 15, as well as macOS 12, iPadOS 15, and watchOS 8.

The goal is undoubtedly noble, but the method is not without controversy because of what it sacrifices in terms of privacy, an important issue for a company that presents itself as an advocate for the privacy of its users.

NeuralMatch, Apple’s tool to detect child abuse

At the center of these measures is NeuralMatch, a tool that uses machine learning to scan photos stored on Apple devices for known child sexual abuse material.

The scope of this tool also extends to content stored in iCloud. If material of this type is detected, it is reported to the relevant body in the United States, specifically the National Center for Missing and Exploited Children (NCMEC).

In addition, when content is detected locally, the flagged photos will be reviewed by an Apple employee, who will verify that the material truly qualifies as child sexual abuse imagery before a report goes to NCMEC.
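To make the mechanism more concrete, here is a minimal, hypothetical sketch of hash-based matching, written in Swift purely for illustration. It is a deliberate simplification: Apple's actual system is reported to use a perceptual hash ("NeuralHash") and cryptographic private set intersection rather than plain SHA-256 lookups, and the names below (knownFingerprints, shouldFlag) are invented for this example.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration of hash-based photo matching.
// Apple's real pipeline reportedly uses a perceptual NeuralHash plus
// private set intersection, not plain SHA-256 lookups like this.

/// Fingerprints of known abusive images, as distributed by a clearinghouse
/// such as NCMEC (represented here as hex-encoded SHA-256 digests).
let knownFingerprints: Set<String> = [
    "3f79bb7b435b05321651daefd374cd21placeholder0000000000000000000000"
]

/// Compute a fingerprint for a photo's raw bytes.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the photo matches a known fingerprint and should be
/// escalated for human review before any report is filed.
func shouldFlag(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```

Note that an exact hash like SHA-256 only catches bit-identical copies; the point of a perceptual hash such as NeuralHash is to also match resized or slightly edited versions of the same image.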

This is naturally where the controversy begins, but Erik Neuenschwander, Apple’s head of user privacy, has assured that NeuralMatch is not intrusive and that only users who store “a collection of photos of child abuse” will run into problems.


The second measure against child sexual abuse concerns communications. iMessage will blur sexually explicit images received through the service, and it will also be possible to add an extra layer of protection that informs parents if the minor chooses to view such a photo.

Finally, Siri and Search will provide information to help users handle these situations and stay safe online: they will point to where cases of child exploitation can be reported and will warn the user when searches related to the subject are made.

What is the controversy then? 

All of these measures serve a just cause, and they are not exclusive to Apple: Google has long used hashing to identify child abuse imagery, and Facebook, Twitter, and Reddit make similar efforts. So what should those who have nothing to hide be afraid of?

The fear is that Apple is effectively building a backdoor into its data storage and messaging systems, to say nothing of the damage this would do to end-to-end encryption.

By simply changing a few parameters, the scanning could be extended to other content and other purposes. All it would take to widen the narrow backdoor Apple is building is an expansion of the machine-learning parameters to search for additional types of content, and not only in children’s accounts but in anyone’s.
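The critics' argument can be seen in the same hypothetical sketch from above: the matching code is indifferent to what its fingerprint list represents, so swapping in a different list, or applying it to different accounts, changes what gets flagged without touching the detection logic itself. The fingerprint sets named below are invented for illustration.

```swift
// Continuing the hypothetical sketch above: the matcher does not know
// what its database represents. Point it at a different fingerprint set
// and the same code flags whatever that set happens to describe.
func shouldFlag(_ imageData: Data, against fingerprints: Set<String>) -> Bool {
    fingerprints.contains(fingerprint(of: imageData))
}

// Same mechanism, different policy (hypothetical sets):
// shouldFlag(photoData, against: csamFingerprints)    // the stated purpose
// shouldFlag(photoData, against: protestFingerprints) // the feared expansion
```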

Privacy is King - Apple

This is where the debate arises: the use such a system could be put to in authoritarian countries, where political dissent is not tolerated. What would it take for Apple to give in to external pressure? What would it take for the search to extend to images of violence, protests, or weapons? That is the dilemma.

Many believe the US government will inevitably ask to expand these tools to help fight terrorism, and from there it would only be a matter of time before other nations did the same.

Again, the goal is noble; the means, less so. Only time will tell whether the concerns of online-freedom advocates are unfounded or whether they end up being realized.
