Apple will not give in to governments over NeuralMatch


Going back to yesterday’s story, we had established that privacy advocates have raised many concerns about Apple’s new software, NeuralMatch, which is aimed at combating sexual abuse and child pornography. As you may recall, the main argument revolves around the possibility that Apple is building a backdoor that could be turned into a surveillance tool by authoritarian governments.

However, Apple has stepped up to calm the waters and clarify the situation. In an official question-and-answer document, the Cupertino company clarified, among other things, that “this technology is limited to detecting CSAM (child sexual abuse material) stored in iCloud and we will not agree to the request of any government to expand it.”

Civil liberties organizations such as the Electronic Frontier Foundation have pointed out that one of the first technologies of this type was repurposed to create a database of “terrorist content,” to which companies can contribute samples in order to ban that kind of material. So what stops Apple’s project from suffering the same fate?

Apple promises that this will not be the case

In the document published by the company, Apple argues that it has safeguards in place to prevent its systems from being used to detect anything other than images of sexual abuse. It says its list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations. Additionally, reports to the authorities are not automated; a human will review each case individually.
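To make that design concrete, here is a minimal, purely illustrative Python sketch of matching photos against a fixed list of known hashes, with human review as the only outcome of a match. Apple’s actual system reportedly uses a perceptual hash (NeuralHash) and cryptographic matching protocols rather than a plain SHA-256 lookup, and every function and type name below is hypothetical.

```python
import hashlib
from dataclasses import dataclass
from typing import Optional, Set


@dataclass
class MatchReport:
    photo_id: str
    needs_human_review: bool  # per the FAQ, no automated report to authorities


def load_known_csam_hashes() -> Set[str]:
    # In the real system the list is supplied by NCMEC and other
    # child-safety organizations; this placeholder returns an empty set.
    return set()


def scan_photo(photo_id: str, photo_bytes: bytes,
               known_hashes: Set[str]) -> Optional[MatchReport]:
    """Flag a photo only if its hash matches a known-CSAM entry."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    if digest in known_hashes:
        # A match is only queued for individual human review,
        # never reported automatically.
        return MatchReport(photo_id=photo_id, needs_human_review=True)
    return None  # anything outside the database is never flagged
```

The key property this sketch illustrates is that the system can only recognize images already in the database; it cannot classify novel content on its own.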


“As a result, the system is designed only to report photos that are known CSAM in iCloud Photos,” said the American company.

Apple indicates that it will reject any government’s demands to add images to the database that do not fall into this category. On top of that, the company takes pride in having firmly rejected past government demands to implement features that degrade user privacy.

On the other side of the coin, there are those who argue that the latter is not entirely true. As evidence, they point to the fact that the brand sells iPhones with FaceTime disabled in countries that do not allow encrypted phone calls; not to mention that infamous moment when Apple bent the knee to China by removing thousands of apps from the App Store.

However, the document fails to address some concerns about the feature that scans messages for sexually explicit material. The feature supposedly does not share any information with Apple or the police, but the document does not explain how it ensures that the tool’s focus remains solely on sexually explicit images.

Lastly, there is a concern that the algorithm will misclassify content because, as is well known, machine learning technology takes time to mature.
