Apple recently announced a new security system designed to combat child abuse. The system works by scanning photos uploaded to iCloud and comparing them against a database of known child abuse images; when a match is found, the case is reported to the authorities.
However, in a recent statement, Apple clarified that only the 'necessary' individuals will be reported, which opened the discussion from a different angle. Although the system was introduced to serve an extremely valuable purpose, it has been the target of criticism since the day it was announced. Prominent figures in the technology world believe the system could be repurposed in ways that harm individual privacy and security. Apple, for its part, has been defending itself against these claims from the moment the debate started.
‘How the system works has been misunderstood’:
The latest person to comment on the matter was Craig Federighi, head of Apple's software engineering team. In an interview with The Wall Street Journal on the subject, Federighi said the criticisms of the system stem from a misunderstanding of how it works.
In the broadest summary, Federighi said in his interview that Apple is highly sensitive about the privacy of personal data, that the privacy of all personal data is under Apple's guarantee, and that the system is protected against abuse.
He also emphasized once again that the system will not scan all photos uploaded to iCloud: rather than scanning the photos themselves, the process consists of comparing codes (fingerprints) derived from the images against a database of known child abuse images. Previous Apple statements had also pointed out that the phrase 'scanning photos' was the point being misunderstood.
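The matching process described above can be sketched in code. This is a deliberately simplified illustration, not Apple's actual implementation: the function names and sample data are made up, and a plain cryptographic hash (SHA-256) stands in for Apple's perceptual NeuralHash, which, unlike SHA-256, can tolerate resizing and re-encoding of an image.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Illustrative stand-in: a cryptographic hash of the raw bytes.
    # Apple's real system derives a perceptual hash (NeuralHash) instead.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, known_hashes: set) -> bool:
    # The photo's content is never inspected directly; only its
    # fingerprint is compared against fingerprints of known images.
    return fingerprint(image_bytes) in known_hashes

# Hypothetical usage with made-up data:
known = {fingerprint(b"bytes-of-a-known-flagged-image")}
print(matches_known_database(b"bytes-of-a-known-flagged-image", known))  # True
print(matches_known_database(b"an-ordinary-holiday-photo", known))       # False
```

The key design point this sketch captures is the one Federighi makes: the check is a lookup against fingerprints of already-known images, not an analysis of what any given photo depicts.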
Federighi's statement on the situation: "If you look at other cloud services, you'll see that they scan and analyze every photo one by one; we wanted to find a way to do this without looking at people's photos. It's not a scan asking questions like 'Is there a photo of your child in the bathtub?' or 'Are there any pornographic images?'; it only matches against the fingerprints of known child pornography images."