US-based technology giant Apple has announced a new and important decision in the fight against child abuse, one of the biggest dangers facing children worldwide. According to the company's statements, iPhones will now be scanned automatically to detect whether they contain child abuse images. However, the system will only apply to photos that are automatically uploaded to iCloud.
If Apple's new system detects an image of child abuse on an iPhone, law enforcement will be notified directly. In this way, Apple hopes to help prevent at least some child abuse. However, Apple's security system, called "CSAM," has already begun to draw criticism.
It is thought that the feature may not be limited to child abuse
The system, named CSAM, which Apple will launch towards the end of the year, will include a database compiled by various child safety agencies, notably the US National Center for Missing and Exploited Children. This database will contain images of children who have previously been sexually abused. The system will compare photos uploaded to iCloud against the database. If a match is found, the legal process will begin automatically. Apple states that the error rate of the system it has developed is one in a trillion, and that no one will be falsely accused because of this feature.
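To make the matching step concrete, here is a minimal, hypothetical sketch in Swift. Apple's actual system is reported to use a perceptual "NeuralHash" and cryptographic private set intersection; this sketch substitutes a plain SHA-256 digest and an in-memory hash set purely for illustration, so every name and value below is an assumption, not Apple's API.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a plain SHA-256 digest stands in for the
// perceptual hash Apple is reported to use, and a local Set stands in
// for the child-safety database. None of this is Apple's real code.

/// Stand-in for a perceptual hash: digests the raw image bytes.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Hypothetical database of hashes of known abuse imagery, as would be
/// compiled by child-safety organizations (dummy value here).
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]

/// Compares each photo queued for iCloud upload against the database
/// and reports whether the match count crosses a review threshold.
func scanUploads(_ photos: [Data], threshold: Int = 1) -> Bool {
    let matches = photos.filter { knownHashes.contains(imageHash($0)) }.count
    return matches >= threshold
}

// Example: scan two dummy "photos" queued for upload.
let photos = [Data("photo-a".utf8), Data("photo-b".utf8)]
print(scanUploads(photos) ? "flag for review" : "no match")
```

Note that only hashes are compared, never the photos themselves; the one-in-a-trillion error rate Apple cites would apply to the hashing and thresholding scheme as a whole, not to a simple lookup like this one.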
Those who oppose the feature do not believe it will only ever be used to detect child abuse. Skeptics worry that governments may eventually repurpose the system for other kinds of surveillance, and that in such a case user privacy could be put at risk. However, Apple states that there will be no problem in this regard and says it will continue to protect user privacy.