New Clarification on Scanning Photos from Apple

Apple, which has been working to strengthen the security of its users, last week announced its new CSAM feature to help prevent child abuse. Starting in the final months of this year, the feature will scan photos uploaded to iCloud; when it detects a child abuse image, it will blur the photo and send a notification to the user.

Although Apple emphasizes that this system will only act on photos flagged as child abuse material and that no one will gain access to users' private lives, many worry that it could eventually be expanded into something far more comprehensive. Apple was also expected to report a child abuser to the government as soon as one was detected; a new statement has now made it clear that no such arrangement will be made.

User data will not be shared with the government for any reason


In dozens of countries, including the United States, possessing photographs or images of child abuse carries significant legal penalties. After Apple announced this step to prevent child abuse, many expected the company to share information about users found to be involved in child abuse with the government. However, Apple states that it will not disclose its users for any reason under the CSAM system, which for now will be deployed only within US borders.

When a child abuse photo uploaded to iCloud is detected by Apple's CSAM system, the image is first blurred and a notification is sent to the user. Apple opens an investigation into users who receive multiple notifications and reports only those it deems necessary to the National Center for Missing and Exploited Children.
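For illustration only, the sketch below models the flow the article describes: a match against a list of known images blurs the photo, notifies the user, and repeated matches above a threshold are escalated for review and possible reporting. This is not Apple's implementation; the hash check, the threshold value, and every name here are assumptions made for the example.

```python
# Illustrative sketch of the flow described in the article -- NOT Apple's
# actual CSAM system. All names and values here are hypothetical.

import hashlib
from collections import defaultdict

KNOWN_CSAM_HASHES = {"<placeholder-hash-1>", "<placeholder-hash-2>"}  # hypothetical list
REPORT_THRESHOLD = 3  # hypothetical number of matches before escalation

match_counts = defaultdict(int)  # per-user count of flagged uploads


def blur_photo(photo_bytes: bytes) -> None: ...               # placeholder
def notify_user(user_id: str, message: str) -> None: ...      # placeholder
def open_investigation(user_id: str) -> None: ...             # placeholder


def handle_icloud_upload(user_id: str, photo_bytes: bytes) -> str:
    """Return the action taken for a single uploaded photo."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    if digest not in KNOWN_CSAM_HASHES:
        return "stored"  # no match: photo is stored normally

    # Match: blur the image and notify the user, as the article describes.
    match_counts[user_id] += 1
    blur_photo(photo_bytes)
    notify_user(user_id, "A photo you uploaded was flagged and blurred.")

    if match_counts[user_id] >= REPORT_THRESHOLD:
        # Repeated matches trigger a review; only cases deemed necessary
        # would then be reported to NCMEC.
        open_investigation(user_id)
        return "escalated"
    return "flagged"
```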

The fact that the process is entirely in Apple's hands naturally worries many people, and governments in particular, because governments want to identify people within their borders who are prone to child abuse and take the necessary action. In this context, Apple stated that it will not expand the CSAM system under any circumstances to meet demands from governments. The company added that the system was developed with the sole purpose of detecting photographs of child abuse, and that user data will not be shared with the government in any way, whatever the reason.
