Apple To Start Scanning iPhone Photos As Part Of New Child Safety Measures

Apple’s new child safety measures include a tool that performs on-device scans to identify known child abuse images before reporting them to the authorities.

Apple is introducing new tech on iPhones and iPads that will perform an on-device scan to look for Child Sexual Abuse Material (CSAM) images and report them to the National Center for Missing and Exploited Children (NCMEC) to protect children. Apple has historically taken a hardline stance against letting anyone peer into an iPhone and go through the data stored on it. The company has even resisted pressure from law enforcement authorities to give them backdoor access, and it keeps strengthening the core security protections at the OS level to thwart break-in attempts by sophisticated unlocking tools like GrayKey.

The company has lately started promoting its pro-privacy stance even more vigorously, which includes everything from sending notifications about apps that could potentially be a security threat to requiring apps to explicitly ask users for certain data-access and tracking permissions. In fact, the company has even made it mandatory for developers to disclose, in detail, all the data their apps collect and how it is shared, via privacy “nutrition labels.” Bundling the whole push under the App Tracking Transparency initiative, Apple is trying to create the image of an ecosystem where users get to choose between keeping their digital footprint to a bare minimum and allowing tracking to see personalized ads. It’s all about privacy and choice, or so says Apple.

That’s what makes Apple’s latest child safety measure a tad controversial, despite all its promises to keep privacy intact. Apple’s new system will be baked into iOS and iPadOS, and it will only perform its child abuse imagery scan on pictures stored in iCloud Photos. However, the image scanning itself won’t happen in the cloud. Instead, it will be an on-device process. Before an image is stored in iCloud, an on-device scan will be run against a set of CSAM image hashes created by NCMEC and a handful of other child safety organizations. This entire database of CSAM image hashes is turned into an unreadable set of hashes using cryptography and is stored on the device itself. The CSAM detection tool is due to arrive with the iOS 15 and iPadOS 15 updates that are set for release later this year.
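For readers curious about the mechanics, the snippet below is a minimal Python sketch of the on-device matching step described above. It is not Apple’s actual NeuralHash or private set intersection implementation; the perceptual_hash() function and the KNOWN_CSAM_HASHES set are hypothetical stand-ins, with an ordinary SHA-256 digest filling in for the real perceptual hash.

```python
# Minimal sketch of on-device hash matching (illustrative only, assuming a
# simple plaintext hash set; Apple's real database is blinded and unreadable).
import hashlib

# Hypothetical stand-in for the on-device database derived from NCMEC hashes.
KNOWN_CSAM_HASHES = {
    "placeholder-hash-value",  # real entries would be perceptual hashes
}

def perceptual_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.

    A real perceptual hash maps visually similar images to the same value;
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches a known hash and would be flagged."""
    return perceptual_hash(image_bytes) in KNOWN_CSAM_HASHES

print(should_flag_before_upload(b"example image bytes"))  # -> False
```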

Apple Assures User Privacy Remains Intact

If a match happens against the CSAM image markers, the result is instantly converted into what Apple calls a cryptographic safety voucher, which is then uploaded to the cloud. This is where another process, threshold secret sharing, kicks into action. Only when the number of flagged matches for an account reaches a threshold of known CSAM content is Apple able to interpret the contents and subsequently perform a human review to ensure that those images are indeed in violation of CSAM guidelines. Once the confirmation is done, Apple will take punitive action such as disabling the user’s account and sending a report to law enforcement authorities. Of course, users will have the option to file an appeal if they believe their account was flagged in error.
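In broad strokes, the server side behaves like a counter that only unlocks review once enough matches accumulate for a single account. The Python sketch below models only that counting logic; the SafetyVoucher shape and the MATCH_THRESHOLD value are assumptions for illustration, and in the real system the voucher payloads remain cryptographically unreadable below the threshold rather than simply unread.

```python
# Simplified model of the threshold gate (assumed names and values; the real
# mechanism relies on threshold secret sharing, not a plain counter).
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # illustrative figure, not a number confirmed by Apple

@dataclass
class SafetyVoucher:
    account_id: str
    encrypted_payload: bytes  # opaque to the server below the threshold

def ready_for_human_review(vouchers: list[SafetyVoucher]) -> bool:
    """Payloads can only be decrypted and reviewed once an account has
    accumulated at least MATCH_THRESHOLD matching vouchers."""
    return len(vouchers) >= MATCH_THRESHOLD

# Example: five flagged vouchers are nowhere near the threshold.
vouchers = [SafetyVoucher("account-1", b"opaque") for _ in range(5)]
print(ready_for_human_review(vouchers))  # -> False
```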

Yes, there are privacy concerns, but Apple has tried to answer a few of the burning questions in its technical paper on CSAM detection. Apple cannot read images that do not match the CSAM hashes. The company won’t be able to access the metadata or visual derivatives of a flagged image unless and until the number of matching pictures reaches the threshold, which is when the human review process kicks in for a more accurate assessment of the flagged media. Apple says the accuracy of its CSAM detection system is extremely high, and the human review process further adds to its credibility. Moreover, users won’t be able to identify which image has been flagged by Apple’s detection system, nor will they be able to access the database of potentially harmful pictures.
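The “unreadable until the threshold is met” guarantee comes from threshold secret sharing. The toy Shamir-style example below is not Apple’s construction, but it shows the underlying idea: with fewer shares than the threshold, reconstruction produces nothing useful.

```python
# Toy Shamir-style secret sharing over a prime field (illustrative only).
import random

PRIME = 2**61 - 1  # large prime so all arithmetic happens in a finite field

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree
    threshold - 1 whose constant term is the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; only recovers the secret when at
    least `threshold` distinct shares are combined."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, num_shares=5)
print(reconstruct(shares[:3]))  # 123456789 -- enough shares, secret recovered
print(reconstruct(shares[:2]))  # a meaningless value -- below the threshold
```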

Source: https://screenrant.com/apple-iphone-scanning-photos-csam-content-icloud-flags-confirmed/
