Apple delays child abuse photo scanning planned for iOS 15
Date:
Fri, 03 Sep 2021 19:36:44 +0000
Description:
Apple has delayed its CSAM-scanning rollout by several months, meaning the feature may not arrive until late 2021 or 2022.
FULL STORY ======================================================================
Apple announced a new Child Safety policy last month to automatically scan user photos for child sexual abuse material (CSAM), spurring an outcry from privacy advocates and consumers over privacy rights violations and potential government exploitation. Now Apple is delaying the rollout of the tech to solicit feedback over the coming months before a full release.
Apple previously planned to include its CSAM-scanning tech, along with an accompanying optional policy to screen sexual content in iMessages for youth, in iOS 15 and iPadOS 15, which are expected to launch alongside the iPhone 13 (rumored to be unveiled on September 14). It would have gone live in the US, with no stated plans for a global rollout. Here's Apple's full statement on the delay, per TechCrunch:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
Shortly after introducing the new policies in early August via a blog post, Apple followed up with a multi-page FAQ giving detailed explanations of how both the CSAM scanning and the youth iMessage screening would work.
Apple planned to use its so-called NeuralHash tech to automatically scan photos and check whether they matched hashes of known CSAM. The tech only scanned images as they were being uploaded to iCloud (which is encrypted).
But the potential for governments to harness the automatic photo-scanning policy for their own uses alarmed privacy advocates and industry groups: the Electronic Frontier Foundation (EFF) criticized the company for building any kind of backdoor into user data, while the Center for Democracy and Technology (CDT) amassed a coalition decrying how such photo scanning could be abused by governments searching for objectionable material.
The CDT also laid out how another policy Apple planned to roll out alongside CSAM photo scanning (an optional feature in iMessage that blurs images with sexual content sent to users under 13 years old and notifies parents linked to the same family account) could threaten the safety and wellbeing of some young people; LGBTQ+ youths with unsympathetic parents are particularly at risk.
Finally, Apple was also going to enable Siri and Search to offer more helpful resources for users asking how to report CSAM, as well as to intervene with warnings and supportive resources when users search for CSAM-related material. It's unclear whether this will also be delayed.
Analysis: a step back for Apple, a step forward for privacy
The groups and individuals objecting to Apple's new policy have criticized the tech giant's methods, not its intent. In addition to opposing how it would violate user privacy and open a backdoor for government exploitation, they critiqued the potential for false positives in the CSAM scanning itself.
For instance, Apple outlined that its employees wouldn't see any images uploaded to iCloud that had been automatically scanned unless an account passed a CSAM hash threshold; in other words, unless an image's hash (a digital fingerprint of letters and numbers) found a match in a database of known CSAM.
While hash matching is a method used by, for instance, Microsoft for its PhotoDNA tech, website security company Cloudflare, and the anti-child sex trafficking nonprofit Thorn, security researchers reportedly replicated Apple's NeuralHash code and were able to generate a hash collision, where two visibly different images produced the same hash, according to TechCrunch.
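The threshold-based matching described above can be sketched in a few lines. This is a toy illustration only, not Apple's implementation: NeuralHash is a perceptual hash designed to tolerate small image edits, whereas this sketch uses an exact SHA-256 match for simplicity, and the database contents and threshold value here are made up.

```python
import hashlib

# Hypothetical database of known-bad image hashes. Real systems hold
# perceptual hashes supplied by child-safety organizations; these are
# just SHA-256 digests of placeholder byte strings.
KNOWN_HASHES = {hashlib.sha256(s).hexdigest() for s in (b"bad-1", b"bad-2")}

# Illustrative threshold: no account is flagged for human review until
# this many of its uploads match the database.
MATCH_THRESHOLD = 2

def scan_upload(images: list[bytes]) -> bool:
    """Return True only if enough uploads match known hashes."""
    matches = sum(
        1 for img in images
        if hashlib.sha256(img).hexdigest() in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD

# A single match (or none) stays below the threshold and is not flagged.
print(scan_upload([b"vacation", b"bad-1"]))        # → False
print(scan_upload([b"bad-1", b"bad-2", b"pet"]))   # → True
```

The threshold is the key privacy mechanism in the dispute: individual matches reveal nothing to reviewers, but the collision result above showed that an exact-match assumption like this one does not hold for NeuralHash, since distinct images can share a hash.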
While we won't know the true efficacy of Apple's Child Safety protocols until they debut, it seems Apple is taking the criticism and concerns seriously enough to spend some months refining its approach, meaning we may not see the features roll out until the end of 2021 or in 2022.
Apple has been scanning iCloud Mail for CSAM since 2019
======================================================================
Link to news story:
https://www.techradar.com/news/apple-delays-child-abuse-photo-scanning-planned-for-ios-15/
--- Mystic BBS v1.12 A47 (Linux/64)
* Origin: tqwNet Technology News (1337:1/100)