    From TechnologyDaily@1337:1/100 to All on Sat Aug 7 20:15:03 2021
    Apple Child Safety photo scanning: what you need to know

    Date:
    Sat, 07 Aug 2021 19:00:50 +0000

    Description:
    Apple's Child Safety initiative will automatically scan phones for CSAM content, but how will it affect you? Read on for details.

    FULL STORY ======================================================================

    Apple announced that it would be enacting a new protocol: automatically scanning iPhones and iPads to check user photos for child sexual abuse material (CSAM). The company is doing this to limit the spread of CSAM, but it is also adding other features to protect children from predators who use communication tools to recruit and exploit them, Apple explained in a blog post. For now, the features will only be available in the US.

    Apple will institute a new feature in iOS 15 and iPadOS 15 (both expected to
    launch in the next couple of months) that will automatically scan images on a
    user's device to see if they match previously identified CSAM content, which is
    recognized by unique hashes (i.e. a set of numbers consistent between duplicate
    images, like a digital fingerprint).
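
    In practice, that kind of matching amounts to computing a fingerprint for each
    photo and checking it against a set of known fingerprints. The sketch below is
    a deliberate simplification: it uses an ordinary SHA-256 digest rather than
    Apple's actual perceptual-hashing system, and the hash set and function names
    are hypothetical.

    import hashlib
    from pathlib import Path

    # Hypothetical set of known-CSAM fingerprints (hex digests) supplied by a
    # child-safety organization. Real systems use perceptual hashes, which
    # survive resizing and recompression; SHA-256 here is only illustrative.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_fingerprint(path: Path) -> str:
        """Hex digest of the raw file bytes (stand-in for a perceptual hash)."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def matches_known_material(path: Path) -> bool:
        """True if this photo's fingerprint appears in the known-hash set."""
        return image_fingerprint(path) in KNOWN_HASHES

    A cryptographic digest like SHA-256 changes completely if an image is resized
    or re-saved, which is why production systems rely on perceptual hashes that
    stay stable across such edits; the comparison step, however, works the same
    way.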

    Checking hashes is a common method for detecting CSAM; website security
    company Cloudflare instituted it in 2019, and it is used by the
    anti-child-sex-trafficking nonprofit Thorn, the organization co-founded by
    Ashton Kutcher and Demi Moore.

    In addition, Apple has added two systems that parents can optionally enable
    for children in their family network: first, on-device analysis in the
    Messages app that scans incoming and outgoing photos for material that might
    be sexually explicit, which will be blurred by default; and second, an
    optional setting that can inform account-linked parents if the content is
    viewed.
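
    Apple hasn't published how the Messages check is wired up; the snippet below
    merely sketches the decision flow that description implies, with invented
    helper and setting names standing in for whatever Apple actually uses.

    from dataclasses import dataclass

    @dataclass
    class ChildAccountSettings:
        # Hypothetical per-child switches a parent could enable in Family Sharing.
        blur_explicit_photos: bool = True
        notify_parent_on_view: bool = False

    def handle_incoming_photo(photo, settings: ChildAccountSettings,
                              looks_explicit) -> str:
        """Decision flow implied by the description above (names are invented).

        `looks_explicit` stands in for the on-device ML check; nothing is sent
        off the phone in this step.
        """
        if not settings.blur_explicit_photos or not looks_explicit(photo):
            return "show"
        if settings.notify_parent_on_view:
            # The parent is told only if the child taps through to view the photo.
            return "blur, warn, and notify parent if viewed"
        return "blur and warn"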

    Apple is also enabling Siri and Search to surface helpful resources if a user
    asks about reporting CSAM; both will also intervene when users perform
    searches related to CSAM, informing the searcher of the material's harmful
    potential and pointing them toward resources for getting help.

    That's an overview of how, by Apple's own description, it will integrate
    software to track CSAM and help protect children from predation by
    intervening when they receive (and send) potentially inappropriate photos.
    But the prospect of Apple automatically scanning your material has already
    raised concerns from tech experts and privacy advocates; we'll dive into that
    below.

    (Image credit: Apple)

    Will this affect me?

    If you do not have photos with CSAM on your iPhone or iPad, nothing will change for you.

    If you do not make a Siri inquiry or online search related to CSAM, nothing will change for you.

    If your iPhone or iPad's account is set up with a family in iCloud and your
    device is designated as a child's in that network, you will see warnings and
    blurred photos should you receive sexually explicit photos. If your device
    isn't linked to a family network as belonging to a child, nothing will change
    for you.

    Lastly, your device won't get any of these features if you don't upgrade to
    iOS 15, iPadOS 15, or macOS Monterey. (The latter will presumably scan iCloud
    photos for CSAM, but it's unclear if the Messages intervention for sexually
    explicit photos will also happen when macOS Monterey users use the app.)

    These updates are only coming to users in the US, and it's unclear when (or
    if) they'll be expanded elsewhere; but given that Apple is positioning these
    as protective measures, we'd be surprised if it didn't extend them to users
    in other countries.

    (Image credit: Apple)

    Why is Apple doing this?

    From a moral perspective, Apple is simply empowering parents to protect their
    children and perform a societal service by curbing CSAM. As the company
    stated in its blog post, "this program is ambitious, and protecting children
    is an important responsibility."

    Apple has repeatedly championed the privacy features of its devices, and
    backs that up with measures like maximizing on-device analysis (rather than uploading data to company servers in the cloud) and secure end-to-end encrypted communications, as well as initiatives like App Tracking Transparency that debuted in iOS 14.5.

    But Apple has also been on the receiving end of plenty of lawsuits over the
    years that have seemingly pushed the company toward greater privacy
    protections. For instance, a consumer rights advocate in the EU sued the tech
    giant in November 2020 over Apple's practice of assigning each iPhone an
    Identifier for Advertisers (IDFA) to track users across apps, per The
    Guardian. This may have nudged Apple to give consumers more control with App
    Tracking Transparency, or at least aligned with the company's actions already
    in progress.

    TechRadar couldn't find a particular lawsuit that would have pressured Apple
    to institute these changes, but it's entirely possible that the company is
    proactively protecting itself by giving younger users more self-protection
    tools, as well as by eliminating CSAM on its own iCloud servers and on
    iPhones in general, all of which could conceivably limit Apple's liability in
    the future.

    But if you can remove CSAM, why wouldn't you?

    (Image credit: Apple)

    What do security researchers think?

    Soon after Apple introduced its new initiatives, security experts and privacy
    advocates spoke up in alarm, not, of course, to defend CSAM, but out of
    concern for Apple's methods of detecting it on user devices.

    The CSAM-scanning feature does not appear to be optional: it will almost
    certainly be included in iOS 15 by default and, once installed, will be
    inextricable from the operating system. From there, it automatically scans a
    user's photos on their device before they're uploaded to an iCloud account.
    If enough of a photo matches those CSAM hashes during a scan, Apple manually
    reviews the flagged image and, if reviewers determine it to be valid CSAM,
    the user's account is shut down and their information is passed along to the
    National Center for Missing and Exploited Children (NCMEC), which
    collaborates with law enforcement.
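
    Apple hasn't published the exact pipeline, but the flow described above has
    roughly this shape: match photos on the device before upload, and escalate to
    human review only once matches accumulate past a threshold. Everything in the
    sketch below, including the threshold value and function names, is a
    placeholder rather than Apple's implementation.

    MATCH_THRESHOLD = 30  # Placeholder; Apple has not published the real value.

    def scan_before_upload(photos, is_known_csam, flag_for_human_review):
        """Rough shape of the flow described above (all names are invented).

        `is_known_csam` stands in for the on-device hash match, and
        `flag_for_human_review` for Apple's escalation step; any report to
        NCMEC or account action happens only after that human review.
        """
        matches = [photo for photo in photos if is_known_csam(photo)]
        if len(matches) >= MATCH_THRESHOLD:
            flag_for_human_review(matches)
        return len(matches)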

    Apple is being very careful to keep user data encrypted and unreadable by
    company employees unless it breaches a threshold of similarity with known
    CSAM. And per Apple, the threshold is "set to provide an extremely high level
    of accuracy and ensures less than a one in one trillion chance per year of
    incorrectly flagging a given account."
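
    One way to build intuition for a number that small: if each individual false
    match were a rare, independent event, requiring many matches before an
    account is ever flagged multiplies those small probabilities together. The
    figures below are purely illustrative and are not Apple's published
    parameters.

    from math import exp, lgamma, log, log1p

    def binom_tail(n: int, p: float, threshold: int) -> float:
        """P(at least `threshold` false matches out of `n` photos), assuming each
        photo is falsely matched independently with probability `p`.

        Computed in log space so the huge binomial coefficients and tiny powers
        don't overflow or underflow ordinary floats.
        """
        total = 0.0
        for k in range(threshold, n + 1):
            log_term = (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
                        + k * log(p) + (n - k) * log1p(-p))
            term = exp(log_term)
            total += term
            if term < total * 1e-18:  # remaining terms are negligible
                break
        return total

    # Illustrative only: a 1-in-a-million per-photo error rate, 10,000 photos,
    # and a 10-match threshold give odds of roughly 3e-27, far rarer than one
    # in a trillion (1e-12).
    print(binom_tail(10_000, 1e-6, 10))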

    But it's the automatic scanning that has privacy advocates up in arms. "A
    backdoor is a backdoor," digital privacy nonprofit Electronic Frontier
    Foundation (EFF) wrote in its blog post responding to Apple's initiative,
    reasoning that even adding this auto-scanning tech was opening the door to
    potentially broader abuses of access:

    "All it would take to widen the narrow backdoor that Apple is building is an
    expansion of the machine learning parameters to look for additional types of
    content, or a tweak of the configuration flags to scan, not just children's,
    but anyone's accounts. That's not a slippery slope; that's a fully built
    system just waiting for external pressure to make the slightest change," the
    EFF wrote, pointing to laws passed in other countries that require platforms
    to scan user content, like India's recent 2021 rules.

    Others in the tech industry have likewise pushed back against Apple's
    auto-scanning initiative, including Will Cathcart, head of the Facebook-owned
    WhatsApp messaging service. In a Twitter thread posted on August 6, 2021, he
    pointed to WhatsApp's practice of making it easier for users to flag CSAM,
    which he claimed led the service to report over 400,000 cases to NCMEC last
    year, all without breaking encryption. "This is an Apple built and operated
    surveillance system that could very easily be used to scan private content
    for anything they or a government decides it wants to control," Cathcart
    wrote. "Countries where iPhones are sold will have different definitions on
    what is acceptable."

    In fairness, Facebook has been trying to get around Apple's App Tracking
    Transparency: after being forced to disclose how much user data its mobile
    app (and WhatsApp's app) accesses, Facebook has tried prompting users to
    allow that access while criticizing Apple over the harm App Tracking
    Transparency does to small businesses (and, presumably, Facebook) that rely
    on advertising income.

    Other tech experts are waiting for Apple to give more information before they
    fully side with the EFF's view.

    "The EFF and other privacy advocates' concern around misuse by authoritarian
    regimes may be scarily on point or an overreaction -- Apple needs to provide
    more implementation details," Avi Greengart, founder of tech research and
    analysis firm Techsponential, told TechRadar via Twitter message. "However,
    as a parent, I do like the idea that iMessage will flag underage sexting
    before sending; anything that even temporarily slows the process down and
    gives kids a chance to think about consequences is a good thing."



    ======================================================================
    Link to news story: https://www.techradar.com/news/apple-child-safety-photo-scanning-what-you-need-to-know/


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)