Apple Officially Ditches Plan to Scan iCloud for Child Abuse Images



Photo: Anton_Ivanov (Shutterstock)

Apple has officially killed one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (CSAM).

Yes, last summer, Apple announced that it would be rolling out on-device scanning: a new iOS feature that would quietly compare the photos in a user's library against hashes of known CSAM. The feature was designed so that, should the scanner find matches, it would flag them for human reviewers, who would then presumably alert the police.
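For a sense of how that kind of pipeline might be structured, here is a minimal, hypothetical Swift sketch of on-device hash matching with a human-review threshold. The type and function names, the use of a plain SHA-256 digest, and the threshold of 30 matches are all illustrative assumptions; Apple's actual proposal relied on a perceptual hash ("NeuralHash") and cryptographic safety vouchers, not a straightforward digest comparison like this.

import CryptoKit
import Foundation

// Hypothetical sketch only: fingerprint each photo, compare against a set of
// known-bad hashes, and flag the account for human review only once the
// number of matches crosses a threshold.

struct ScanResult {
    let matchedFingerprints: [String]
    // Review is triggered only past a threshold, to reduce false positives.
    var needsHumanReview: Bool { matchedFingerprints.count >= 30 }
}

// Stand-in fingerprint. A real system would need a perceptual hash that
// survives resizing and re-encoding; a cryptographic digest does not.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

func scanLibrary(_ photos: [Data], against knownHashes: Set<String>) -> ScanResult {
    let matches = photos.map(fingerprint(of:)).filter(knownHashes.contains)
    return ScanResult(matchedFingerprints: matches)
}

// Example: an empty hash database produces no matches and no review flag.
let result = scanLibrary([Data("example".utf8)], against: [])
assert(result.needsHumanReview == false)

Critics' core objection maps directly onto a design like this: whoever controls the hash database controls what the scanner looks for.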

The plan immediately inspired a torrential backlash from privacy and security experts, with critics arguing that the scanning feature could ultimately be re-purposed to hunt for other kinds of content. Even having such scanning capabilities in iOS was a slippery slope towards broader surveillance abuses, critics alleged, and the general consensus was that the tool could quickly become a backdoor for police.

At the time, Apple fought hard against these criticisms, but it ultimately relented. Not long after initially announcing the feature, the company said it would “postpone” implementation until a later date.

Now, it looks like that date will never come. On Wednesday, amidst announcements for a bevy of new iCloud security features, the company also revealed that it would not be moving forward with its plans for on-device scanning. In a statement shared with Wired magazine, Apple made it clear that it had decided to take a different route:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

Apple’s plans seemed well-intentioned. The digital proliferation of CSAM is a major problem, and experts say it has only gotten worse in recent years. An effort to address that problem was obviously a good thing. That said, the underlying technology Apple proposed, given the surveillance dangers it posed, just wasn’t the right tool for the job.



