Apple Officially Ditches Plan to Scan iCloud for Child Abuse Images

Photo: Anton_Ivanov (Shutterstock)

Apple has officially killed one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (CSAM).

Yes, last summer, Apple announced that it would be rolling out on-device scanning, a new iOS feature that would quietly check individual users’ photos for known CSAM. The feature was designed so that, should the scanner find a match, it would flag the material to human reviewers, who would then presumably alert the police.
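To make the mechanics a little more concrete, here is a minimal, purely illustrative sketch in Python of how on-device matching against a list of known-image fingerprints might work, with human review gated behind a match threshold. The hash function, threshold value, and names below are hypothetical assumptions for illustration; this is not Apple’s actual implementation.

```python
# Purely illustrative sketch: hypothetical on-device matching of image
# fingerprints against a set of known-bad hashes, escalating to human
# review only after a threshold number of matches. All names and values
# here are made up for illustration and are not Apple's implementation.
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # hypothetical cutoff before anything is flagged


@dataclass
class ScanState:
    """Per-account counter of matched images."""
    match_count: int = 0


def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a perceptual image hash; real systems use hashes
    designed to survive resizing and re-encoding."""
    return hash(image_bytes)


def scan_photo(image_bytes: bytes, known_bad_hashes: set[int],
               state: ScanState) -> bool:
    """Return True if the account should be escalated to human review."""
    if perceptual_hash(image_bytes) in known_bad_hashes:
        state.match_count += 1
    return state.match_count >= MATCH_THRESHOLD
```

The design choice critics objected to is visible even in this toy version: the matching logic and the list of flaggable hashes live on the device, so what gets flagged depends entirely on what is put into that list.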

The plan immediately drew a fierce backlash from privacy and security experts, with critics arguing that a scanning feature of this kind could eventually be repurposed to hunt for other kinds of content. Building such scanning capability into iOS at all, they alleged, was a slippery slope toward broader surveillance abuses, and the consensus was that the tool could quickly become a backdoor for police.

At the time, Apple pushed back hard against these criticisms, but the company ultimately relented: not long after announcing the feature, it said it would “postpone” the rollout.

Now, it looks like that date will never come. On Wednesday, amid announcements of a bevy of new iCloud security features, the company also revealed that it would not be moving forward with its plans for on-device scanning. In a statement shared with Wired, Apple made it clear that it had decided to take a different route:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

Apple’s plans seemed well-intentioned. The digital proliferation of CSAM is a major problem, and experts say it has only gotten worse in recent years. An effort to tackle it was clearly a good thing. That said, the technology Apple proposed, and the surveillance risks it carried, just wasn’t the right tool for the job.
