Apple has officially killed one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (CSAM).
Yes, last summer, Apple announced that it would be rolling out on-device scanning: a new iOS feature that used automated analysis to quietly sift through individual users’ photos for known child abuse imagery. The feature was designed so that, should the scanner find apparent CSAM, it would alert human reviewers, who would then presumably alert the police.
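(For readers curious about the mechanics: the proposal centered on matching image hashes on the device against a database of known CSAM hashes, with human review kicking in only after a threshold of matches. The sketch below is a loose, simplified illustration of that general shape, not Apple’s actual system; it substitutes an ordinary file hash for the perceptual hash Apple described, and the hash list, file pattern, and threshold are invented placeholders.)

```python
import hashlib
from pathlib import Path

# Hypothetical illustration only. Apple's proposal used a perceptual hash and
# cryptographic threshold techniques; this sketch just shows the general shape
# of "match against known hashes, escalate to human review past a threshold."

KNOWN_CSAM_HASHES: set[str] = set()  # would be supplied by child-safety organizations
REVIEW_THRESHOLD = 30                # assumed number of matches before human review

def file_hash(path: Path) -> str:
    """Stand-in for a perceptual hash; a real system must match near-duplicate images."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return photos whose hashes appear in the known-hash list."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_CSAM_HASHES]

def should_escalate(matches: list[Path]) -> bool:
    """Only flag an account for human review once matches exceed the threshold."""
    return len(matches) >= REVIEW_THRESHOLD
```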
The plan drew an immediate, torrential backlash from privacy and security experts, who argued that the scanning feature could ultimately be repurposed to hunt for other kinds of content. Building such scanning capability into iOS at all was a slippery slope toward broader surveillance abuses, critics alleged, and the general consensus was that the tool could quickly become a backdoor for police.
At the time, Apple pushed back hard against these criticisms, but it ultimately relented: not long after announcing the feature, the company said it would “postpone” implementation until a later date.
Now, it looks like that date will never come. On Wednesday, amid announcements of a bevy of new iCloud security features, the company also revealed that it would not be moving forward with its plans for on-device scanning. In a statement shared with Wired, Apple made it clear that it had decided to take a different route:
After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
Apple’s plans seemed well-intentioned. The digital proliferation of CSAM is a major problem, and experts say it has only gotten worse in recent years. An effort to tackle that problem was clearly a good thing. That said, the underlying technology Apple proposed, and the surveillance dangers it posed, simply wasn’t the right tool for the job.