Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears

  • Thread starter: The Verge RSS (Guest)
Author: Adi Robertson


Illustration by Alex Castro / The Verge

Apple has filled in more details about its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. Those safeguards include a rule to flag only images found in multiple child safety databases with different government affiliations, theoretically preventing any one country from adding non-CSAM content to the system.
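The paper describes this safeguard only at a high level. As a rough illustration, here is a minimal Swift sketch of what an "intersection of independently governed databases" rule could look like; every name in it (HashList, eligibleHashes, the two-government threshold) is invented for illustration and is not Apple's actual API or hashing pipeline.

```swift
import Foundation

// Hypothetical sketch of the safeguard described above: a hash only becomes
// eligible for flagging if it appears in lists from child safety groups
// operating under at least two different governments.
struct HashList {
    let organization: String
    let government: String    // jurisdiction the contributing group operates under
    let hashes: Set<Data>     // opaque image-hash digests
}

/// Returns only the hashes vouched for by lists tied to at least
/// `minGovernments` distinct governments.
func eligibleHashes(from lists: [HashList], minGovernments: Int = 2) -> Set<Data> {
    var governmentsPerHash: [Data: Set<String>] = [:]
    for list in lists {
        for hash in list.hashes {
            governmentsPerHash[hash, default: []].insert(list.government)
        }
    }
    return Set(governmentsPerHash.filter { $0.value.count >= minGovernments }.keys)
}
```

Counting distinct governments rather than distinct organizations matches the article's framing: two groups affiliated with the same government would not, on their own, be enough to make a hash eligible.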

Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn...
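Continuing the hypothetical sketch above, the on-device matching step this paragraph describes would reduce to a set-membership test between a photo's hash and the vetted list. `localPhotoHashes` is an invented stand-in for whatever hashes a device would compute for iCloud-bound photos, and the byte values are dummy data.

```swift
// Two illustrative lists under different governments; only 0x02 appears in both.
let lists = [
    HashList(organization: "OrgA", government: "US",
             hashes: [Data([0x01]), Data([0x02])]),
    HashList(organization: "OrgB", government: "UK",
             hashes: [Data([0x02]), Data([0x03])]),
]
let vetted = eligibleHashes(from: lists)

// Device-side check: flag only local photo hashes present in the vetted set.
let localPhotoHashes: Set<Data> = [Data([0x02]), Data([0x04])]
let flagged = localPhotoHashes.intersection(vetted)   // contains only 0x02
```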

Continue reading…
