Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears

Illustration by Alex Castro / The Verge

Apple has filled in more details around its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system.
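To make that safeguard concrete, here is a minimal sketch of an intersection rule like the one described. Everything in it is hypothetical: the organization names, jurisdictions, and hash values are made up, and Apple's actual pipeline operates on NeuralHash values inside an encrypted protocol rather than plain sets.

```python
from collections import defaultdict

# Hypothetical hash databases, keyed by the organization that supplied
# them and the government it operates under. All values are made up.
databases = {
    ("NCMEC", "US"): {"hash_a", "hash_b", "hash_c"},
    ("OtherOrg", "EU"): {"hash_b", "hash_c", "hash_d"},
}

def flaggable_hashes(databases):
    """Return hashes present in databases from at least two distinct jurisdictions."""
    jurisdictions = defaultdict(set)
    for (_org, country), hashes in databases.items():
        for h in hashes:
            jurisdictions[h].add(country)
    # Only hashes vouched for by databases under different governments
    # survive, so a single country cannot unilaterally insert an entry.
    return {h for h, countries in jurisdictions.items() if len(countries) >= 2}

print(flaggable_hashes(databases))  # contains 'hash_b' and 'hash_c' only
```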

Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn...

Continue reading…
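The matching step described above amounts to checking each photo's perceptual hash against the compiled list. A minimal sketch follows, assuming a hypothetical `perceptual_hash` function and a plain set of hashes; Apple's real system uses NeuralHash against a blinded database, with the result sealed in an encrypted safety voucher so the device itself cannot tell whether a photo matched.

```python
import hashlib

# Hypothetical flaggable set, e.g. the output of the intersection
# rule sketched earlier.
KNOWN_CSAM_HASHES = {"hash_b", "hash_c"}

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real perceptual hash (such as Apple's NeuralHash)
    # maps visually similar images to the same value. SHA-256 is used
    # here purely so the sketch runs.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_list(image_bytes: bytes) -> bool:
    # Compare a photo's hash against the compiled list. In Apple's
    # actual protocol this comparison happens against a blinded table,
    # so neither the device nor a single database operator can audit it.
    return perceptual_hash(image_bytes) in KNOWN_CSAM_HASHES
```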



