Apple has filled in more details around its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system.
Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn criticism from some security and privacy experts.
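The multi-database safeguard can be sketched in pseudocode: an image hash is flagged only if it appears in hash lists maintained by child safety groups under at least two distinct government jurisdictions. This is an illustrative sketch under stated assumptions; the function and data structures below are hypothetical, not Apple's actual implementation, which performs matching on-device against blinded hash sets using private set intersection.

```python
def flag_hashes(image_hashes, databases):
    """Flag a hash only if child safety databases from at least two
    distinct government jurisdictions both contain it (hypothetical sketch)."""
    flagged = []
    for h in image_hashes:
        # Collect the jurisdictions of every database that lists this hash
        jurisdictions = {db["jurisdiction"] for db in databases if h in db["hashes"]}
        if len(jurisdictions) >= 2:
            flagged.append(h)
    return flagged

# Hypothetical example data: only "a1" appears in lists from two jurisdictions
dbs = [
    {"jurisdiction": "US", "hashes": {"a1", "b2"}},
    {"jurisdiction": "EU", "hashes": {"a1", "c3"}},
]
print(flag_hashes(["a1", "b2", "d4"], dbs))  # prints ['a1']
```

Under this rule, a hash added unilaterally by a single country's database (like "b2" above) never triggers a match, which is the property Apple says guards against one government inserting non-CSAM content.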