Apple will soon scan and detect child abuse photos on your iPhone
Apple has revealed its plans to deploy a new algorithm that can detect potential child abuse imagery across its operating systems – iOS, iPadOS, macOS and watchOS – as well as on its messaging platform iMessage.
The feature is only applicable in the United States for now; new versions of iOS will include “new applications of cryptography to help limit the spread of CSAM (child sexual abuse material) online, while designing for user privacy.”
According to Apple, on-device scanning will only occur for images that are about to be backed up to iCloud. Before upload, the device runs a matching process for each image against a database of known CSAM hashes. A ‘cryptographic safety voucher’ that encodes the match result is then uploaded to iCloud alongside the image.
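The matching step described above can be sketched roughly as follows. This is a hypothetical illustration only: Apple's real system uses a perceptual hash called NeuralHash combined with private set intersection, so the device never learns the match result, whereas this sketch uses a plain SHA-256 lookup and an unblinded voucher purely to show the flow.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for Apple's perceptual NeuralHash; a plain SHA-256
    digest is used here purely for illustration."""
    return hashlib.sha256(image_bytes).hexdigest()

def make_safety_voucher(image_bytes: bytes, known_hashes: set) -> dict:
    """Build a record encoding whether the image matched the database.
    The real safety voucher is cryptographically blinded so neither the
    device nor Apple (below a threshold) can read it; this sketch skips
    that step entirely."""
    h = image_hash(image_bytes)
    return {"hash": h, "matched": h in known_hashes}

# Hypothetical data: one "known" image and one innocuous photo.
known = {image_hash(b"known-image-bytes")}
v1 = make_safety_voucher(b"known-image-bytes", known)
v2 = make_safety_voucher(b"holiday-photo-bytes", known)
print(v1["matched"], v2["matched"])  # True False
```

In the real design, the voucher (not a raw boolean) is what gets uploaded alongside the image, and the match can only be recovered server-side once enough vouchers accumulate.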
If an iCloud account crosses a threshold of known CSAM content, Apple will decrypt the flagged data for a manual review and, if the matches are confirmed, disable the user’s account and send a report to the National Center for Missing & Exploited Children.
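The threshold rule amounts to a simple count over matched vouchers, as in the sketch below. Note that Apple has not disclosed the actual threshold value; the number 30 here is purely a placeholder, and the real system enforces the threshold cryptographically (via threshold secret sharing) rather than with a server-side counter.

```python
THRESHOLD = 30  # placeholder value; Apple's real threshold is undisclosed

def should_escalate(match_flags, threshold=THRESHOLD):
    """Flag an account for human review only once the number of
    CSAM-matched vouchers reaches the threshold."""
    return sum(match_flags) >= threshold

few_matches = [True] * 5 + [False] * 100   # below threshold: no review
many_matches = [True] * 30                 # at threshold: escalate
print(should_escalate(few_matches), should_escalate(many_matches))  # False True
```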
Apple’s voice assistant Siri is also being updated to intervene when users search for CSAM-related queries. Siri will reportedly explain to such users that interest in this topic is harmful and problematic, while pointing them to resources for getting help.