Apple regrets confusion over ‘iPhone scanning’


Apple says its announcement of automated tools to detect child sexual abuse on the iPhone and iPad was “jumbled pretty badly”.

On 5 August, the company revealed new image detection software that can alert Apple if known illegal images are uploaded to its iCloud storage.

Privacy groups criticised the news, with some saying Apple had created a security backdoor in its software.

The company says its announcement has been widely “misunderstood”.

“We wish that this had come out a little more clearly for everyone,” said Apple software chief Craig Federighi, in an interview with the Wall Street Journal.

He said that – in hindsight – introducing two features at the same time was “a recipe for this kind of confusion”.

What are the new tools?
Apple announced two new tools designed to protect children. They will be deployed in the US first.

Image detection

The first tool can identify known child sex abuse material (CSAM) when a user uploads photos to iCloud storage.

The US National Center for Missing and Exploited Children (NCMEC) maintains a database of known illegal child abuse images. It stores them as hashes – a digital “fingerprint” of the illegal material.

Cloud service providers such as Facebook, Google and Microsoft already check images against these hashes to make sure people are not sharing CSAM.
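In rough outline, that check amounts to computing a fingerprint of each uploaded image and looking it up in the database of known fingerprints. The sketch below is a minimal illustration of that lookup using an ordinary cryptographic hash; the placeholder hash value and function names are invented for the example, and real systems (including Apple's NeuralHash) use perceptual hashes so that resized or recompressed copies of a known image still match.

    import hashlib

    # Illustrative stand-in for the NCMEC database of known hashes.
    # The value below is a placeholder, not a real entry.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_fingerprint(path: str) -> str:
        # SHA-256 of the file's raw bytes. A cryptographic hash only matches
        # byte-identical files; production systems use perceptual hashing,
        # which tolerates re-encoding and resizing.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def is_known_image(path: str) -> bool:
        # True if the upload's fingerprint appears in the known-hash set.
        return image_fingerprint(path) in KNOWN_HASHES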

