While it’s unclear when the image scans started, Apple’s chief privacy officer Jane Horvath confirmed at an event in Las Vegas this week that the company is now “utilizing some technologies to help screen for child sexual abuse material.”
Apple initially suggested it might inspect images for abuse material last year – and only this week added a disclaimer to its website acknowledging the practice – but Horvath’s remarks come as the first confirmation the company has gone ahead with the scans.
A number of tech giants, including Facebook, Twitter and Google, already employ an image-scanning tool known as PhotoDNA, which cross-checks photos with a database of known abuse images. It is unknown whether Apple’s scanning tool uses similar technology.
The move set off alarms among critics, some questioning Apple’s sincerity in its avowed desire to crack down on crime, others warning that the photo scans could further erode consumer privacy, especially given the scant detail the company has so far offered about the screening process.
“Of course everyone is for stopping child abuse, and that’s not the issue here. It’s that I’m simply not buying it,” journalist and political commentator Chadwick Moore told RT. “I don’t believe that Apple really cares about fighting crime.”
Moore said the company has been excessively vague about the scans, noting “all it says is that they can scan all your images, flip through all your data, and look for potentially illegal activity, including child pornography.”
“What does that mean? That’s terrifying language. What else are they looking for? If you’re smoking a joint, is that next? I don’t trust these companies, I just think it’s ever-encroaching more and more into our privacy, into owning our data.”
Tech expert and privacy advocate Bill Mew said the critics are wrong, however, arguing that the new measure may be less intrusive than it appears given Apple’s technological capabilities.
“The technology that is in use is really clever,” Mew told RT. “It doesn’t necessarily mean that Apple can actually see your photos,” as the company can “sift through these images and test them against a set of known ‘fingerprints’ … without actually decrypting the images themselves.”
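The actual algorithms behind PhotoDNA and Apple’s scanner are proprietary, but the general idea Mew describes can be sketched loosely: each image is reduced to a compact fingerprint, and the scanner only checks whether that fingerprint sits close to any entry in a database of known fingerprints, rather than inspecting the picture itself. The function names, threshold, and fingerprint values below are purely illustrative assumptions, not any company’s implementation:

```python
# Loose illustration of fingerprint matching, NOT PhotoDNA or Apple's
# actual (proprietary) algorithm. An image is reduced to a 64-bit
# fingerprint; the matcher compares fingerprints, never the photos.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_material(fingerprint: int,
                           known_fingerprints: set,
                           max_distance: int = 4) -> bool:
    """Flag an image if its fingerprint is near any known fingerprint.

    A small distance threshold tolerates minor edits (resizing,
    recompression) that flip a few bits of the fingerprint.
    """
    return any(hamming_distance(fingerprint, known) <= max_distance
               for known in known_fingerprints)

# Hypothetical fingerprint values, for illustration only.
known = {0xDEADBEEFCAFEF00D, 0x0123456789ABCDEF}
print(matches_known_material(0xDEADBEEFCAFEF00F, known))  # near-duplicate -> True
print(matches_known_material(0x0000000000000000, known))  # unrelated -> False
```

The privacy argument rests on the comparison step: the database holds only fingerprints of known images, so a match reveals nothing about any photo that is not already in that database.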
On that view, he argued, there is little to fear on the privacy front.
While Apple has gone to bat for data privacy in the past – on several occasions tussling with law enforcement agencies seeking access to one of the company’s devices – its track record on the question is somewhat mixed. In August, it was revealed that company contractors were granted access to customers’ private conversations through Apple’s AI assistant program, Siri, in an effort to “grade” its performance. Several other tech giants have come under fire for similar intrusions, with both Google and Amazon’s home assistant devices also found to surreptitiously record users.