
WhatsApp Says It Won't Be Scanning Your Photos for Child Abuse

WhatsApp said in a statement that Apple's latest iOS feature introduces “something very concerning into the world.”

Photo: Carl Court / Staff (Getty Images)

Apple’s new tool to suss out potential child abuse in iPhone photos is already sparking controversy. On Friday, just one day after it was announced, Will Cathcart, the head of Facebook’s messaging app, WhatsApp, said that the company would decline to adopt the software on the grounds that it introduced a host of legal and privacy concerns.

“I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world,” Cathcart tweeted. “People have asked if we’ll adopt this system for WhatsApp. The answer is no.”

In a series of tweets, Cathcart elaborated on those concerns, citing the potential for spyware companies and governments to co-opt the software, as well as the privacy risks posed by software that has not been independently vetted.


“Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out,” he wrote. “Why not? How will we know how often mistakes are violating people’s privacy?”

In its announcement of the software on Thursday, Apple said that it had slated the update for a late 2021 release as part of a series of changes the company planned to implement in order to protect children from sexual predators. As Gizmodo previously reported, the proposed tool—which would use a “neural matching function” called NeuralHash to determine whether the images on a user’s device match known child sexual abuse material (CSAM) fingerprints—has already caused consternation among security experts.


In an Aug. 4 tweet thread, Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, warned that the tool could eventually become a precursor to “adding surveillance to encrypted messaging systems.”

“I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea,” Green tweeted. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.”
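The “perceptual hash” matching Green describes differs from exact file hashing: visually similar images produce similar hashes, and a match is declared when two hashes differ in only a few bits. The following is an illustrative sketch of that general idea only, not Apple’s NeuralHash; the fingerprint values and threshold are invented for the example.

```python
# Illustrative sketch of perceptual-hash matching -- NOT Apple's NeuralHash.
# Similar images yield similar hashes; a "match" means the Hamming distance
# to some known fingerprint is small. All values here are hypothetical.

KNOWN_FINGERPRINTS = {0b10110010, 0b01101100}  # assumed known-image hashes
MATCH_THRESHOLD = 2  # assumed max number of differing bits for a match


def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")


def matches_known_image(image_hash: int) -> bool:
    """True if the hash is within MATCH_THRESHOLD bits of any fingerprint."""
    return any(
        hamming_distance(image_hash, fingerprint) <= MATCH_THRESHOLD
        for fingerprint in KNOWN_FINGERPRINTS
    )
```

This near-match tolerance is what lets such systems catch re-encoded or lightly edited copies of a known image; it is also why researchers like Green worry about false positives, since unrelated images can occasionally land within the threshold.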


But according to Apple, Cathcart’s characterization of the software as being used to “scan” devices isn’t exactly accurate. While scanning implies a result, the company said, the new software would merely run a comparison, using the NeuralHash tool, of any images a given user chooses to upload to iCloud. The results of that comparison would be contained in a cryptographic safety voucher (essentially a bag of uninterpretable bits of data on the device), and the contents of that voucher would need to be sent out in order to be read. In other words, Apple wouldn’t be gathering any data from individual users’ photo libraries as a result of such a scan, unless the user was hoarding troves of known CSAM.
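The threshold mechanism described above can be sketched in a few lines: match results accumulate per account, and nothing is surfaced for review until the count crosses a threshold. This is a hedged illustration of the concept only, not Apple’s implementation; the class name and threshold value are assumptions for the example.

```python
# Conceptual sketch of threshold-gated flagging -- not Apple's system.
# Each upload's match result is recorded per account; an account is only
# surfaced for review once its match count reaches REVIEW_THRESHOLD.
from collections import defaultdict

REVIEW_THRESHOLD = 30  # hypothetical number of matches before review


class MatchCounter:
    def __init__(self) -> None:
        # account -> count of uploads that matched a known fingerprint
        self._matches: defaultdict[str, int] = defaultdict(int)

    def record_upload(self, account: str, matched: bool) -> bool:
        """Record one upload's match result; return True only once the
        account's cumulative match count reaches the review threshold."""
        if matched:
            self._matches[account] += 1
        return self._matches[account] >= REVIEW_THRESHOLD
```

Under this design, a handful of isolated matches (including false positives) never becomes readable to anyone, which is the basis for Apple’s claim that ordinary users’ libraries stay private.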

According to Apple, while the potential for an erroneous reading does exist, the chance of incorrectly flagging a given account for manual review would be less than one in one trillion per year.