
WhatsApp Moderators Can Read Your Messages

WhatsApp isn't the impenetrable private messaging service Facebook likes to claim, ProPublica finds.

Photo: Francisco Seco (AP)

Facebook planted its privacy flag on WhatsApp, the end-to-end encrypted messaging service it says it can’t spy on. In a 2018 Senate hearing, Mark Zuckerberg stated unequivocally that “we don’t see any of the content in WhatsApp, it’s fully encrypted.” Today, upon opening the app, a privacy policy and ToS update reads: “We can’t read or listen to your personal conversations, as they are end-to-end encrypted. [emphasis theirs] This will never change.”

That’s simply not true, a new ProPublica report on WhatsApp’s content moderation system finds. We knew that WhatsApp moderators exist; that WhatsApp hands over metadata to law enforcement; and that the company has long shared user data among its ecosystem of data-thirsty apps. This report gives a clearer picture of practices that, until now, Facebook has deliberately obscured in its attempt to sell users on a privacy-oriented platform: WhatsApp can read some of your messages if the recipient reports them.


This leads to a lot of confusion about what the company means when it says “end-to-end encryption”—which by definition means that only the sender and recipient hold the cryptographic keys needed to decrypt a message.
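To make that definition concrete, here is a deliberately simplified sketch — a toy one-time pad, not WhatsApp's actual protocol (WhatsApp uses the Signal protocol) — of why a relaying server can't read an end-to-end encrypted message: without the shared key, the ciphertext it handles is meaningless.

```python
import secrets

# Toy illustration only: a one-time pad, NOT the Signal protocol
# WhatsApp actually uses. The point it demonstrates: the server
# relays ciphertext, and only the two ends hold the key to decrypt.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with a same-length random key."""
    assert len(key) == len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))  # known only to sender and recipient

ciphertext = encrypt(shared_key, message)  # all the server ever sees
assert decrypt(shared_key, ciphertext) == message  # recipient recovers the text
```

The company's claim rests on this property holding between the two chat endpoints; the reporting mechanism described below doesn't break the cipher, it adds a third endpoint.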

ProPublica notes that at least 1,000 moderators employed by Facebook’s moderation contractor Accenture review user-reported content that’s been flagged by its machine learning system. They monitor for, among other things, spam, disinformation, hate speech, potential terrorist threats, child sexual abuse material (CSAM), blackmail, and “sexually oriented businesses.” Based on the content, moderators can ban the account, put the user “on watch,” or leave it alone. (This is different from Facebook or Instagram, which also allow moderators to remove individual posts.) In an op-ed for Wired earlier this year, WhatsApp head Will Cathcart wrote that the company submitted “400,000 reports to child safety authorities last year and people have been prosecuted as a consequence.”


Most can agree that violent imagery and CSAM should be monitored and reported; Facebook and Pornhub regularly generate media scandals for not moderating enough. But WhatsApp moderators told ProPublica that the app’s artificial intelligence program sends moderators an inordinate number of harmless posts, like children in bathtubs. Once the flagged content reaches them, ProPublica reports that moderators can see the last five messages in a thread.

WhatsApp discloses, in its terms of service, that when an account is reported, it “receives the most recent messages” from the reported group or user as well as “information on your recent interactions with the reported user.” It does not specify that such information, viewable by moderators, could include a user’s phone number, profile photo, linked Facebook and Instagram accounts, IP address, and mobile phone ID. And, the report notes, WhatsApp does not disclose that it amasses all users’ metadata no matter their privacy settings.

The collection of messages contradicts WhatsApp’s big public showing earlier this year in a lawsuit against the Indian government. Fighting a new law that would likely have allowed Indian law enforcement officials to trawl suspects’ messages, the company said in a statement shared with Reuters:

Requiring messaging apps to ‘trace’ chats is the equivalent of asking us to keep a fingerprint of every single message sent on WhatsApp, which would break end-to-end encryption and fundamentally undermines people’s right to privacy.


But, like Facebook, WhatsApp does seem enthusiastic about sharing metadata with U.S. law enforcement, including data that has helped shield the government from accountability. In a case against a Treasury Department whistleblower who shared classified documents with BuzzFeed, prosecutors cited the fact that Natalie Edwards had exchanged dozens of messages with a reporter around the time of publication. Edwards now faces a six-month prison sentence.

Law enforcement can get a court-ordered subpoena for that information, but WhatsApp could also choose not to store it—its competitor Signal claims the only metadata it collects is your contact information. If WhatsApp encrypted metadata the way Signal does, the company would have nothing to share even if it wanted to.


WhatsApp didn’t offer much clarity on the mechanism it uses to receive decrypted messages, saying only that the person tapping the “report” button automatically generates a new message between themselves and WhatsApp. That seems to indicate that WhatsApp is deploying a sort of copy-paste function, but the details are still unclear.

Facebook told Gizmodo that WhatsApp can read reported messages because they’re considered a form of direct messaging between the company and the reporting user. The company added that users who report content make the conscious choice to share information with Facebook; by its logic, Facebook’s collection of that material doesn’t conflict with end-to-end encryption.
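Based on that framing, the flow would look something like the sketch below — a hypothetical reconstruction, since WhatsApp hasn't published the mechanism; the names (`Message`, `build_report`) are illustrative, not WhatsApp's API. The reporting device already holds the decrypted thread, so it can copy the recent messages into a fresh message addressed to WhatsApp without touching the original encrypted channel.

```python
# Hypothetical sketch of the client-side report flow described above.
# Names and structure are illustrative; WhatsApp's real implementation
# is not public.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    text: str  # already decrypted locally on the reporter's device

def build_report(thread: list[Message], n_recent: int = 5) -> list[Message]:
    """Copy the last n decrypted messages into a report payload.

    ProPublica reports moderators see the last five messages in a
    thread; under Facebook's framing, this payload is simply a new
    message from the reporter to WhatsApp, so the original
    end-to-end channel is never decrypted server-side.
    """
    return thread[-n_recent:]

thread = [Message("alice", f"msg {i}") for i in range(8)]
report = build_report(thread)
assert len(report) == 5          # only the five most recent messages
assert report[0].text == "msg 3"
```

Note the design consequence: end-to-end encryption is preserved in the strict cryptographic sense, yet plaintext still reaches WhatsApp, because one endpoint (the reporter) chooses to forward it.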


So, yes, WhatsApp can see your messages without your consent.