[Update: Facebook issues clarification] Facebook can reportedly read some of your WhatsApp messages
The damning report comes from ProPublica, a non-profit investigative journalism organization with a solid track record. It claims (via 9to5Mac) that both Facebook and WhatsApp can view the contents of your private WhatsApp messages. The report notes:
[An] assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”
Those assurances are not true. WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users’ content. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through streams of private messages, images and videos that have been reported by WhatsApp users as improper and then screened by the company’s artificial intelligence systems. These contractors pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.
Since WhatsApp maintains that it uses end-to-end encryption, these moderators shouldn’t be able to see the contents of your messages: end-to-end encryption means that only the sender and the recipient can decrypt them. But that doesn’t seem to be the case.
The report further notes:
Because WhatsApp’s content is encrypted, artificial intelligence systems can’t automatically scan all chats, images and videos, as they do on Facebook and Instagram. Instead, WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service. This forwards five messages — the allegedly offending one along with the four previous ones in the exchange, including any images or videos — to WhatsApp in unscrambled form, according to former WhatsApp engineers and moderators. Automated systems then feed these tickets into “reactive” queues for contract workers to assess.
In response to the report, a WhatsApp spokesperson said: “We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication.” While the spokesperson didn’t directly address the alleged lack of end-to-end encryption, they added: “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp we receive the content they send us.”
If the details mentioned in the ProPublica report are accurate, both Facebook and WhatsApp could get in some serious trouble. 9to5Mac speculates that there might have been some misunderstanding during the investigation, and that the moderators could be reviewing Facebook messages rather than WhatsApp messages. But ProPublica claims that WhatsApp’s director of communications, Carl Woog, “acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove ‘the worst’ abusers.” Woog also told ProPublica that the company doesn’t consider this work to be content moderation, adding, “We actually don’t typically use the term for WhatsApp.”
Furthermore, the report cites a confidential whistleblower complaint filed last year with the U.S. Securities and Exchange Commission to solidify its claims. The complaint details WhatsApp’s use of external contractors, AI systems, and account information to “examine user messages, images and videos. It alleges that the company’s claims of protecting users’ privacy are false.” The SEC hasn’t taken any public action on this complaint.
It’s worth noting that the ProPublica report clarifies that WhatsApp moderators only get access to reported messages. Be that as it may, neither WhatsApp nor Facebook should be able to see the contents of your messages if they’re truly end-to-end encrypted.
Update 1: Facebook issues clarification
In a statement to 9to5Mac, the company has further revealed that when you use WhatsApp’s Report feature, the message is automatically forwarded to Facebook. This process works exactly like manually forwarding a message to a friend. Tapping on the report button creates a new end-to-end encrypted message that goes to Facebook’s moderators. They are then able to review the message, along with four preceding messages from the same chat. This provides moderators with enough context to evaluate the offending message. The company maintains that it can’t see other messages that are not reported.
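The flow Facebook describes can be pictured in code. The following is a minimal conceptual sketch — not WhatsApp’s actual implementation, and all class and method names here are illustrative assumptions — showing how reporting can coexist with end-to-end encryption: the reporting user’s own client, which already holds the decrypted chat history, simply bundles the offending message with up to four preceding ones and sends that bundle as a new message addressed to the moderation team. No existing ciphertext is ever broken.

```python
# Conceptual sketch of WhatsApp-style reporting, per Facebook's clarification.
# Hypothetical names and structures; real clients would re-encrypt the bundle
# for the moderation recipient using the messaging protocol's normal machinery.

from dataclasses import dataclass
from typing import List


@dataclass
class Message:
    sender: str
    text: str


class Chat:
    def __init__(self) -> None:
        # Plaintext history exists only on the user's own device.
        self.history: List[Message] = []

    def receive(self, msg: Message) -> None:
        self.history.append(msg)

    def report(self, offending_index: int) -> List[Message]:
        """Bundle the reported message plus up to four preceding ones.

        This mirrors ordinary forwarding: the client re-sends content it
        can already read, as a fresh message to a new recipient.
        """
        start = max(0, offending_index - 4)
        return self.history[start:offending_index + 1]


chat = Chat()
for i in range(1, 8):
    chat.receive(Message("alice", f"message {i}"))

# Report the 7th message (index 6): moderators get it plus the 4 before it.
bundle = chat.report(6)
print([m.text for m in bundle])
```

The key design point is that nothing about the encrypted transport changes; the only new capability is that the reporter’s client voluntarily shares a small, recent slice of its own decrypted history.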
Update 2: Statement from WhatsApp
In a statement, a WhatsApp spokesperson denies that the company’s Report feature is incompatible with end-to-end encryption. The statement reads as follows:
“WhatsApp provides a way for people to report spam or abuse, which includes sharing the most recent messages in a chat. This feature is important for preventing the worst abuse on the internet. We strongly disagree with the notion that accepting reports a user chooses to send us is incompatible with end-to-end encryption.” – WhatsApp spokesperson