How Facebook undermines privacy protections for its 2 billion WhatsApp users

WhatsApp assures users that no one can see their messages, but the company has an extensive monitoring operation and regularly shares personal information with prosecutors.

Update: 2021-09-09 04:00 GMT

Mark Zuckerberg cited Facebook's global messaging service, WhatsApp, as a model when unveiling a new "privacy-focused vision" for Facebook in March 2019. "We don't currently have a strong reputation for building privacy protective services," the Facebook CEO acknowledged, before writing: "I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won't stick around forever. This is the future I hope we will help bring about. We plan to build this the way we've developed WhatsApp."

WhatsApp's signature feature, end-to-end encryption, was the focus of Zuckerberg's vision, which he said the company was planning to extend to Instagram and Facebook Messenger.

End-to-end encryption converts all messages into an unreadable format that is unlocked only when they reach their intended destinations. WhatsApp messages are so secure, Zuckerberg said, that nobody else, not even the company, can read a word. As he told the US Senate in his 2018 testimony: "We don't see any of the content in WhatsApp."
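WhatsApp's actual implementation is built on the Signal protocol, but the underlying principle, that the keys needed to read a message exist only on the endpoints' devices and never on any server relaying it, can be illustrated with a minimal sketch. The Python example below uses the PyNaCl library; the participants and message are hypothetical.

```python
# Minimal sketch of the end-to-end principle using the PyNaCl library.
# This is NOT WhatsApp's implementation (WhatsApp builds on the Signal
# protocol); it only illustrates that ciphertext relayed by a server is
# unreadable without a private key held exclusively by the endpoints.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A relaying server sees only this opaque ciphertext, which it cannot decrypt.
print(ciphertext.hex())

# Only Bob, holding his private key, can decrypt it.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```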

WhatsApp emphasises the point consistently, with a similar assurance that appears on screen before users send messages: "No one outside of this chat, not even WhatsApp, can read or listen to them."

But those assurances are not true. WhatsApp has more than 1,000 contract workers filling the floors of office buildings in Austin, Texas; Dublin; and Singapore, where they examine millions of pieces of users' content.

Seated at computers in pods organised by work assignments, these hourly workers use special Facebook software to sift through streams of private messages, images and videos that WhatsApp users have reported as improper and that the company's artificial intelligence systems have already screened. The contractors pass judgment on whatever flashes on their screens, claims of everything from fraud or spam to child porn and potential terrorist plotting, typically in less than a minute.

While WhatsApp assures users that their privacy is sacrosanct, it is also policing them, which makes for an awkward mission.

ProPublica obtained a 49-slide internal company marketing presentation from December that emphasises the "fierce" promotion of WhatsApp's "privacy narrative."

WhatsApp's "brand character" is compared to "the Immigrant Mother" and a photo of Malala Yousafzai, who survived a shooting by Taliban and became a Nobel Peace Prize winner, is displayed in a slide titled "Brand tone parameters." The presentation does not mention the company's content moderation efforts. Teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove "the worst" abusers, WhatsApp's director of communications, Carl Woog, acknowledged.

But he told ProPublica that the company does not consider this work to be content moderation, saying, "We actually don't typically use the term for WhatsApp. WhatsApp is a lifeline for millions of people around the world. The decisions we make around how we build our app are focused around the privacy of our users, maintaining a high degree of reliability and preventing abuse."

Meanwhile, Facebook says that some 15,000 moderators examine content on WhatsApp's corporate siblings, Facebook and Instagram, neither of which is encrypted. It releases quarterly transparency reports detailing how many accounts Facebook and Instagram have "actioned" for various categories of abusive content. There is no such report for WhatsApp.

Deploying an army of content reviewers is just one of the ways that Facebook has compromised the privacy of WhatsApp users. Together, the company's actions have left WhatsApp, the largest messaging app in the world with two billion users, far less private than its users are likely to understand or expect.

A ProPublica investigation, drawing on data, documents and dozens of interviews with current and former employees and contractors, reveals how, since purchasing WhatsApp in 2014, Facebook has quietly undermined its sweeping security assurances in multiple ways.
