Complaints Facebook whistleblower Frances Haugen filed with the Securities and Exchange Commission (SEC) accuse the company of lying to the agency, investors, journalists, and Congress.
Before leaving the company in May 2021, Haugen, a former Facebook product manager on its civic misinformation team, made sure to take thousands of internal documents with her. She later became the source for a damning Wall Street Journal series detailing everything from how Facebook downplayed internal research showing the psychological harm Instagram causes some young women, to how it turned a blind eye to reports of human trafficking, to a secret “XCheck” program exempting certain users from its rules. Haugen unveiled her identity in a lengthy interview on 60 Minutes on Sunday, with the network now reporting she has filed at least eight separate complaints with the SEC.
At the core of the complaints, first reported by CBS’s 60 Minutes, Haugen’s lawyers wrote, is the claim that Facebook’s public statements misrepresented to investors both the scale of severe problems with its products and the company’s awareness of them, so wantonly that it violated securities laws. Facebook is a public company, and the complaints argue that the extent of the lies and omissions in its corporate SEC filings also went well beyond what’s legal. Haugen’s filings with the SEC state:
Our anonymous client is disclosing original evidence showing that Facebook, Inc. (NASDAQ: FB) has, for years past and ongoing, violated U.S. securities laws by making material misrepresentations and omissions in statements to investors and prospective investors, including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.
Specific examples cited in the filings include claims that Facebook and its subsidiary Instagram knew, as of 2019, that their products were used to “promote human trafficking and domestic servitude.” The documents provided to the Wall Street Journal showed that Facebook only took action when the problem grew so significant that Apple threatened to remove its apps from the App Store.
“Our investigative findings demonstrate that ... our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks,” an internal Facebook document cited in the filing concluded.
The SEC filings also state that Facebook “failed to deploy internally-recommended or lasting counter-measures” against voter fraud conspiracy theories and violent rhetoric that circulated during the 2020 elections and before Donald Trump supporters stormed the Capitol on Jan. 6 in an effort to overturn the results.
One complaint, titled, “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection,” argues that Facebook research made it well aware its recommendations algorithm “can veer people interested in conservative topics into radical or polarizing ideas and groups/pages.” Specifically, it cites a study that found that new users who followed “verified/high quality conservative pages” like Donald Trump or Fox News saw their recommendations “include conspiracy recommendations after only 2 days.”
Haugen’s lawyers wrote to the SEC that Facebook CEO Mark Zuckerberg misled investors in 2018 by announcing that Facebook would now emphasize “meaningful social interactions” (MSI) over engagement. One of the internal documents states that the MSI approach made misinformation and hate speech worse, as “the more negative comments a piece of content instigates, the higher likelihood for the link to get more traffic.” Another filing, claiming the company lied about its handling of hate speech, refers to an internal Facebook study that concluded, “We only take action against approximately 2% of the hate speech on the platform. Recent estimates suggest that unless there is a major change in strategy, it will be very difficult to improve this beyond 10-20% in the short-medium term.”
Another document provided by Haugen states, “We’re deleting less than 5% of all of the hate speech posted to Facebook. This is actually an optimistic estimate.”
One of the filings accuses Facebook CEO Mark Zuckerberg of lying to Congress about Instagram’s impact on young girls in March 2021; while Zuckerberg told Congress he didn’t believe Instagram hurt them, the company had already conducted internal research concluding, “We make body image issues worse for 1 in 3 teen girls.” (Facebook later downplayed this finding, claiming it only applied to a subset of respondents who “were experiencing body image issues.” The company has only selectively released its internal research.)
Finally, other filings accuse Facebook and its executives of issuing misleading statements about equal enforcement of its rules despite the existence of the XCheck program, of pretending it takes global action against ethnic violence despite knowing its efforts were hobbled by a lack of language skills and local staff, and of presenting cooked books on user demographics to advertisers and investors.
Facebook responded to 60 Minutes (or had previously commented to the Journal), denying or contesting each of the claims raised by Haugen. In a representative statement to CBS on the hate speech issue, Facebook’s director of policy communications, Lena Pietsch, effectively argued the company did the best it could:
We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps.
On Tuesday, Haugen testified before a Senate Commerce subcommittee regarding Facebook’s practices.
“The reality is that we’ve seen from repeated documents within my disclosures, is that Facebook’s AI systems only catch a very tiny minority of offending content,” Haugen told lawmakers on Tuesday. “And best case scenario, and the case of something like hate speech, at most they will ever get 10 to 20 percent. In the case of children, that means drug paraphernalia ads like that, it’s likely if they rely on computers and not humans, they will also likely never get more than 10 to 20 percent of those ads.”
In another exchange with legislators, Haugen described a cyclic “pattern of behavior” in which “problems were so understaffed that there was kind of an implicit discouragement from having better detection systems.”
“My last team at Facebook was on the counterespionage team within the threat intelligence org, and at any given time, our team could only handle a third of the cases that we knew about,” Haugen told the senators. “We knew that if we built even a basic detector, we would likely have many more cases.”
Facebook suffered a major outage that took down its main site along with its subsidiaries Instagram and WhatsApp for most of the workday on Monday. The company has also seen double-digit stock losses over the past few weeks, though it has shrugged off similar downturns in the past; shares sat at nearly $334 early Tuesday afternoon, up from around $264 a year ago.
The SEC told 60 Minutes it “does not comment on the existence or nonexistence of a possible investigation.”