When Mark Zuckerberg appeared before Congress last week, several lawmakers grilled him on Facebook’s failures to comply with a 2012 consent decree that required the social media company to submit routine privacy audits to the Federal Trade Commission.
The FTC’s privacy audits are widely viewed as one of the strongest enforcement mechanisms for keeping tech companies in line with the privacy promises they make to their users—but a new paper by privacy attorney Megan Gray suggests that the FTC’s privacy audits are relatively toothless and need to undergo major reforms if they are really going to protect consumers. Although Gray currently works for the FTC, the paper is based on publicly available documents and was written on her own time.
Facebook’s 2012 consent decree stemmed from allegations that will sound eerily familiar to anyone paying attention to the company’s current Cambridge Analytica scandal. The FTC said that Facebook was unfair and deceptive in its communications to users about their privacy, and that it allowed third-party applications to access user data while leading users to believe that they could restrict the visibility of that data to their friends only. Under Facebook’s agreement with the FTC, the company must undergo biennial privacy audits for 20 years to ensure that it isn’t misleading users about their privacy.
However, those audits didn’t turn up any information about Cambridge Analytica’s collection of data on 87 million Facebook users, even though Facebook learned about it in 2015, leading members of Congress to suggest that Facebook isn’t complying with the consent decree.
Allowing apps like the one used by Cambridge Analytica to access user data was an act of “willful blindness,” Senator Richard Blumenthal told Zuckerberg. “It was heedless and reckless, which, in fact, amounted to a violation of the FTC consent decree,” he said. (Zuckerberg responded that, while he thought in retrospect that Facebook should have notified the FTC of Cambridge Analytica’s actions, it had no legal obligation to do so.)
Although members of Congress seemed to blame Facebook for not coming clean to the FTC, Gray’s paper suggests that the FTC’s privacy audits are failing by design. The current process essentially allows companies under consent orders to self-regulate, while data misuse by the likes of Cambridge Analytica is swept under the rug. If the agency truly wants to catch privacy violations and protect consumers from harm, it needs to make its auditing process more aggressive and rigorous, according to Gray.
“The agency regularly touts its important and extensive work as the chief consumer privacy ‘cop on the beat.’ But this chest-thumping can backfire—consumers may more readily share personal information via online platforms based on a belief that the FTC is guarding against misuse,” Gray writes. “Careful review, however, shows the audits are woefully inadequate.”
The FTC’s 2012 consent decree against Facebook, and a 2011 decree against Google, were highly regarded as strong actions to protect consumer privacy. But the audit process laid out in those consent orders isn’t living up to the hype, Gray says.
“On closer inspection, the orders arguably did not require ‘reasonable privacy protections,’” she writes. “Rather, the orders were more constrained, and required only a ‘comprehensive privacy program’ that was ‘reasonably designed’ to ‘address’ ‘privacy risks.’ Under this language, given the companies’ lengthy privacy policies essentially stating that users did not have any privacy, the FTC could face an uphill battle in asserting misuse of consumer data.”
Companies like Facebook and Google are allowed to hire their own auditors, and the contracts between auditors and companies are not public, so it’s difficult to determine how much a company spent on an audit. The audits rely largely on statements by company employees—basically, if executives tell the auditor that they’re doing enough to protect user privacy, the auditor reports that the company is fulfilling its obligations.
“That is completely useless. It’s not just toothless, it’s worse than toothless,” Nate Cardozo, a senior staff attorney with the Electronic Frontier Foundation, told Gizmodo. “It’s asking the fox to guard the henhouse. If the FTC had chosen an auditor and required Facebook to open its servers to any question the auditor had, maybe we wouldn’t have gotten to Cambridge Analytica.”
The audits also tend to scratch only the surface of a large company’s privacy footprint. Google’s audit, for instance, covers only seven points—which Gray calls “so vague or duplicative as to be meaningless”—and doesn’t delve into the various privacy concerns that come up in Google’s myriad products, from search to email to YouTube to self-driving cars.
Allowing the auditors to parrot whatever a company tells them creates an environment where privacy violations go unreported. “The FTC’s privacy cases have not usually stemmed from intentional transgressions; rather, the cases usually arise from issues the company overlooked or did not adequately disclose to consumers. A privacy audit that relies on management assertions will rarely uncover these blind spots,” Gray writes.
Security expert Alec Muffett puts the problem another way:
In sporting metaphor: a vendor (in this case, Google) gets to design their own high-jump bar, document how tall it is and what it is made of, how they intend to jump over it; and then they jump over it and the certification agency simply attests that they have successfully performed a high-jump over a bar of their own design. The design documents and jump technique do not need to be made public.
The FTC is aware that the audit process is broken, Gray notes. The World Privacy Forum recommended last fall that the agency go so far as to instruct its employees to stop referring to the work as “audits” altogether and use the phrase “assessments” instead.
“We suggest that any Commission staff member who discusses a Commission consent decree in public and who refers to an assessment as an audit be required to stay after work and write 100 times ‘An assessment is not an audit....’” WPF’s executive director Pam Dixon wrote in a comment to the FTC on its consent decree against Uber.
Gray makes a number of recommendations for improving the FTC’s privacy audits. The most drastic, she says, would be for the FTC to stop relying on corporate attestations of privacy protection altogether. However, the current model could be improved if the FTC outlined what it expects auditors to cover with more granularity, including mapping the flow of consumer data through a company’s systems and analyzing violations of the order that occurred while the company was under scrutiny.
An FTC spokesperson emphasized that the paper reflects Gray’s personal views: “Ms. Gray has no involvement in any current privacy or data security investigation or litigation at the FTC, including the Facebook investigation. Her article and any of her other related comments represent her personal opinion and not the views of the FTC.”
Gray also recommends that the FTC incorporate industry-standard principles for data privacy into the audit process, such as the Generally Accepted Privacy Principles and the Fair Information Practices, so that companies are evaluated against more widely accepted rubrics. Although some of the GAPP or FIP recommendations might not apply to a particular product—for instance, a product that only handles data in transit might fall outside the recommendations for data retention—an auditor should still address those principles and explain why they do not apply, the paper argues.
“The FTC will soon have an entirely new slate of commissioners,” Gray notes. “They may be amenable to a comprehensive overhaul of how the agency monitors its privacy orders.”
Whether the new commissioners will be interested in cracking down on Silicon Valley’s privacy violations remains to be seen. “The Trump administration has made it clear that it is no friend of Silicon Valley and, in this particular context, that could be good for user privacy,” Cardozo said. “On the other hand, the Trump administration has made it clear that it doesn’t like government regulation and the administrative state.”
Correction: A previous version of this article incorrectly stated that Facebook consent decree was issued in 2011 and Google’s was issued in 2012. In fact, it was the other way around: Facebook’s consent decree was finalized in August 2012, while Google’s was finalized in October 2011. We regret the error.