On Monday, Facebook announced a new initiative that aims to help the company better understand how it influences elections and democracy at large. Thing is, it will involve Facebook handing over user data to independent researchers—but in a good way this time, promise.
In the announcement blog post, Facebook’s vice president of communications and public policy, Elliot Schrage, and Director of Research David Ginsberg wrote that an independent research commission will be able to give “privacy-protected data” to select researchers “when appropriate.”
“The goal is both to get the ideas of leading academics on how to address these issues as well as to hold us accountable for making sure we protect the integrity of these elections on Facebook,” Mark Zuckerberg, Facebook’s CEO and co-founder, wrote in a post on his page.
It’s worth noting that the individual at the center of the Cambridge Analytica scandal, Dr. Aleksandr Kogan, used his position as a University of Cambridge professor to develop the app he would later repurpose to collect data on some 87 million Facebook users for Cambridge Analytica’s political influence campaigns. That being said, Facebook does detail a few ways in which it will attempt to protect user data through the new initiative.
Proposals requesting data will be reviewed by a university Institutional Review Board (IRB), Facebook’s privacy and research review teams, and independent privacy experts chosen by the commission, according to the company. “These reviews will help ensure that Facebook acts in accordance with its legal and ethical obligations to the people who use our service, as well as the academic and ethical integrity of the research process,” Schrage and Ginsberg state in the post.
Facebook notes that it is assembling its own team to work alongside the independent commission and researchers on developing the datasets used in the research. It states that these datasets will remain on the company’s servers and will be regularly audited. The commission will only publish results that are aggregated and anonymous, Facebook said.
“Fundamental to this entire effort is ensuring that people’s information is secure and kept private,” Schrage and Ginsberg wrote. “Facebook and our funding partners recognize the threat presented by the recent misuse of Facebook data, including by an academic associated with Cambridge Analytica. At the same time, we believe strongly that the public interest is best served when independent researchers have access to information. And we believe that we can achieve this goal while ensuring that privacy is preserved and information kept secure.”
It’s unclear what type of user data is up for grabs as part of the initiative. We have reached out to Facebook for more information on what user data can be shared with researchers and will update when we hear back.
What is clear is that Facebook is trying to prove to the public that it really does care about how its products might shape the world we live in, not just as a platform for building communities but as one capable of psychologically manipulating them. The announcement also comes the same week Zuckerberg is scheduled to testify before the Senate Judiciary and Commerce committees on Tuesday and the House Energy and Commerce Committee on Wednesday. It will certainly serve him well to have an arsenal of good intentions at his disposal for when the grilling begins.