Election Day is coming up, and if you use Facebook, you'll see an option to tell everyone you voted. This isn't new; Facebook introduced the "I Voted" button in 2008. What is new is that, according to Facebook, this year the company isn't conducting any experiments related to election season.
That'd be the first time in a long time. Facebook has experimented with the voting button in several elections since 2008, and the company's researchers have presented evidence that the button actually influences voter behavior.
Mother Jones looked at the social network's history of political meddling, including an experiment conducted during the 2010 elections in which the company showed the "I Voted" button only to some users to see whether the people who saw it behaved differently:
Their paper, with the astounding title "A 61-Million-Person Experiment in Social Influence and Political Mobilization," found that about 20 percent of the users who saw that their friends had voted also clicked on the "I Voted" button, compared to 18 percent of the people who didn't get the "I Voted" message from their friends. That is, positive social pressure caused more people to vote (or at least to tell their friends they were voting). After the election, the study's authors examined voter records and concluded that Facebook's nudging had increased voter turnout by at least 340,000.
Facebook's experiments in 2012 are also believed to have influenced voter behavior. Of course, the button clicks are self-reported, so there's no way to know how many people were telling the truth; the social network's actual influence could be larger or smaller than reported.
Facebook has not been very forthright about these experiments; it didn't tell people at the time that they were being conducted. This lack of transparency is troubling, but not surprising. Facebook can introduce and change features that influence elections, which makes it an enormously powerful political tool, and that means the company's ability to sway voters will be of great interest to politicians and other powerful figures.
Perhaps that is why Facebook says it is not going to conduct any experiments during the upcoming election. I've contacted Facebook to ask why it decided to stop experimenting with the voting tool; it seems like a strange move. Obviously the company should be more transparent about conducting experiments that demonstrably influence elections (and, really, the experiments are disturbing examples of Facebook's capacity for social engineering, a capacity examined by sociologist Zeynep Tufekci and referenced by Mother Jones).
But why stop now, from Facebook's perspective? Perhaps the resoundingly negative response to its experiment on how the News Feed can influence users' emotions has prompted caution.
Or maybe they're still experimenting and just not telling us. [Mother Jones]
Photo via Playerx/Flickr (CC BY 2.0)