It’s no surprise that people tamper with Wikipedia entries on a regular basis, but it turns out that especially dedicated trolls have been sabotaging entries on politically controversial science topics like evolution and global warming.
Researchers Gene Likens of the University of Connecticut and Adam Wilson of the University at Buffalo found that hot-button science topics that, despite consensus in the scientific community, still have detractors among the general public are edited more heavily and more often than less controversial science topics. And they’re worried about what that might mean for readers’ access to reliable information on important topics like global warming. They recently published their results in the journal PLOS ONE.
“Following a long-standing research interest and expertise in acid rain, we noticed that some corrections we or others made on the acid rain article had been changed by major edits to introduce (or re-introduce) balderdash and factual errors into the content,” wrote Likens and Wilson. They wanted to know if other science topics received the same treatment.
The researchers looked at Wikipedia entries for three topics which, although scientifically well established, are still the subject of political controversy: acid rain, evolution, and global warming. They compared the three hot-button topics with four topics that even the most anti-science hardliners don’t usually dispute: continental drift, general relativity, heliocentrism, and the standard model in physics.
Likens and Wilson downloaded the full edit history for all seven articles, which gave them almost ten years’ worth of data, including each article’s average number of daily edits, the size of an average edit, and how many people read each article on an average day.
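The metrics described above can be sketched in a few lines of Python. This is not the researchers’ actual code; it’s a minimal illustration, with hypothetical revision data, of how an average daily edit rate and a mean edit size could be derived from a page’s revision history (each revision recorded as a timestamp and the page’s size at that point).

```python
from datetime import datetime

def edit_metrics(revisions):
    """Compute (edits per day, mean edit size in bytes) from a list of
    (ISO timestamp, page size in bytes) tuples, sorted oldest to newest.
    Hypothetical helper for illustration -- not the study's actual code."""
    times = [datetime.fromisoformat(ts) for ts, _ in revisions]
    sizes = [size for _, size in revisions]
    # span of the history in whole days (at least 1 to avoid division by zero)
    span_days = max((times[-1] - times[0]).days, 1)
    edits_per_day = len(revisions) / span_days
    # size of each edit = absolute change in page size between revisions
    deltas = [abs(b - a) for a, b in zip(sizes, sizes[1:])]
    mean_edit_size = sum(deltas) / len(deltas) if deltas else 0
    return edits_per_day, mean_edit_size

# Hypothetical sample: four revisions spanning roughly two days
sample = [
    ("2014-03-01T09:00:00", 41000),
    ("2014-03-01T15:30:00", 41600),
    ("2014-03-02T08:10:00", 41150),
    ("2014-03-03T11:45:00", 41900),
]
rate, size = edit_metrics(sample)  # 2.0 edits/day, 600.0 bytes/edit
```

In practice, a full revision history like the ones the researchers used can be downloaded through the MediaWiki API, which returns each revision’s timestamp and page size.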
They found that politically controversial science topics were edited much more heavily than other science topics. Articles on acid rain, global warming, and evolution were edited more times each day than the other four articles, and their edits involved larger changes, on average. For example, the global warming entry is edited two or three times on an average day, and each edit changes more than 100 words of the article; meanwhile, the entry for the standard model in physics only has about 10 words changed every few weeks. Acid rain got less attention than global warming and evolution, but still more than the four politically neutral topics.
More people read the three politically controversial entries each day, too. Global warming averaged 15,000 to 20,000 pageviews a day, compared with only 1,000 to 1,500 for heliocentrism. The higher traffic to these articles probably explains at least some of the difference in editing rates, but it also means that more people are turning to these articles for accurate information, and “edit wars” mean they may not be getting it.
How Wikipedia Works - and Sometimes Doesn’t
Wikipedia’s content is written and maintained by a large community of volunteer editors, who can add, remove, rearrange, or rewrite material in any article, any time. When an editor makes a change, it’s up to other editors to notice and, if necessary, revert the article to the previous version or make their own edits. If there’s a dispute, editors can discuss it on a “talk” page dedicated to each article.
In theory, this sounds like a great system, as long as everyone involved is making a legitimate effort to create good, factual entries. According to Likens and Wilson, it’s actually not so different from the process of peer review that’s used to evaluate scientific research, except that on Wikipedia, things get published before they’re reviewed. There’s also no way to permanently reject incorrect material, so some of it just keeps coming back.
And, wrote the researchers, “the motivation, commitment, and qualifications of Wikipedia’s editors are typically unknown (especially for anonymous edits).” So, as with much of the internet, trolls find ways to abuse Wikipedia’s editing system. If someone starts an “edit war,” adds misinformation, or just posts something ridiculous as a prank, thousands of people will read the incorrect version before it’s fixed.
It’s sadly unsurprising that people make prank edits or post misinformation on Wikipedia all the time; what’s more significant is that these bad-faith editors are targeting science topics at the center of public policy discussions, which remain polarizing issues for the general public.
Fighting the Trolls is Everyone’s Job
When a page gets edited several times a day, it’s hard for legitimate editors, especially experts on the topic, to keep up. “On entries subject to ‘edit wars,’ like acid rain, evolution, and global [climate] change, one can obtain — within seconds — dramatically different information on the same topic,” Likens said in a recent statement.
Wikipedia tries to keep “edit wars” and trolling under control, but it can only do so much. Some topics have “protected status” to block anonymous edits, for instance, which is supposed to cut down on vandalism, and a rule blocks editors from reverting an article to the previous version more than three times in the same day, which is supposed to prevent “edit wars.” The site also has algorithms that detect obvious things like profanity. They’re not perfect, but they’re constantly improving. More subtle malicious edits are harder to detect, however, and it’s up to editors to find and fix them manually.
It’s also up to readers to think critically about sources and understand Wikipedia’s limitations, say the researchers. Wikipedia requires a source to back up each fact in an entry, and editors are expected to find and remove facts that don’t have good sources. Increasingly, on Wikipedia’s science pages, those sources are established scientific journals, which makes it easier for readers to verify that information is accurate and comes from a reliable place. Wilson and Likens encourage readers to check sources, especially on pages for politically controversial topics.
But ultimately, it pays to keep in mind what Wikipedia is and what it isn’t. “What is needed is a wider appreciation of how to best leverage the vast quantity of information in Wikipedia to take advantage of its strengths (vast coverage and frequent updates) and avoid its weaknesses (potential for errors, conflict between editors, and content stability),” wrote Likens and Wilson.
And Wikipedia itself says, “It is in the nature of an ever-changing work like Wikipedia that, while some articles are of the highest quality of scholarship, others are admittedly complete rubbish. We are fully aware of what it is and what it isn’t. Also, because some articles may contain errors, please do not use Wikipedia to make critical decisions.”
UPDATE 8/17: The Wikimedia Foundation, the nonprofit organization that runs Wikipedia, has expressed concerns about the study’s conclusions. A representative told Gizmodo, “We find some of the coverage of this study overstates findings, or infers facts not in evidence. For example, the authors of this study do not seem to have successfully correlated the frequency of edits to controversial articles with an increased likelihood of inaccuracy. Instead, the study simply seems to confirm that the articles chosen as controversial are, in fact, controversial.”
Wikipedia’s editors are currently critiquing the study and its methodology. Several of the editors involved in the discussion say the study only stated the obvious: of course controversial articles will see more malicious edits, which then require more edits to fix.
At least some of the difference in editing rates between politically controversial and noncontroversial topics may be linked to how much new research is being done on these topics. There’s little or no new information to be added to an entry on heliocentrism, for instance, but global warming and evolution are still active areas of research, and new findings require Wikipedia updates. In fact, one Wikipedia editor noted that the articles that were edited most frequently in the PLOS ONE paper also had more new references added during the period covered by the study.
Editors have also questioned the study’s statistical methods, and some say the researchers used too few articles to reach their conclusions.
Others acknowledge that advocates of unscientific positions on topics like climate change aren’t uncommon on Wikipedia and that their actions harm the site’s coverage of those topics. The Wikimedia Foundation, however, says that Wikipedia’s editors are on top of things. “Volunteer editors and administrators regularly ensure content meets the site’s policies and guidelines. Vandalism and inaccuracies occur, but thanks to Wikipedia’s open, collaborative model the vast majority of inaccurate content is removed within minutes,” said a representative.