In recent years, Reddit has banned a bevy of far-right troll havens, including its board for the white supremacist "alt-right" and others used to harass women and minorities. The bans reversed Reddit's earlier policy of leaving "questionable" content alone, and they drew predictable outrage, given that years of non-intervention had fostered an explosion of very active fringe communities, many of them far-right or openly racist.
Members of the banned communities portrayed themselves as martyrs, while a slew of other Redditors argued either that a free-speech ethos requires tolerating hate speech or that the bans wouldn't work. But a new study from Georgia Institute of Technology, Emory University and University of Michigan researchers suggests there's little ambiguity: Banning hateful communities from Reddit worked.
The researchers pulled over 100 million Reddit posts from before and after administrators banned the fat-shaming r/fatpeoplehate and white supremacist r/CoonTown boards in 2015. They then developed a metric to quantify the level of hate speech in each post.
Of the thousands of active members of those boards, the team found, a significantly higher percentage gave up and abandoned their accounts than in the control group. Those who stayed were not whipped up into a greater fury; instead, they decreased their level of hate speech "by at least 80 percent" in subsequent posts:
For the banned community users that remained active, the ban drastically reduced the amount of hate speech they used across Reddit by a large and significant amount. Following the ban, Reddit saw a 90.63% decrease in the usage of manually filtered hate words by r/fatpeoplehate users, and an 81.08% decrease in the usage of manually filtered hate words by r/CoonTown users (relative to their respective control groups). The observed changes in hate speech usage were verified to be caused by the ban and not random chance, via permutation tests.
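The permutation tests the researchers cite work by repeatedly shuffling the group labels and asking how often a difference as large as the observed one arises by chance. The sketch below is a generic illustration of that idea, not the paper's actual code: the function name is hypothetical and the sample data are made-up per-user changes in hate-word usage.

```python
import random

def permutation_test(treatment, control, n_permutations=10_000, seed=0):
    """Two-sided permutation test on the difference in group means.

    `treatment` might hold per-user changes in hate-word rates for
    banned-subreddit users, `control` the same for matched controls.
    Returns the fraction of label shuffles producing a difference at
    least as extreme as the one observed.
    """
    rng = random.Random(seed)
    observed = sum(treatment) / len(treatment) - sum(control) / len(control)
    pooled = treatment + control  # new list; originals are untouched
    n_treat = len(treatment)
    more_extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # randomly reassign group labels
        diff = (sum(pooled[:n_treat]) / n_treat
                - sum(pooled[n_treat:]) / (len(pooled) - n_treat))
        if abs(diff) >= abs(observed):
            more_extreme += 1
    return more_extreme / n_permutations

# Made-up toy data: treated users drop their hate-word usage sharply,
# controls barely change, so the p-value comes out small.
p = permutation_test([-0.9, -0.8, -0.95, -0.7, -0.85],
                     [0.02, -0.05, 0.01, 0.03, -0.02])
print(f"p-value: {p:.4f}")
```

A small p-value here means the drop is very unlikely under random chance, which is the sense in which the study's reductions were "verified to be caused by the ban."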
According to the researchers, banning the two subreddits did not appear to “spread the infection” to others in some kind of rolling migration. Instead, r/CoonTown users tended to move to other forums where “racist behavior has either been noted or is prevalent,” including r/The_Donald, r/homeland and r/BlackCrimeMatters. Former r/fatpeoplehate users migrated to “qualitatively different” subreddits like r/RoastMe, or those dedicated to video games or TV shows.
As the researchers explained:
We observed no change in the hate speech usage of migrants in the invaded subreddits postban (p-value ≥ 0.122; the lower bound in Table 6), nor did we see any significant change in the hate speech usage of preexisting users in these subreddits (p-value ≥ 0.136). In simpler terms, the migrants did not bring hate speech with them to their new communities, nor did the longtime residents pick it up from them.
In short, the team concluded, the ban worked. They theorized that although hordes of people from the banned communities flooded into other parts of the site, administrators were successful in shutting down duplicate subreddits, and moderators were able to maintain control of the communities where those users ended up.
Beyond the initial bursts of anger, the site did not enter some kind of permanent rage spiral. By removing the communities driving the worst behavior on the site, Reddit purged part of the problem.
The team did find evidence the bans sent racist, fat-shaming Redditors searching for other sites—making them “someone else’s problem.” But the places they ended up tended to be “darker corners of the internet” like “Voat, Snapzu, and Empeopled.”
Passing the buck doesn't end the underlying problem, you say? That's not the point of a moderation policy. From Reddit's perspective, it doesn't matter where Nazis are as long as they're not ruining the site, and it probably doesn't matter to other Redditors, either. Weeding your garden has a negligible impact on the worldwide population of weeds, but without weeding you don't have a garden at all. You just have weeds.
Moreover, there are only so many places to pass the buck to. After an alleged neo-Nazi terror attack at a white supremacist rally in Charlottesville last month, tech companies raced to enforce policies against hate speech, driving prominent neo-Nazi sites like Stormfront and the Daily Stormer off the public face of the web entirely. At some point, as when crowdfunding services like Patreon decided racists were no longer welcome, this has an impact on the far right's ability to recruit and organize.
Unfortunately, as Gizmodo reported last year, if the strategy worked in 2015, Reddit has not been consistent in its application, especially in the era of Donald Trump, when subreddits like r/The_Donald have seemingly become untouchable due to fears of another user revolt or of accusations that Reddit is politically biased.
There's always the omnipresent and uncomfortable feeling that a handful of web giants increasingly control access to the public sphere and could abuse that power, of course. But on the other hand, as has become repeatedly apparent with sites like Facebook, the powers that be are often uninterested in policing what goes on at all, or set up bare-bones moderation that doesn't protect users from harassment. This report is just a little more evidence that if they cared, they could try a little harder.