TikTok, once viewed by many as a type of digital life raft largely removed from politics, may be “consistently and dramatically suppressing” nonpartisan voting content on its platform amid the high-stakes midterm elections. If confirmed, the supposed suppression efforts could risk reigniting still-smoldering concerns about social media’s behind-the-scenes impact on political participation. TikTok’s foreign ownership and embattled history with U.S. lawmakers risk exacerbating those concerns even further.
The research, conducted by non-profit media lab Accelerate Change and shared with Gizmodo, suggests videos with influencers using election-related words were viewed far less than nearly identical videos where those same terms were not said. Accelerate Change says its research has so far drawn 370,000 views across 20 different paired videos. The videos in each pair were nearly identical, with one including verbal uses of political words like “mid-terms” and “get out the vote,” and the other featuring those words handwritten on a sign rather than spoken. The TikToks with the handwritten election terms reportedly received three times as many views as the videos where influencers spoke election-related words out loud.
Though it’s difficult to make a conclusive statement on something as complex as a notoriously black-boxed social media algorithm based on a single limited study, Accelerate Change’s President Peter Murray believes the implications of the findings are clear: TikTok’s algorithm suppresses election content.
“Often with an algorithm performance experiment like this, you struggle to see a pattern in the data, but in this case the result was dramatic and clear: TikTok is suppressing more than 65% of voting video views,” Murray said.
Ahead of the U.S. midterm elections, TikTok did announce some changes to its content policy regarding politics. The company moved to bar videos that contain political fundraising efforts and also mandated verification for U.S.-based government and political accounts. The study acknowledged some of those policy changes but maintains the social media giant is not doing enough to encourage voter engagement.
In an email sent to Gizmodo, a TikTok spokesperson questioned various elements of Accelerate Change’s research methodology and said the company does not have any special guidelines for “political content” broadly. (TikTok does have different policies related to accounts belonging to governments, politicians, and political parties; however, those should not apply to Accelerate Change’s influencer-uploaded videos.)
“All content—audio, visuals, text, stickers, captions, etc.—is moderated in accordance with our Community Guidelines which apply to everyone and everything on the platform, and we strive to consistently and accurately enforce these policies,” the spokesperson added.
On the methodology front, the spokesperson took issue with the fact that the paired videos were apparently posted at different times and on different days, variables that could affect view counts independently of the use of the political terms. One of the verbal-only videos, according to the TikTok spokesperson, was allegedly deleted after it garnered 15K views. The spokesperson claims none of the posts included in the Accelerate Change report were “moderated.”
In an email exchange with Gizmodo, Murray acknowledged one of the influencers had indeed deleted a verbal post in error after it had amassed around 15K views, but said that video, even when combined with a reposted version, still amassed far fewer views than the nearly identical version with the written-out election words. As for the methodology criticisms, Murray said he and his team randomized the influencers’ posting times in an attempt to control for those variables.
Murray particularly took issue with the spokesperson’s claim that the social media firm did not moderate any of Accelerate Change’s videos, arguing that choice of words danced around the issue of algorithmic suppression entirely.
“This is a dodge,” Murray told Gizmodo. “TikTok only moderates a small fraction of videos, but their algorithm is tuned to downgrade videos with voting words (and likely political words as well). Not being moderated doesn’t mean that they weren’t automatically suppressed in the algorithm.”
The research gained the attention of some tech policy experts like Tech Oversight Project Executive Director Sacha Haworth, who described the current interplay between social media and politics as a “code red moment.”
“We have been sounding the alarm for weeks,” Haworth told Gizmodo. “Whether it is TikTok, YouTube, Twitter, or Facebook, Big Tech platforms have abandoned their role in ensuring free and fair elections in the United States. They are spreading disinformation and suppressing basic pro-democracy information like where to find your polling place.”
In addition to the paired video experiment, Accelerate Change says it conducted a hash analysis and estimated voting messages have over 25 billion views on the platform. The company used its previous findings to argue that TikTok could be limiting 30 billion views of voting messages in 2022.
Here too, the TikTok spokesperson pushed back and said it was “unclear” how the researchers arrived at their hash estimates. Gizmodo could not independently verify these projections. In response to Gizmodo’s emails, Murray said he expected his firm’s analysis was actually undercounting the total number of voting video views that TikTok is suppressing.
Though other social media platforms like Facebook and YouTube have drawn heavy scrutiny over the past decade for spreading political misinformation and even acting as vehicles for acts of political violence, Accelerate Change claims these findings would represent the first proven instance of “widespread suppression of nonpartisan voting content.”
The findings come just weeks after a separate report published by Global Witness and the Cybersecurity for Democracy team at New York University found TikTok performed the worst among social media companies at detecting and preventing inaccurate political advertisements, even though TikTok has technically banned political ads since 2019.
“TikTok has failed to implement the basic nonpartisan voter engagement steps that other platforms like Instagram and Snapchat have implemented,” Murray told Gizmodo. “They have no comprehensive voter registration program for users and no comprehensive messaging of users about the upcoming elections.”
“Again, the evidence points to them consistently suppressing nonpartisan voting videos.”