
TikTok and Bumble Are Cracking Down on Nonconsensual Nude Uploads

The social media platform and dating app have joined the fight against the nonconsensual sharing of intimate photos.

The platforms are using a self-reporting workflow that Meta first started using in Australia in 2017.
Image: XanderSt (Shutterstock), Boumen Japet (Shutterstock)

What’s sometimes called “revenge porn” is a form of abuse in which someone’s intimate photos are used against them, usually by an ex, to extort, manipulate, or just make their life a living hell. In the age of the internet, it’s everyone’s worst nightmare, and TikTok and Bumble are trying to do something about it.

According to a report from Bloomberg, TikTok and Bumble are the latest internet entities to step up to the responsibility of making sure their users, and their users’ photos, are safe from abuse. Considering that TikTok reported hitting a billion global users last fall and that Bumble was the second most downloaded dating app in 2022, it’s about damn time.


“Bumble cares deeply about providing people, and particularly women, with the tools they need to feel empowered online,” said Lisa Roman, Bumble’s VP of Public Policy, in an email to Gizmodo. “Our participation in StopNCII.org is another important step in this direction.”

“Our goal at TikTok is to foster a safe and supportive environment for our community, and there’s no place for this kind of malicious behavior or content on our platform,” said TikTok Head of Product Policy Julie de Bailliencourt in a separate email to Gizmodo. “We’re proud to partner with StopNCII.org to strengthen efforts to stop the spread of non-consensual intimate imagery and better support victims.”


TikTok and Bumble will now detect and block images reported through Stop Non-Consensual Intimate Image Abuse’s tool at StopNCII.org. StopNCII converts a sensitive image into a digital fingerprint, called a hash, without the image ever having to leave the victim’s device. Those hashes are then shared with TikTok and Bumble, which scan their platforms for matching content and pull it if it’s found.
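To make the hash-and-match workflow concrete, here is a minimal sketch in Python. It is an illustration only, not StopNCII’s actual code: the open-source imagehash library’s perceptual hash stands in for whatever fingerprinting algorithm the service really uses, and the fingerprint and matches_blocklist helpers are hypothetical names.

```python
# Illustrative sketch only -- not StopNCII's actual implementation.
# Uses the open-source `imagehash` library's perceptual hash as a
# stand-in for whatever fingerprinting algorithm the service employs.
from PIL import Image
import imagehash


def fingerprint(path: str) -> imagehash.ImageHash:
    # Runs entirely on the victim's device: only this short hash is
    # ever shared, never the image itself.
    return imagehash.phash(Image.open(path))


def matches_blocklist(upload_hash: imagehash.ImageHash,
                      blocklist: list[imagehash.ImageHash],
                      max_distance: int = 4) -> bool:
    # Platform-side check: the `-` operator gives the Hamming distance
    # between two hashes, so lightly edited copies (re-compressed,
    # resized) still land within a small threshold of the original.
    return any(upload_hash - known <= max_distance for known in blocklist)


# Hypothetical usage: a platform screens a new upload against hashes
# submitted by victims through a service like StopNCII.
# blocklist = [fingerprint("reported_image.jpg")]
# if matches_blocklist(fingerprint("new_upload.jpg"), blocklist):
#     ...block the upload and flag it for review...
```

The key design point is that matching happens on fingerprints rather than raw files, so victims never have to hand over the sensitive images themselves.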

This workflow was first pioneered by Facebook, now Meta, which began testing a similar program in Australia in 2017. Since then, Instagram and Facebook have been running a souped-up version of the tracking system that has aided a reported 12,000 users and found 40,000 photos and videos. Meta also recently announced a more nuanced version of the reporting tool for minors as part of the company’s efforts to protect underage users from “suspicious adults.”

Revenge porn affects an estimated 1 in 12 U.S. adults, but when social media platforms frame these efforts as a win, they sidestep the way they have, intentionally or not, built and fostered digital infrastructure where sharing intimate images against the will of those in them is easy. TikTok, Meta, and even Bumble are only now reckoning with the need to better police the digital environments they helped create.

Updated December 1 12:43 p.m. ET: This article was updated to include quotes from Bumble and TikTok.