Reddit is facing a class-action suit from a woman who alleges the platform did little to stop revenge porn featuring her as a minor from being uploaded countless times. The Jane Doe behind the suit is pursuing the class action on behalf of anyone whose sexually explicit images, taken while they were underage, were allowed to spread across the platform over the past decade.
The suit alleges that from 2019 through today, the plaintiff’s now ex-boyfriend used 36 different subreddits to distribute lewd pictures and videos featuring her as a 16-year-old. Per the suit, Doe spent “hours” flagging the content across more than two dozen subreddits, both because it was uploaded without her consent and because it featured a minor. Even after she flagged it multiple times, she alleges, Reddit would sometimes wait “several days” to take the content down.
When the ex eventually got banned over this behavior, he simply started a new account and began posting the same content anew, “often to the exact same subreddit,” the suit explains.
“Without Reddit’s assistance the situation is hopeless,” the suit reads. “The circulation of the videos and images, and the effort she has had to undertake to both locate them and negotiate with Reddit to have them removed, has caused Ms. Doe great anxiety, distress and sleeplessness [...] resulting in withdrawing from school and seeking therapy.”
All of this, the plaintiff claims, violates FOSTA-SESTA, a controversial 2018 law that was meant to curb “online sex trafficking,” but by all accounts did anything but. In a nutshell, the law amends Section 230 of the Communications Decency Act, stripping certain protections in cases where platforms like Reddit are found hosting sex trafficking-related content. In this case, the suit claims that Reddit ran afoul of FOSTA-SESTA when it ran ads alongside this woman’s content, turning those clips into “commercial sex acts” that Reddit “knowingly” profited from when it refused to take them down.
In response to Gizmodo’s request for comment, a Reddit spokesperson told us that content featuring child sexual abuse “has no place on the Reddit platform.”
“We actively maintain policies and procedures that don’t just follow the law, but go above and beyond it,” the spokesperson said, adding that the platform uses “both automated tools and human intelligence” to keep this sort of content from surfacing. But as we’ve seen in the past, sometimes automation just isn’t enough to catch the worst that these sites have to offer.