On Wednesday, the Senate approved a bill that on its face is intended to curb online sex trafficking. However, experts say it will increase online censorship, stifle innovation, and make everyone less safe online. The House has already passed a version of the bill, so it will likely become law soon; all that’s left is a signature from President Trump.
The hapless members of the US Congress are good at piecing together legislation that makes it look like they’ve done something but really just confuses the issue. Case in point: the Stop Enabling Sex Traffickers Act of 2017 (SESTA) has the kind of name that makes it nearly impossible to oppose in public, despite the fact that it could lead to more enabling of sex trafficking online. Originally proposed by Senator Rob Portman, it’s gone through some changes and has been adapted into the House’s Fight Online Sex Trafficking Act (FOSTA) bill, which passed in February. There are a lot of little details and minor differences between each version of what the EFF has called a “Frankenstein’s Monster of a bill.” For the sake of clarity, let’s just call it all SESTA, since the criticisms are mostly consistent between the House and Senate versions of the legislation.
SESTA was prompted by a case involving Backpage.com, in which executives of the site were arrested on charges of pimping a minor, pimping, and conspiracy to commit pimping. Backpage’s “adult” section was allegedly used for prostitution, but courts dismissed the case based on Section 230 of the Communications Decency Act. That little section of the law is one of the most important reasons we have the internet that we do today. It allows companies to avoid most legal liability for content created by others. It’s a big reason social media networks can exist, and it’s also important for messaging, email, comments, and numerous other online services.
A Senate committee report found in 2016 that Backpage is likely not eligible for immunity under Section 230, and the company still faces a Justice Department investigation and a renewed lawsuit from sex trafficking victims. But the wheels of justice turn slowly, and at least two powerful factions have supported token legislation that appears to be doing something about the problem of bad content online: lawmakers and tech giants.
SESTA weakens Section 230 in an effort to give sex trafficking victims greater ability to sue websites and state prosecutors the ability to hold companies criminally liable for user-generated content. Its critics say SESTA will lead some platforms to over-censor content with heavy-handed algorithms because it’s just too expensive to police it with humans and algorithms are bad at nuance. On the flip-side, many companies might abandon moderation altogether in order to ensure that it can never be proven they “knowingly” facilitated the offending material.
Lawmakers obviously want to chalk up a win against internet companies as more issues arise around online activity. But tech companies initially opposed SESTA. The Internet Association, which counts Amazon, Google, Facebook, Microsoft, and Twitter among its members, protested the first version of SESTA. But when an amended version appeared last November the group decided that it could live with the bill. Only some minor legal language was changed—such as the phrase “by any means” being removed from the following line: “The term ‘participation in a venture’ means knowing conduct by an individual or entity, by any means, that assists, supports, or facilitates a violation.”
But advocates like the Electronic Frontier Foundation said the changes weren’t enough. In a blog post, an EFF lawyer wrote:
As we explained [before], the words “assist, support, or facilitate” are extremely vague and broad. Courts have interpreted “facilitate” in the criminal context simply to mean “to make easier or less difficult.” A huge swath of innocuous intermediary products and services would fall within these newly prohibited activities, given that online platforms by their very nature make communicating and publishing “easier or less difficult.”
The biggest Silicon Valley companies, which seem to have ultimately accepted this legislation as inevitable, have an advantage that smaller companies won’t. They’re able to turn to high-priced lawyers to stave off the lawsuits that will likely ensue. They’re also capable of implementing complex algorithms to monitor their services that a little guy wouldn’t necessarily have access to. This is how SESTA could stifle innovation: The pseudo-monopolies will survive, but new competitors could be priced out or discouraged from entering the arena. Longtime internet activist Mike Godwin used this analogy:
They’d likely face the hard choice of either supercensorship (yank anything users say or post that seems even remotely likely to pose legal risk) or just abandoning the startup project altogether. (Internet-law experts refer to this as “the moderator’s dilemma.”) A would-be Facebook killer wouldn’t be able to compete by being a better Facebook—the best it could aim for is to be a better Prodigy. Prodigy, the original “walled garden” of online services, was an early competitor among online companies, with its forums highly moderated by its staff—you couldn’t post your content publicly on the service without subjecting your postings to screening by Prodigy editors. Unsurprisingly, Prodigy in its original form didn’t do well in the long run competing initially with more open, less moderated services like AOL and CompuServe or, ultimately, with the offerings of the wide-open internet itself.
Facebook is also a good example to use when imagining how big tech might respond to SESTA. The social network is currently embroiled in a data-sharing scandal that is opening the public’s eyes to the fact that the service is a privacy nightmare. One whistleblower told the Guardian that Facebook avoided scrutinizing or auditing how third parties handled user data because it didn’t want to know. Privacy and security are separate issues from SESTA, but it’s easy to see that Facebook would likely go the route of using extreme filters rather than risk human judgment opening it up to any liability.
This means that not only would everyone’s speech likely fall victim to the occasional robo-monitoring ax, but it would be harder for the victims of sex trafficking to use these powerful platforms to reach out for help, tell their story, or bring a spotlight to an abuser. An algorithm could easily confuse the language of a victim and a sex trafficker. As Alex Levy, a professor at Notre Dame Law School who teaches Human Trafficking and Human Markets, wrote last year:
The war on Internet platforms is pageantry: a kind of theater designed to satisfy people’s need to identify and fight bad guys without regard to nuance or long-term outcome. But from a tactical standpoint, it is more than a distraction. Censoring these platforms means forfeiting a resource that naturally facilitates the recovery of victims...
Section 230 doesn’t cause lawlessness. Rather, it creates a space in which many things — including lawless behavior — come to light. And it’s in that light that multitudes of organizations and people have taken proactive steps to usher victims to safety and apprehend their abusers.
Critics have also argued that sex workers just trying to make a living could be further criminalized and forced out into the streets where violence, drugs, and health risks are more of a problem.
There’s little doubt that online platforms are being used for hate speech, terrorist recruiting, harassment, revenge porn, election meddling, and many other crimes. But there are already legal tools available for prosecuting the worst offenders, and Backpage executives could still face consequences. This legislation will too easily force everyday users, startups, and victims to pay the price. And with Section 230 weakened, there’s no telling what can of worms might be opened in the future.