It’s been three years since Google, Facebook, and other major tech platforms introduced ways for users to report and remove revenge porn, and two years since a bill to criminalize the distribution of non-consensual pornography was introduced in Congress. And yet, advocates say the problem is just as bad, if not worse, for victims—and for survivors of domestic violence most of all.
Organizations that work with abuse survivors say that their clients are increasingly dealing with digital security problems in addition to physical ones and that the threat of distributing revenge porn is a nearly ubiquitous problem in abusive relationships. As the plague of nonconsensual pornography spreads, other advocates say that they’re hoping the Me Too movement will be the catalyst that finally pushes large tech companies and Congress to act.
“I would have thought that this is the best moment for it to happen because we’ve had the Marines United scandal, we’ve had now the incredible Me Too movement, we’ve had basically so many different indications in our culture that now is the time to think about consent seriously,” says Mary Anne Franks, a professor at the University of Miami School of Law and vice president and legal director of the Cyber Civil Rights Initiative, a non-profit that assists victims of online abuses. “One important piece of that, of course, is consent to the exposure of our private information, especially our sexually explicit private information.”
“We really don’t see clients anymore that aren’t experiencing some sort of abuse that isn’t based in the use of technology,” says Stephanie Nilva, the executive director of the anti-domestic violence youth organization Day One. Digital abuse can include constant texting, geolocation tracking, and the release of revenge porn, she notes.
Technology can intensify harassment and stalking because it increases an abuser’s access to their target. Nilva says her organization, which provides education and remediation to people 24 and younger, often helps clients remove spyware or keyloggers from their phones, get revenge porn taken down from social media platforms, or remove sockpuppet accounts that purport to be the client and spread harmful content to their friends and family.
These forms of abuse have become so prevalent that organizations focused on digital abuse have sprung up to support traditional domestic violence groups. Erica Johnstone, who co-founded Without My Consent to specifically address online abuse, works with domestic violence organizations in California to help them address revenge porn. “The next wave of violence against people is using technology to commit that violence,” she says.
“It’s very extraordinarily common in all cases, really, but particularly with teens, when we screen for nonconsensual pornography, almost always they will tell us that they’ve been either threatened by their partner or the threat has been carried out, images have been shared or been threatened to be shared with somebody else or the general public at large as a means to exert power and control over them,” says Adam Dodge, legal director at Laura’s House, a domestic violence shelter in California, and public policy co-chair with the California Partnership to End Domestic Violence.
In many cases, victims of revenge porn “won’t even mention nonconsensual pornography because they will be focused on things like physical abuse or harassment or stalking or something of that nature,” Dodge says. Because of this, shelters have been screening incoming victims to find out if they’ve been subjected to this form of abuse.
“Because it’s so common to sext and exchange intimate photos,” he says, these images give abusive people “such extraordinary ability to wield power and control over somebody.”
‘Using a Pickaxe to Do Needlepoint’
When someone non-consensually releases photos of you online, your legal options are messy, an issue compounded by a patchwork of state laws that complicate the legal process. Major platforms like Facebook, Twitter, and Tumblr all ban nonconsensual porn, but these companies leave the responsibility of finding it and reporting it to victims.
Although most major technology companies publish transparency reports detailing the requests they receive for content takedowns, only Microsoft’s report includes revenge porn. Between January and June of last year, its most recent reporting period, Microsoft took down content in response to only 57 percent of the revenge porn reports it received. In prior periods, Microsoft claimed that most of the reports it chose not to act on did not actually include nonconsensual pornography.
Some advocates encourage victims of revenge porn to file copyright takedowns, either by asserting that they own the copyright to an offending photo because they took it themselves, or by convincing the photographer to assign copyright to them. This tactic can prove successful. In the first half of 2017, Microsoft honored 99.65 percent of the requests it received for copyright takedowns.
Others find more success in going through family court and applying for restraining orders. Without My Consent advises victims to collect as much evidence as they can, regardless of the options they choose to pursue, and offers a guide to doing so.
Victims can also pursue criminal cases against someone who posts images of them, but they often end up dealing with law enforcement officials who don’t understand the issue. There’s an extensive debate within the advocacy community about whether revenge porn should be addressed with law governing cyberbullying, harassment, copyright, or even free speech, and the right solution often depends on the victim’s specific circumstances.
“You have police and prosecutors who are operating with ancient tools,” Nilva says. “It’s like using a pickaxe to do needlepoint.”
The problems are compounded for victims who are underage. Minor victims are sometimes charged with distributing child pornography because they shared a photo of themselves. “Teens should just not create any of this content,” says Johnstone. “They can’t because it’s too high risk for them. It’s high risk for adults. But it’s really high risk when you could be labeled a child pornographer and sex offender for the rest of your life. I don’t think they understand what the stakes are. If they did, they would not be taking and sharing these photos.”
A Cultural Shift
To address these legal issues, Franks suggests looking at revenge porn as a privacy problem rather than a harassment problem. Some proposed legislation has included language about proving the harasser’s intent to cause harm, which could make cases difficult to prosecute.
“You know that medical data is private, you know that financial data is private, and it doesn’t turn on the question of whether or not the person meant to hurt your feelings by using it. It doesn’t matter and you should recognize the same thing in this context,” Franks explains. She encourages lawmakers to consider First Amendment and civil rights issues when crafting legislation.
“If we care about people’s freedom to express themselves we have to understand that revenge porn is a First Amendment matter from the victim’s perspective,” she says. “If revenge porn goes unpunished, women—especially because they tend to be targeted disproportionately—do not get to speak.”
Tech companies also need to step up, advocates say, and be more responsive to victims and the organizations that work with them. “I think the tech companies need to be working in much closer partnership with domestic violence organizations to really understand,” Nilva says. Day One also advocates for mandatory preventative education in schools that includes information about technology and the abuse it can enable.
Facebook is experimenting with a pilot program that lets people preemptively submit photos of themselves before an abuser can post them. The social network will store hashes of the photos and prevent matching images from ever being uploaded to the platform in the first place.
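The hash-matching idea behind the pilot can be sketched in a few lines. This is a hypothetical simplification: production systems like Microsoft’s PhotoDNA or Facebook’s own matching technology use perceptual hashes that survive resizing and re-encoding, while this demo uses an exact SHA-256 match, and all names here (`UploadFilter`, `preemptive_submit`) are illustrative, not Facebook’s actual API.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Return a fingerprint of the image; the raw image itself is never stored."""
    return hashlib.sha256(image_bytes).hexdigest()


class UploadFilter:
    """Toy model of hash-based preemptive blocking (illustrative only)."""

    def __init__(self) -> None:
        self.blocked_hashes: set[str] = set()  # only fingerprints are retained

    def preemptive_submit(self, image_bytes: bytes) -> None:
        """A person submits a photo in advance; store its hash, discard the photo."""
        self.blocked_hashes.add(image_hash(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        """Reject any upload whose hash matches a preemptively submitted photo."""
        return image_hash(image_bytes) not in self.blocked_hashes


filt = UploadFilter()
filt.preemptive_submit(b"...private photo bytes...")
print(filt.allow_upload(b"...private photo bytes..."))  # False: upload blocked
print(filt.allow_upload(b"some unrelated photo"))       # True: upload allowed
```

The key property the sketch illustrates is that the platform can block re-uploads while holding only fingerprints, not the images themselves; an exact-match hash, however, would miss a copy that had been cropped or recompressed, which is why real systems rely on perceptual hashing.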
“Hate speech policies and privacy policies can do a little bit and they will be helpful in the short term,” says Franks. “But if we’re actually going to put a stop to this type of behavior, we’re going to have to have a cultural shift that will actually force us to ask ourselves: Do we think the kinds of things that Twitter makes possible or Facebook makes possible or Google makes possible are actually worth it in the end, given how much harm they can cause?”
Additional reporting by Andrew Couts
This story was produced with support from the Mozilla Foundation as part of its mission to educate individuals about their security and privacy on the internet.