On Tuesday, Amnesty International released its findings from a 16-month investigation into the experiences women face on Twitter—namely, online abuse. The 77-page report, #ToxicTwitter: Violence and abuse against women online, illustrates Twitter’s failures to protect its most vulnerable users and to be transparent about those failures. The report is being released 12 years after the first tweet, which was sent by Twitter co-founder and current CEO Jack Dorsey.
“The main point we keep coming back to time and again is Twitter’s failure to consistently and adequately implement its own policies prohibiting violence and abuse on the platform,” Amnesty International researcher Azmina Dhrodia, the report’s author, told Gizmodo in an email. “Despite repeated attempts to improve their policies and practices and improve the experience for all users, which are welcome, as a company Twitter has still not taken sufficient steps to effectively and adequately address reports of violence and abuse and implement its own guidelines.”
Amnesty International conducted both quantitative and qualitative research from December 2016 through March of this year. The human rights organization interviewed 86 women and non-binary individuals across the US and UK, conducted a qualitative survey early last year that yielded 162 responses, and had a data scientist use machine learning to trawl for abusive tweets directed at female Members of Parliament. According to the analysis, 25,688 of the 900,223 tweets detected and analyzed between January 1st and June 8th of last year were abusive. Amnesty also conducted an online poll in November of last year across eight countries, including the US and the UK, with a survey sample nationally representative of women in each of those countries.
“The research highlights the particular experiences of violence and abuse on Twitter against women of colour, women from ethnic or religious minorities, lesbian, bisexual or transgender women, non-binary individuals, and women with disabilities, to demonstrate the intersectional nature of abuse on the platform,” Amnesty wrote in the report.
While Twitter’s abuse problems are well-documented, Dhrodia says that what still stands out is the social network’s lack of transparency in how it addresses those problems. Last year, Twitter’s general manager of consumer product and engineering, Ed Ho, wrote in a blog post that the company was “taking action” on 10 times the number of abusive accounts each day than it was at that time the year before. “While this sounds good on paper, it’s meaningless without knowing the baseline data behind it,” Dhrodia said.
Among Amnesty’s recommendations for Twitter was making disaggregated data about harassment reports available to the public, including how many such reports the company receives per year, how many of them it finds to be in violation of its guidelines, and the average amount of time it takes to respond to them. Amnesty published its correspondence with Twitter in the report, which includes the aforementioned suggestions. According to the report, Twitter responded by saying that sharing this data “is not informative” since users often abuse the reporting tools.
“We absolutely agree that context is important when sharing any raw data, but their argument does not hold because there is nothing stopping them from providing that context alongside whatever data they share,” Dhrodia said.
Amnesty notes in the report that as of March 16th of this year, the organization has met in-person with Twitter’s legal and public policy experts three times—in Washington, DC, in May of last year, in San Francisco in February, and in London in March. Amnesty says it has also communicated with Twitter over the phone, via email, and by mail. Twitter responded to the organization’s concerns “but refused to provide the data requested about the reporting process and content moderation,” according to the report. Amnesty says Twitter continued to deny these requests during its in-person meetings this year.
“The assertion that Twitter is consciously unengaged with human rights issues is an unfair representation not just of the facts, but of the ethos of our dedicated teams, and the core mission of the company,” Vijaya Gadde, Twitter’s Legal, Policy, & Trust & Safety Lead, said in a statement. She added that the company agrees with many of the report’s recommendations and is already working on some of them, noting that Twitter has made over 30 changes in the last 16 months and increased its action rates tenfold.
“We have made significant changes to our reporting tools and continue to improve them, as well as working to communicate more clearly with our users on reports and how we draft policy,” she said. “We continue to expand our Transparency Report to include relevant and meaningful data. We have seen extraordinary engagement supporting women. The rise of movements like #MeToo, #WomensMarch, and #PositionOfStrength are testimonies to the power of Twitter as a platform for women and their allies to share stories, offer support, and advocate for change.”
In March, Twitter wrote in a letter to Amnesty that it “cannot delete hatred and prejudice from society.” Dhrodia views this as a way for Twitter to downplay its responsibilities by instead doubling down on society’s wider issues. “We are not asking them to solve the world’s problems,” she said. “We are asking them to take concrete steps to transparently show that they are dealing with the problem of online violence and abuse against women on their own platform.”