New York Attorney General Letitia James wants to criminalize online homicide videos and revise Section 230 of the Communications Decency Act to hold tech platforms liable for propagating violent livestreams.
James hopes those proposals, which would almost certainly face legal scrutiny, could help prevent a repeat of the racially motivated Buffalo, New York, supermarket shooting that left 10 people dead. Rather than a one-off case of evil, James said the Buffalo shooter was “part of an epidemic of mass shootings often perpetrated by young men radicalized online.”
Those proposals were part of a 49-page report released this week by the Attorney General’s office following a months-long investigation into what role social media and other web platforms played in the Buffalo shooting. In the report, James specifically singles out livestreaming platforms, which she says have “become a tool of mass shooters to instantaneously publicize their crimes.” These livestreams of shootings, James argues, amount to “an extension of the original criminal act.” The Buffalo shooting occurred three years after another killer similarly streamed his horrific rampage at a pair of mosques in Christchurch, New Zealand, leaving 51 people dead. That shooter’s platform of choice was Facebook Live.
Twitch, where the Buffalo shooting was originally streamed, removed the video in under two minutes, but other, more fringe platforms took much longer to act. Moving forward, James recommended new restrictions on livestreaming, including verification requirements for streamers, a restriction on algorithmic promotion of livestreams, and a so-called “tape delay” for users who fail to meet certain trust criteria.
“The future of livestreaming needs to grapple with how this service has been used to broadcast these acts of terror, becoming an extension of the criminal act, further terrorizing the targeted community and serving to promote the shooter’s ideology,” the report reads.
Though the report acknowledges improvements in platforms’ response times since the Christchurch massacre, it goes on to say even a few minutes of inaction offers a window to potentially spread a terrorist’s message and inspire future shooters. Complicating things, alternative “fringe” platforms like 4chan, which aren’t tied to the same straitlaced image as Twitch, are less inclined to proactively remove this type of extremely violent content, the report notes, primarily because not doing so isn’t actually illegal.
James wants to change that. In the recommendations section of the report, the AG suggests officially criminalizing videos of a homicide, and adding new civil penalties for the distribution and transmission of that content. Those penalties could include new liabilities for platforms that “fail to take reasonable steps to prevent unlawful violent criminal content from appearing on the platform.” The New York AG pointed to the tech industry’s efforts to remove child sexual abuse material as a potential model for how it could stem the tide of extremely violent material.
The report quickly reins those recommendations in a tad for all you First Amendment scholars out there. For starters, the AG report says any laws criminalizing homicide videos should avoid issuing penalties for videos with historical, educational, or societal benefits. Penalties also shouldn’t apply to bystanders filming a murder or to police body camera footage.
Whether you agree with them or not, James’ calls to hold tech firms liable for hosting violent content appear destined to clash with Section 230 of the Communications Decency Act. In short, that provision shields firms from being held legally liable for content their users upload while also granting them the ability to moderate their platforms as they see fit.
Acknowledging that impasse, James called on members of Congress to revise Section 230 to hold platforms accountable. That sentiment is shared by a variety of lawmakers ranging from Democratic Senator Amy Klobuchar to President Joe Biden and even former President Donald Trump. At the same time, broad coalitions of First Amendment scholars and digital rights activists have long opposed efforts to weaken what’s considered the bedrock of internet policy.
“In practice, creating additional hoops for platforms to jump through in order to maintain their Section 230 protections would almost certainly result in fewer opportunities to share controversial opinions online, not more,” The Electronic Frontier Foundation wrote back in 2018. “Under Section 230, platforms devoted to niche interests and minority views can thrive.”
The report expands beyond livestreams, however, and draws a direct connection between shootings and social media. James says it’s “hard to ignore the correlation” between recent rises in mass shootings and the “prevalence of online platforms where racist ideology and hate speech flourish, in some cases by design.”
“The tragic shooting in Buffalo exposed the real dangers of unmoderated online platforms that have become breeding grounds for white supremacy,” James said in a statement. “Extremist content is flourishing online, and we must all work together to confront this crisis and protect our children and communities.”
Gun violence and senseless shootings have left many in the U.S. rattled. Recent polling conducted by the Pew Research Center found nearly a third of K-12 parents said they were either extremely or very afraid of a shooting occurring at their child’s school. A majority (63%) of parents surveyed said they thought improving mental health screening and treatment would be very or extremely effective at preventing shootings. That approach garnered stronger support than calls to add police officers in schools or ban assault-style weapons.
Those fears around gun violence could play an important role in the upcoming 2022 midterm elections. In a separate Pew poll from August, registered U.S. voters cited gun policy as a top voting issue above all other issues except the economy.