There’s been a lot of discussion over the ethics of posting violent livestreams on Facebook, and the social media site has decided to release subjective guidelines: if a person posts violent content to “raise awareness,” the video can stay. If someone shares the same video to mock the victim, it will be removed.
This means that videos like that of the shooting of Philando Castile, a 32-year-old black man from Minnesota, will stay up so long as people post solely about the injustice of the incident.
Now, Facebook gets to call all the shots, despite a content moderation team that has often gotten things wrong in the past. The team took down a meme that people posted to “raise awareness” of Stanford rapist Brock Turner, as well as a photo of a famous “Little Mermaid” statue that apparently violated nudity guidelines. Yet it left up a graphic photo of a dead woman who was allegedly stabbed by her boyfriend.
The problems (and potential controversies) surrounding content moderation will become even more delicate when it’s not just nudity that gets moderated, but videos that show death in real time. With Facebook becoming a key source for news and Facebook Live rapidly becoming a tool to broadcast and publicize these brutal incidents, the judgment calls it makes on these videos are going to be more fraught than ever.
[Disclosure: Facebook has launched a program that pays publishers, including the New York Times and Buzzfeed, to produce videos for its Facebook Live tool. Gawker Media, Gizmodo’s parent company, recently joined that program.]