On Wednesday, Philando Castile, a 32-year-old Minnesota man, was shot by police during a traffic stop. The aftermath of the fatal shooting was captured live on video and posted to Facebook by a woman identified as Castile’s girlfriend. Soon after it went viral, the footage, which depicts Castile slumped over in his seat and covered in blood, as well as the woman’s four-year-old daughter, disappeared. After about an hour—and complaints from various people on and off the platform—Facebook restored the footage and blamed its removal on a “technical glitch.”
Facebook regularly gets things wrong when deciding what to take down and what to leave up, but this time its excuse firmly pins the blame on technology rather than on its content moderation team.
“We’re very sorry that the video was temporarily inaccessible. It was down due to a technical glitch and restored as soon as we were able to investigate,” a company spokesperson said in an email to Gizmodo.
Facebook’s wording is notably different from previous statements in which the company apologized for wrongly removing things. Here’s the statement it provided when it removed a meme criticizing Stanford rapist Brock Turner:
“This content was removed in error, and we are currently working to restore it. Our team processes millions of reports each week, and we sometimes get things wrong. We’re very sorry about this mistake.”
Besides the Brock Turner meme, the company has also apologized for removing a widely shared post displaying solidarity in the wake of the Orlando shooting and a photo of a naked father with his son.
Facebook is right that it deals with millions upon millions of posts, and that mistakes happen. But it remains unclear what circumstances surrounded the Philando Castile video’s removal, and how this “glitch” differs from previous “error[s].” The company didn’t respond to Gizmodo’s request for clarification.
As Motherboard noted earlier today, how Facebook deals with this content is vital. It’s used as a news source by millions of people, and with that kind of audience, it can’t afford to screw up and then offer the vague, unconvincing explanation of a “technical glitch.” The company has made it clear that it wants to become a destination for news; it’s only natural that users will wonder why certain pieces of content are allowed while others mysteriously disappear.
Again, this isn’t new. Facebook has consistently come under fire for its moderation tactics—removing a photo depicting childbirth, a trailer featuring photos of topless Aboriginal women, and even a photo of a bronze statue of the Little Mermaid, yet letting photos of a murdered woman remain accessible for 36 hours.
With this purported explanation of technical difficulties, Facebook appears to be testing out a new blame-game narrative. Ultimately, however, it’s more of the same—a relatively meaningless explanation for continual fuck-ups.