Last week, a manipulated video of Nancy Pelosi went viral, slowed down so the House Speaker’s words were slurred, giving the appearance of intoxication. Facebook left the video up, albeit alongside a message indicating that the video was manipulated, and lowered its reach. On Wednesday, Pelosi made her feelings about the social network known in an interview with KQED News. “I think [Facebook] have proven — by not taking down something they know is false — that they were willing enablers of the Russian interference in our election,” she said.
In the interview, Pelosi doubled down on her belief that Facebook wasn’t a blameless party during the foreign election interference in 2016, pointing to the fact that the company kept the falsified video of her up on the platform even though it knew the video was manipulated. By the time Facebook had added the related-article notice and limited the video’s reach, it had about 2.5 million views.
“Because right now they’re willing to put something on it that they know to be false, and I think it’s wrong,” Pelosi told KQED. “Again, I can take it. But who wants to get into the arena? If something like Facebook, with all the power that they have, would take something that they know is false? They’re lying to the public.”
According to a report from MarketWatch, two days before the falsified Pelosi video made the rounds, Facebook held an internal policy meeting to discuss whether the company needed a new policy to address such manipulated media. There were reportedly about 60 executives and employees at the May 21 meeting, where staffers proposed that the company consider a policy extending to “threats presented by emerging technologies,” MarketWatch reported, specifically raising deepfakes. The staffers reportedly said during the meeting that fake photos and videos can be a catalyst for political disruption.
But the Pelosi video wasn’t a deepfake (a technique that uses machine learning to create an ultrarealistic fake video); it was simply a video slowed down by about 25 percent, with her pitch adjusted to mask the change. A policy specifically targeting deepfakes wouldn’t even apply to the Pelosi video, and it’s unclear whether one covering “emerging technologies” would apply to media altered with simple existing tools.
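To see why such a simple edit needs a pitch correction at all, a quick back-of-the-envelope calculation helps. Assuming the roughly 25 percent slowdown the reporting describes (i.e., playback at 75 percent of original speed — the exact figure is approximate), naively stretching the audio lowers its pitch by a predictable amount, which editing tools then compensate for:

```python
import math

# Sketch: a clip played at 75% of its original speed (an assumed figure
# matching the ~25% slowdown described above).
speed = 0.75

# Duration stretches by the inverse of the speed: a 60-second clip
# becomes about 80 seconds long.
duration_factor = 1 / speed

# Without correction, the audio pitch drops by the same ratio;
# expressed in musical semitones (12 per octave), that is noticeable.
pitch_shift_semitones = 12 * math.log2(speed)

print(f"duration factor: {duration_factor:.3f}")       # ~1.333
print(f"uncorrected pitch shift: {pitch_shift_semitones:.2f} semitones")
```

The roughly five-semitone drop is why a crude slowdown alone sounds obviously wrong, and why the video's creator also adjusted the pitch — no machine learning required.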
The question of whether such altered content should be allowed on the platform at all is a nuanced one. What is clear is that when faced with the knowledge that a confirmed fake video was going viral on its platform, one that could certainly cause political disruption (both President Trump and Rudy Giuliani signal-boosted the Pelosi video), Facebook decided to let it stay up. And Pelosi is casting doubt on the company’s innocence, pointing to its complicity in planting the seeds for past and current public manipulation.
We have reached out to Facebook on whether the company will update or add a new policy around manipulated media on its platform and will update this post when we receive a reply.