Photo: AP

Soon after Mike Schroepfer was promoted to Facebook CTO, he began recruiting top neural network researchers to build up the company’s AI capabilities. At the time, he and CEO Mark Zuckerberg wanted to use machine learning to focus on areas like facial recognition, targeted ads, language translation, and chatbots.

But according to a New York Times report on the state of Facebook’s AI efforts, the company’s AI research and Schroepfer’s role began to shift sometime in 2015. After the Paris terrorist attacks, in which 130 people were killed, Zuckerberg reportedly enlisted Facebook’s Applied Machine Learning team to find ways to combat terrorist propaganda. Since then, Facebook has only become more toxic, and the world has become more aware of the threats it poses to society and democracy.


By the middle of 2017, the team was primarily focused on keeping “toxic content” off of the site.

Cleansing Facebook of toxic content is an impossible task, but it’s one the company created for itself when it opened the sewage floodgates with decisions like becoming a de facto publisher and launching a livestream platform that lets people share in a “raw personal spontaneous way.” It’s a problem exacerbated by Facebook’s insistence on being as big as possible and everything to everyone. It’s a problem that will never, ever go away.

And now, the CTO “is in a position he never wanted to be in,” according to the Times. He set out to build one of Silicon Valley’s top AI labs. “But along the way, his role evolved into one of threat removal and toxic content eliminator,” the Times states.


It’s like if you were hired to create a dream team to figure out how to transform your company and prepare it for the future—but instead, you and your team took over janitor duties. And instead of dealing with human waste, you were cleaning up the dregs of humanity as they destroy the fabric of society.

That role seems to have taken a toll on the CTO.

As Schroepfer was trying to explain the challenges of purging toxic waste from Facebook, a reporter mentioned the livestream video that the Christchurch shooter posted on Facebook during his attack on two mosques that killed 51 people. The original video was on Facebook for about an hour, but the footage quickly spread throughout the site. After being asked about the incident, “Schroepfer went quiet” and “his eyes began to glisten,” according to the Times.


It took a minute for Schroepfer to respond. He attempted “to remain composed” as he said his team is working on it. “It won’t be fixed tomorrow. But I do not want to have this conversation again six months from now. We can do a much, much better job of catching this,” Schroepfer told the Times.

The Times assessed that Facebook granted the newspaper access to Schroepfer to show that AI is helping the company clean the site, while also “humanizing its executives.”

That humanization seems to have involved weeping. As the Times states:

In two of the interviews, he started with an optimistic message that A.I. could be the solution, before becoming emotional. At one point, he said coming to work had sometimes become a struggle. Each time, he choked up when discussing the scale of the issues that Facebook was confronting and his responsibilities in changing them.


Apparently, people who work with Schroepfer are aware of the toll that his job has taken on him.

“I don’t think I’m speaking out of turn to say that I’ve seen [Schroepfer] cry at work,” Jocelyn Goldfein, a Zetta Venture Partners venture capitalist who worked with Schroepfer at Facebook, told the Times.

Almost makes you feel bad for the guy. Until you think about the Cambridge Analytica scandal, how Facebook has often shared its users’ data without telling them, and how the platform played a crucial role in the genocide in Myanmar and Russia’s interference in the U.S. presidential election.


And of course, it’s not just the people behind the AI who are dealing with this problem. Others have been working to clean up the toxicity too—the moderators overseas and in the U.S. who are reportedly suffering from PTSD from the horrors they see. Schroepfer will be just fine, folks.