Facebook Moderator Says That 'Wellness Coaches' Advise Karaoke and Painting for Traumatized Workers

Photo: Drew Angerer (Getty Images)

The Irish Parliament today held a hearing on Facebook's treatment of subcontracted content moderators: the thousands of people up to their eyeballs in toxic waste in the company basement. Moderators have repeatedly reported over the years that their contract companies hurl them into traumatizing work with little training or mental health support, in a system that seems designed to keep them from speaking out.

During the hearing, 26-year-old content moderator Isabella Plunkett said that Facebook’s (or the outsourcing firm Covalen’s) mental health infrastructure is practically non-existent. “To help us cope, they offer ‘wellness coaches,’” Plunkett said. “These people mean well, but they’re not doctors. They suggest karaoke or painting – but you don’t always feel like singing, frankly, after you’ve seen someone battered to bits.” Plunkett added that she’d gotten a referral to the company doctor and never heard back about a follow-up. She also reported that moderators are told to limit exposure to child abuse and self-harm to two hours per day, “but that isn’t happening.”

Content moderation requires that workers internalize a torrent of horror. In 2017, a moderator told the Guardian:

There was literally nothing enjoyable about the job. You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.

Last year, Facebook paid an inconsequential $52 million to settle a class-action lawsuit filed by a group of moderators suffering from PTSD after being exposed to child sexual abuse material, bestiality, beheadings, suicide, rape, torture, and murder. According to a 2019 Verge report on Phoenix-based moderators, self-medicating drug use at work was common at the outsourcing firm Cognizant.

Anecdotally, moderators have repeatedly reported a steep turnover rate; a dozen moderators told the Wall Street Journal that their colleagues typically quit after a few months to a year.

Plunkett has said that she was afraid to speak publicly, a common feeling among moderators. Foxglove, a non-profit advocacy group currently working to improve conditions for content moderators, said in a statement shared with Gizmodo that workers must sign NDAs of which they aren't given copies. In 2019, The Intercept reported that the outsourcing company Accenture pressured "wellness coaches" in Austin, Texas to share details of their "trauma sessions" with moderators. The Verge also reported that Phoenix-based moderators constantly fear retribution by way of an Amazonian "point" system representing accuracy; employees could appeal demerits with Facebook, but managers reportedly discouraged them from doing so, and Facebook sometimes reviewed cases only after the workers had already lost their jobs.

Foxglove told Gizmodo that Irish moderators say the starting salary at Covalen is about €26,000 to €27,000, a little over $30,000 per year. Meanwhile, Facebook software engineers report on LinkedIn that their base salaries average $160,000 per year.

Facebook denied almost all of the above accounts in an email to Gizmodo. “Everyone who reviews content for Facebook goes through an in-depth training programme on our Community Standards and has access to psychological support to ensure their wellbeing,” a Facebook spokesperson said. “In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment.”

They also said that NDAs are necessary to protect users’ data, but it’s unclear why that would apply to speaking out about workplace conditions.

Covalen also denied Foxglove's assertion that employees don't receive copies of NDAs, saying that the confidentiality agreements are archived and that HR "is more than happy to provide them with a copy." They also said that they're promoting a "speaking up policy," encouraging employees to "raise [concerns] through identified channels." So they can "speak out," but internally, in designated places. They didn't identify what happens when a moderator speaks out, only that they've "actively listened." Technically, a wellness coach telling you to go to karaoke is listening, but it's not providing any practical aid for post-traumatic stress.

Covalen also said that its "wellness coaches" are "highly qualified professionals" with, at minimum, master's degrees in psychology, counseling, or psychotherapy. But it added that employees get access to six free psychotherapy sessions, implying that the 24/7 on-site "wellness coach" sessions are not actually psychotherapy sessions. Gizmodo has asked Facebook and Covalen for more specificity and will update the post if we hear back.

Given the unfortunate reality that Facebook needs moderators, the most obvious way the company could improve wellness is by reducing the relentless exposure to PTSD-inducing imagery. A 2020 report from NYU Stern pointed out that 15,000 people moderate content for Facebook and Instagram, which is woefully inadequate to keep up with the three million posts flagged by users and AI per day. (When asked, Facebook did not confirm its current moderator count to Gizmodo.) The report cites a 2018 statement on moderation from Mark Zuckerberg, who put the number at two million; even that lower figure would mean at least 133 images flashing before each moderator's eyes daily. According to The Verge, one moderator could review up to 400 pieces of content per day.
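For what it's worth, the per-moderator figures above are simple back-of-the-envelope division; this sketch checks them, using the 15,000 headcount and the two- and three-million flag counts as cited estimates, not confirmed numbers:

```python
# Rough per-moderator load implied by the cited estimates.
MODERATORS = 15_000  # NYU Stern's 2020 estimate for Facebook and Instagram

def daily_load(flagged_posts_per_day: int, moderators: int = MODERATORS) -> int:
    """Posts each moderator would review per day if flags were split evenly."""
    return flagged_posts_per_day // moderators

# Zuckerberg's 2018 figure of two million flagged posts per day:
print(daily_load(2_000_000))  # 133
# NYU Stern's three-million-per-day figure:
print(daily_load(3_000_000))  # 200
```

Either estimate lands in the same ballpark as The Verge's reporting of up to 400 pieces of content per moderator per day, since flags aren't actually split evenly across the workforce.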

In her testimony, Foxglove co-founder and attorney Cori Crider pointed out that Facebook leans on moderators to keep the business running, yet they're treated as "second-class citizens." Crider urged Ireland's Joint Committee on Enterprise, Trade, and Employment to regulate Facebook in order to end the culture of fear, bring contractors in-house, allow moderators to opt out of reviewing harmful content, enforce independent oversight for exposure limits, and offer actual psychiatric resources.

The committee offered their sympathies and well-placed disgust.

“I would never want my son or daughter to do this work,” Senator Paul Gavan said. “I can’t imagine how horrific it must be. I want to state for the record that what’s happening here is absolutely appalling. This is the dark underbelly of our shiny multi-national social media companies.”

“It’s incredibly tough to hear,” Senator Garret Ahearn said of Plunkett’s account. “I think, chair, it’s important that we do bring Facebook and these people in to be accountable for decisions that they make.”

We complain constantly that Facebook needs to do a better job of moderating. It also needs to do a better job of averting foreseeable calamity as it's coming, rather than paying the lawyers and releasing the hounds later.

You can watch the full hearing here and Plunkett speak at a press conference here.

Staff reporter, Gizmodo. wkimball @ gizmodo

DISCUSSION

CaptainObvious7

Well, what did they expect? Watching the ugliest shit on the internet and moderating it is not good for your mind. Not sure even a psychiatrist would be helpful there. What are they going to do, pump them full of drugs?

There are probably people who can (or think they can) cope with this shit and wash it down in the evening with a bottle of vodka. But I am not sure about the long-term effects, even for these tough guys.

It is basically like going into a psychological warzone for entry-level pay.

But then we put AI to work on it, and it leaves the ugliest shit online and bans breast cancer awareness campaigns because nipple.