Facebook Moderator Says That ‘Wellness Coaches’ Advise Karaoke and Painting for Traumatized Workers



Photo: Drew Angerer (Getty Images)

The Irish Parliament today held a hearing on Facebook's treatment of subcontracted content moderators, the thousands of people up to their eyeballs in toxic waste in the company basement. Moderators have repeatedly reported over the years that their contract companies hurl them into traumatizing work with little coaching or mental health support, in a system designed to stifle their speech.

During the hearing, 26-year-old content moderator Isabella Plunkett said that Facebook's (or the outsourcing firm Covalen's) mental health infrastructure is practically non-existent. "To help us cope, they offer wellness coaches," Plunkett said. "These people mean well, but they're not doctors. They suggest karaoke or painting, but you don't always feel like singing, frankly, after you've seen someone battered to bits." Plunkett added that she'd gotten a referral to the company doctor and never heard back about a follow-up. She also reported that moderators are told to limit exposure to child abuse and self-harm content to two hours per day, but that isn't happening.

Content moderation requires that workers internalize a torrent of horror. In 2017, a moderator told the Guardian:

There was literally nothing enjoyable about the job. You'd go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that's what you see. Heads being cut off.

Last year, Facebook paid out an inconsequential $52 million to contractors to settle a class-action lawsuit filed by a group of moderators suffering from PTSD after exposure to child sexual abuse material, bestiality, beheadings, suicide, rape, torture, and murder. According to a 2019 Verge report on Phoenix-based moderators, self-medicating drug use at work was common at the outsourcing firm Cognizant.

Anecdotally, moderators have repeatedly reported a steep turnover rate; a dozen moderators told the Wall Street Journal that their colleagues typically quit after a few months to a year.

Plunkett has said that she was afraid to speak publicly, a common feeling among moderators. Foxglove, a non-profit advocacy group currently working to improve conditions for content moderators, said in a statement shared with Gizmodo that workers must sign NDAs of which they aren't given copies. In 2019, The Intercept reported that the outsourcing company Accenture pressured wellness coaches in Austin, Texas, to share details of their trauma sessions with moderators. The Verge also reported that Phoenix-based moderators constantly fear retribution by way of an Amazonian point system representing accuracy; employees can appeal demerits with Facebook, but their managers reportedly discouraged them from talking to Facebook, which sometimes reviewed their cases only after they had lost their jobs.

Foxglove told Gizmodo that Irish moderators say the starting salary at Covalen is about €26,000 to €27,000, a little over $30,000 per year. Meanwhile, Facebook software engineers report on LinkedIn that their base salaries average $160,000 per year.

Facebook denied almost all of the above accounts in an email to Gizmodo. "Everyone who reviews content for Facebook goes through an in-depth training programme on our Community Standards and has access to psychological support to ensure their wellbeing," a Facebook spokesperson said. "In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment."

They also said that NDAs are necessary to protect users' data, but it's unclear why that would apply to speaking out about workplace conditions.

Covalen also denied Foxglove's assertion that employees don't receive copies of NDAs, saying that the confidentiality agreements are archived and that HR is "more than happy to provide them with a copy." They also said that they're promoting a "speaking up" policy, encouraging employees to "raise [concerns] through identified channels." So they can speak out, but internally, in designated places. They didn't identify what happens when a moderator speaks out, only that they've "actively listened." Technically, a wellness coach telling you to go to karaoke is listening, but it's not providing any practical aid for post-traumatic stress.

Covalen also said that its wellness coaches are highly qualified professionals with, at minimum, master's degrees in psychology, counseling, or psychotherapy. But it added that employees get access to six free psychotherapy sessions, implying that the 24/7 on-site wellness coach sessions are not actually psychotherapy sessions. Gizmodo has asked Facebook and Covalen for more specificity and will update the post if we hear back.

Given the unfortunate reality that Facebook needs moderators, the company could most obviously improve wellness by easing the relentless exposure to PTSD-inducing imagery. A 2020 report from NYU Stern pointed out that 15,000 people moderate content for Facebook and Instagram, a number woefully inadequate to keep up with the three million posts flagged by users and AI per day. (When asked, Facebook did not confirm its current moderator count to Gizmodo.) The report also cites a 2018 statement from Mark Zuckerberg, who put the number of daily flagged posts at two million; even at that lower figure, at least 133 pieces of content would flash before each moderator's eyes daily. According to The Verge, one moderator might review up to 400 pieces of content per day.

In her testimony, Foxglove co-founder and attorney Cori Crider pointed out that Facebook leans on moderators to keep the business running, yet they're treated as second-class citizens. Crider urged Ireland's Joint Committee on Enterprise, Trade, and Employment to regulate Facebook in order to end the culture of fear, bring contractors in-house, allow moderators to opt out of reviewing harmful content, enforce independent oversight of exposure limits, and offer actual psychiatric resources.

The committee offered their sympathies and well-placed disgust.

"I would never want my son or daughter to do this work," Senator Paul Gavan said. "I can't imagine how horrific it must be. I want to state for the record that what's happening here is absolutely appalling. This is the dark underbelly of our shiny multi-national social media companies."

"It's incredibly tough to hear," Senator Garret Ahearn said of Plunkett's account. "I think, chair, it's important that we do bring Facebook and these people in to be accountable for decisions that they make."

We complain constantly that Facebook needs to do a better job of moderating. It also needs to do a better job of averting foreseeable calamity as it's coming, rather than paying the lawyers and releasing the hounds later.

You can watch the full hearing here and Plunkett speak at a press conference here.




