EU The horrors experienced by Meta moderators: ‘I didn’t know what humans are capable of’ - A court ruling in Spain that attributes a content moderator’s mental health problems to the job paves the way for acknowledging that fact for at least 25 other employees as well



The Glòries tower in Barcelona, Spain, where Meta's subcontractor for content filtering is located.

After having to repeatedly watch numerous videos of suicides, murders, dismemberment and rapes, he had a panic attack and asked for help. This employee worked as a content moderator at a company that serves Meta, which owns Facebook, Instagram and WhatsApp. He was told to go to the fun floor: a large game room located in Barcelona’s Glòries tower, where the content moderation offices for the Californian tech giant are located. He sat staring blankly at a ping pong table. The fun floor didn’t help him at all. On another occasion, two hours after he had another panic attack, his boss gave him permission to go see a company psychologist. She was on another floor, the psychology floor. He spent over half an hour talking to her, getting it all out. When he finished, she told him that his work was very important to society, that they were all heroes, that he should be stronger. And that their time was up.

Content moderators are tasked with keeping walls and feeds clean and peaceful on Facebook and Instagram. Millions of people use these platforms every day, unaware that this dark side exists. These workers are the ones who decide whether fake news or photos that do not comply with Meta’s policy stay up or come down. But they are also the ones who have to deal with the most brutal content: viewing it, evaluating it, censoring it and, if necessary, sending it to the police.

In 2018, the company CCC Barcelona Digital Services moved into the Glòries tower. The announcement was very well received by the Catalan authorities: the tech giant’s subcontractor joined the ranks of innovative companies based in Barcelona and occupied part of a building that had just lost its bid to host the headquarters of the European Medicines Agency.

The company began to hire people — primarily young foreigners who spoke several languages — to moderate content from different markets. Last October, an investigation by La Vanguardia exposed the conditions under which these moderators work. Before that, Catalonia’s Labor Inspectorate launched an investigation in 2021, and the following year imposed a fine of over €40,000 ($43,314) on the company for deficiencies in the evaluation and prevention of psychosocial risks in the workplace. In 2020, the company was acquired by Canada’s Telus International, which claims that the accusations are false and that it has sufficient safety measures in place.

This worker started there in 2018 and stayed until 2020, when he obtained medical leave for his mental health problems. The company and the mutual insurance company classified it as a common illness. “We then requested a change of contingencies, because his case fit squarely within the definition of an occupational accident. The National Institute of Social Security agreed with us, and the company appealed, which triggered the legal process,” explains Francesc Feliu, a partner in the law firm Espacio Jurídico Feliu Fins, which specializes in healthcare issues.

On January 12, the 28th Labor Court of Barcelona rejected the company’s claim and ruled that the worker’s sick leave should be classified as a workplace accident. This is the first judgment to recognize that the mental illness a content moderator suffers is caused by his work. “Work-related stress is the sole, exclusive and unquestionable trigger” of the disorders, says the ruling, which can still be appealed. Feliu represents some 25 other workers who are waiting for their illnesses to be recognized as occupational accidents, and in October he also filed a criminal complaint against the company, alleging a lack of safety measures.

The worker has requested anonymity because he is subject to strict confidentiality agreements, and he prefers not to talk about how he feels or about very personal issues, because the wounds left by this work are still fresh. He is having a hard time with the news coverage of the ruling, which makes him relive what he saw. “But at least this is encouraging more people to seek justice,” he notes.

When he started working at the company, he had no idea of the violence of the videos he would have to watch. “They told me, but superficially, and then when you start you see that things are much, much worse...,” he says. The lawyer explains that the work is well paid (about €2,400 [$2,600] gross per month, although there are salary differences between workers responsible for different markets, something that another office has also taken to court), no experience or training is required, and young foreigners are attracted to it: “They say, ‘Look, how cool, I’ll work for Meta,’” explains Feliu. The affected worker points out that the illusion doesn’t last long: “People are not at all aware of what is going on. Before I worked there, I assure you that I didn’t know what humans were capable of.”

The workers suspect that they were training AI

Feliu explains that at that time — “the conditions may have changed now,” he says — the content moderators with the best efficiency scores (as determined by a monthly worker evaluation) were placed in a high-priority section. That is, they continued to receive videos, photos and posts of all kinds in which suicides and terrorist acts appeared.

That was the section where the worker in this case was assigned: “Constantly seeing this makes you more sensitive to everything. After a while I couldn’t even look at a suicide letter,” he explains. Moderators had to strictly follow Meta’s policy, and often watch the videos to the end, several times with different moderators. “For example, you had to keep watching a live video of someone explaining that they wanted to commit suicide, and you couldn’t delete it or alert the police if you didn’t see something in the scene that suggested suicide: a gun, an open window... Sometimes they would suddenly pull out the gun and shoot themselves, without you being able to do anything,” he laments.

To remove a video, they had to spell out the decision: “You had to rate the video by the worst thing that happened [in it], according to a scale. If the video started with some kind of violence, you had to wait for something more serious, like murder, dismemberment or sexual abuse, to rate it as the most serious. If the most serious violence appeared right at the start, the system would let you delete it then.”

This procedure made the workers suspicious. “If you already see that something is violent at second 10, why do you have to wait? You come to the conclusion that what they are doing is training artificial intelligence (AI), that [workers] are cannon fodder,” says Feliu. When asked about this, a spokesman for the subcontractor does not clarify whether such a project exists and refers all questions to Meta.

The company employs some 2,000 people, after cutbacks at Meta led to the subcontractor’s workforce being reduced through layoffs last year. The works council has not responded to questions from this newspaper, and the company has appealed the ruling. In a statement, Telus explains that “thanks to the comprehensive wellness program,” in December of last year, it had reduced sick leave to 14% of the workforce, and that only “between 1% and 2%” were work-related mental health leaves.

The company claims that it has engaged outside medical support, that its team has a range of counselors available 24 hours a day, that workers can call for breaks and emergency sessions whenever they see disturbing content, and that it has technology to blur videos or turn off the sound if necessary. “Any suggestion that employees are constantly exposed to disturbing content for eight hours a day is false,” the note says, adding that its workers’ well-being is a priority. At the trial, the company denied that there was a link between the employee’s mental illness and his work, arguing that he had seen a psychologist when he was 16 years old.

The worker explains that when he was with the company, there were timed five-minute breaks every hour, during which time he could not go outside to get some fresh air because just going down in the elevator used up the allotted time. The lunch break was 20 minutes, and workers had activities such as yoga sessions and games, “but [there was] no specific follow-up” for employees who evaluated some 400 pieces of content every day.

In addition, the rotating schedules — one week in the morning, one week in the afternoon, one week in the evening, one week at night — disturbed their rest, “which was already difficult because of the nightmares.” Feliu says that “25% of people were systematically on sick leave, plus all those who left the job before taking sick leave.” The attorney believes that this ruling, and the cases still to come, will help make the company change things: “Content moderators are essential for social media, but so are their [working] conditions.”




Cry me a river, janny.

 
i'm going to take a controversial stance here and sympathize with the workers. I hear the job title "content moderator" and I expect to be dealing with a lot of niggerposting and some nudes cleanup. Especially because of how censored facebook is, I don't expect to witness live suicides or be asked to make arbitrary decisions about whether to get involved.

this job is fucked up. It ruins hardened police officers, it ruins EMS workers, it ruins soldiers to constantly see death and human despair. Even the average gangbanger or cartel hitman can't kill without getting high first. It requires a lot of desensitization that changes you for life to look at this stuff and feel nothing. It shouldn't be legal to burden someone with that for call center wages.
 
his boss gave him permission to go see a company psychologist. She was on another floor, the psychology floor. He spent over half an hour talking to her, getting it all out. When he finished, she told him that his work was very important to society, that they were all heroes, that he should be stronger. And that their time was up.
I can't be the only one who thinks that sounds like the psychologist was explicitly told by the company to say this to employees. Something real sketchy is going on there

That said, the simpsons predicted facebook jannies yet again:
Though I suspect the one with the gun will actually be a pooner
 
i'm going to take a controversial stance here and sympathize with the workers. I hear the job title "content moderator" and I expect to be dealing with a lot of niggerposting and some nudes cleanup. Especially because of how censored facebook is, I don't expect to witness live suicides or be asked to make arbitrary decisions about whether to get involved.

this job is fucked up. It ruins hardened police officers, it ruins EMS workers, it ruins soldiers to constantly see death and human despair. Even the average gangbanger or cartel hitman can't kill without getting high first. It requires a lot of desensitization that changes you for life to look at this stuff and feel nothing. It shouldn't be legal to burden someone with that for call center wages.
I'm not saying that stuff isn't horrific and takes a toll on people's brains. My point is that they had to be told at some point in the job interview or training process that they would see videos of murders, rapes, child abuse, suicides, etc. Even if they weren't, the internet and Facebook have been around for a long time and things like Ogrish and Motherless and countless cartel killing videos are out there, so it shouldn't have been that big of a shock if they were exposed to it. But the other piece is the janny could quit at any time. Get to the end of the first video of a guy suck starting a shotgun and just say "nope. That's enough." and walk out the door.

But even if Meta wasn't a factor, and our janny saw a suicide off of a building, or a murder in a 7-11, or a stunt gone horribly wrong, or a deadly sports injury, or even some kid walking across the street and hit by a car, or was a member of a jury and it was too much to deal with, then go get help. It isn't like there aren't therapists aplenty out there. But this way they get to not only make Meta foot the bill for it but also sue them for the emotional distress that went along with a job the janny willingly signed up for.

Or maybe I'm too jaded and cynical. That's a possibility too.
 
So does no one go to heavy-r anymore or whatever the fuck is the thing now?

I'm more surprised that THE mental illness that APA likes peddling to people is just ADD and autism.
Then again you can't ritalin/xanax away trauma apathy.
 
That's A LOT for Spain, especially considering it's a no-skill nothing job.
Who would have thought being a desensitized no life bestgore rekt connoisseur could be so profitable.
$31,000 per year is a lot in Spain? It's $15 an hour at 40 hours per week.

Even if they weren't, the internet and Facebook have been around for a long time and things like Ogrish and Motherless and countless cartel killing videos are out there, so it shouldn't have been that big of a shock if they were exposed to it.
Watching a horrible thing when you are curious about it, and watching horrible things over and over, 8 hours a day, 40 hours a week, even if you don't want to, are not in the same ballpark. Warning people that they might see things they can't even imagine is not appropriate preparation.

I guarantee you this orientation never covered "if you don't detect immediate threats of suicide fast enough, you might witness a live suicide and have to forever live with the guilt of wondering if you could have saved them if you had been faster."

They're being put through this by the company. I know they accept the job role, but it pays $15 per hour. You can't just subject workers to physical dangers from an unsafe work environment and say "fuck you, you took the job, you could have quit" afterwards when they get injured. This is essentially an OSHA issue if you think about it. If they were saying this about having to read mean words, I would agree with you. But constantly seeing gore and suicide is well understood to cause psychological damage.
 
how come these people are the ones that end up doing this job instead of the kaotic/rekt thread-loving 'teens that have been sending hd footage of naked chinese men falling dick first into an industrial rock crusher to each other since they were 12
Well you see, those people may have a skewed vision of what is truly disturbing and violent, and that would hamper the ML algorithms they were trying to train. Innocent souls MUST be sacrificed in order to automate more people out of jobs. I'm actually more certain than ever that this is the case. If the job was just about hitting the delete key, they wouldn't bother with some kind of sliding scale for content rating.
 
Even FBI creeps who have to watch this stuff get rotated off to other roles, and they aren't stuck in some dystopian viewing booth 40 hours per week just watching it, they have other things to do too. And expensive shrinks keeping an eye on them, not some halfwit "counselor" whose job is to say lol stfu.

Maybe Zuckerface is trying to turn these guys psychotic for some next level scheme.
 
The police who have to review CSA material to determine that yep, that really is CSA are reviewed and debriefed very regularly, and are rotated out of the job after a fixed period of time. They still have a high rate of PTSD related illnesses.

You can see things you can't forget, and seeing them 40 hours a week for months or years on end does a number on the brain.
 