The horrors experienced by Meta moderators: ‘I didn’t know what humans are capable of’ - A court ruling in Spain that attributes a content moderator’s mental health problems to the job paves the way for acknowledging that fact for at least 25 other employees as well



The Glòries tower in Barcelona, Spain, where Meta's subcontractor for content filtering is located.

After having to repeatedly watch numerous videos of suicides, murders, dismemberment and rapes, he had a panic attack and asked for help. This employee worked as a content moderator at a company that serves Meta, which owns Facebook, Instagram and WhatsApp. He was told to go to the fun floor: a large game room located in Barcelona’s Glòries tower, where the content moderation offices for the Californian tech giant are located. He sat staring blankly at a ping pong table. The fun floor didn’t help him at all. On another occasion, two hours after he had another panic attack, his boss gave him permission to go see a company psychologist. She was on another floor, the psychology floor. He spent over half an hour talking to her, getting it all out. When he finished, she told him that his work was very important to society, that they were all heroes, that he should be stronger. And that their time was up.

Content moderators are in charge of leaving walls and feeds clean and peaceful on Facebook and Instagram. Millions of people use these platforms every day and are unaware that this dark side exists. These workers are the ones who decide whether or not to publish fake news or photos that do not comply with Meta’s policy. But they are also the ones who have to deal with the most brutal content: viewing it, evaluating it, censoring it and, if necessary, sending it to the police.

In 2018, the company CCC Barcelona Digital Services moved into the Glòries tower. The announcement was very well received by the Catalan authorities, as the subcontractor of this large technological company came to swell the ranks of innovative companies located in Barcelona, and to occupy part of a building that had just lost the opportunity to house the headquarters of the European Medicines Agency.

The company began to hire people — primarily young foreigners who spoke several languages — to moderate content from different markets. Last October, an investigation by La Vanguardia exposed the conditions under which these moderators work. Before that, Catalonia’s Labor Inspectorate launched an investigation in 2021, and the following year imposed a fine of over €40,000 ($43,314) on the company for deficiencies in the evaluation and prevention of psychosocial risks in the workplace. In 2020, the company was acquired by Canadian Telus International, which claims that the accusations are false and that they have sufficient safety measures in place.

This worker started there in 2018 and stayed until 2020, when he obtained a medical leave for his mental health problems. The company and the mutual insurance company classified it as a common illness. “We then requested a change of contingencies, because his case fit perfectly with that of occupational accident. The National Institute of Social Security agreed with us, and the company appealed, which triggered the legal process,” explains Francesc Feliu, a partner in the law firm Espacio Jurídico Feliu Fins, which specializes in healthcare issues.

On January 12, the 28th Labor Court of Barcelona rejected the company’s claim and ruled that the worker’s sick leave should be classified as an accident at work. This is the first judgment that recognizes that the mental illness a content moderator suffers is caused by his work. “Work-related stress is the sole, exclusive and unquestionable trigger” of the disorders, says the ruling, which can still be appealed. Feliu has some 25 other workers who are waiting for their illness to be recognized as an occupational accident, and in October he also filed a criminal complaint against the company, alleging the lack of safety measures.

The worker has requested anonymity, because he is subject to strict confidentiality agreements, and he prefers not to talk about how he feels or about very personal issues, because the wounds left by this work are still fresh. He is having a hard time with the news coverage of the ruling, which makes him relive what he saw. “But at least this is encouraging more people to seek justice,” he notes.

When he started working at the company, he had no idea of the violence of the videos he would have to watch. “They told me, but superficially, and then when you start you see that things are much, much worse...,” he says. The lawyer explains that the work is well paid (about €2,400 [$2,600] gross per month, although there are salary differences between workers who are responsible for different markets, over which another office has also gone to court), no experience or training is required, and young foreigners are attracted to it: “They say, ‘look, how cool, I’ll work for Meta,’” explains Feliu. The affected worker points out that the illusions don’t last long: “People are not at all aware of what is going on. Before I worked there, I assure you that I didn’t know what humans were capable of.”

The workers suspect that they were training AI

Feliu explains that at that time — ”the conditions may have changed now,” he says — the content moderators with the best efficiency scores (as determined by a monthly worker evaluation) were placed in a high-priority section. That is, they continued to receive videos of all kinds, photos and posts in which suicides and terrorist acts appeared.

That was the section where the worker from the case worked: “Constantly seeing this makes you more sensitive to everything. After a while I couldn’t even see a suicide letter,” he explains. You had to strictly follow Meta’s policy, and often watch the videos to the end, several times with different moderators. “For example, you had to keep watching a live video of someone explaining that they wanted to commit suicide, and you couldn’t delete it or alert the police if you didn’t see something in the scene that suggested suicide, a gun, an open window.... Sometimes they would suddenly pull out the gun and shoot themselves, without you being able to do anything,” he laments.

To remove a video, they had to spell out the decision: “You had to rate the video by the worst thing that happened [in it], according to a scale. If the video started with some kind of violence, you had to wait for something more serious, like murder, dismemberment or sexual abuse, to rate it as the most serious. If the most serious violence came up at first, the system would let you delete it.”

This procedure made the workers suspicious. “If you already see that something is violent at second 10, why do you have to wait? You come to the conclusion that what they are doing is training artificial intelligence (AI), that [workers] are cannon fodder,” says Feliu. When asked about this, a spokesman for the subcontractor does not clarify whether such a project exists and refers all questions to Meta.

The company employs some 2,000 people, after cutbacks at Meta led to the subcontractor’s workforce being reduced through layoffs last year. The works council has not responded to questions from this newspaper, and the company has appealed the ruling. In a statement, Telus explains that “thanks to the comprehensive wellness program,” in December of last year, it had reduced sick leave to 14% of the workforce, and that only “between 1% and 2%” were work-related mental health leaves.

The company claims that it has engaged outside medical support, that the team has a range of counselors available to them 24 hours a day, can call for breaks and emergency sessions whenever workers see disturbing content, and has technology to blur videos or turn off the sound, if necessary. “Any suggestion that employees are constantly exposed to disturbing content for eight hours a day is false,” the note says, adding that its workers’ well-being is a priority. At the trial, the company denied that there was a link between the employee’s mental illness and his work, arguing that he had seen a psychologist when he was 16 years old.

The worker explains that when he was with the company, there were timed five-minute breaks every hour, during which time he could not go outside to get some fresh air because just going down in the elevator used up the allotted time. The lunch break was 20 minutes, and workers had activities such as yoga sessions and games, “but [there was] no specific follow-up” for employees who evaluated some 400 pieces of content every day.

In addition, the rotating schedules — one week in the morning, one week in the afternoon, one week in the evening, one week at night — disturbed their rest, “which was already difficult because of the nightmares.” Feliu says that “25% of people were systematically on sick leave, plus all those who left the job before taking sick leave.” The attorney believes that the court ruling, and the cases still to come, will help push the company to change things: “Content moderators are essential for social media, but so are their [working] conditions.”




Cry me a river, janny.

 
Makes you wonder where they put the people that yeet your account for saying the 2020 election had a few irregularities.
Powerlevel but I was a janny in the past and they have a specific department for that stuff full of people that only work with content that questions the elections. Not all jannies have to watch the most vile shit imaginable, I was one of the ones who dealt with stuff that wasn't allowed but was harmless. These people are generally very petty and gloat about taking down stuff they don't like/regret they can't technically take down stuff they disagree with because it doesn't violate the policies. If they think it's too hard to manage, they can find another job. When I was working as that I had a lot of free time to take my mind off the more traumatizing shit and we had a psychologist ready to deal with job related problems. Sucks that someone has to watch all the cp, but there are plenty of jannies who act like reddit jannies except they get paid for it and all they do is deal with people saying nigger or questioning elections integrity. Leaving the best they could do, it don't pay good.
 
i'm going to take a controversial stance here and sympathize with the workers. I hear the job title "content moderator" and I expect to be dealing with a lot of niggerposting and some nudes cleanup. Especially because of how censored facebook is, I don't expect to witness live suicides or be asked to make arbitrary decisions about whether to get involved.

this job is fucked up. It ruins hardened police officers, it ruins EMS workers, it ruins soldiers to constantly see death and human despair. Even the average gangbanger or cartel hitman can't kill without getting high first. It requires a lot of desensitization that changes you for life to look at this stuff and feel nothing. It shouldn't be legal to burden someone with that for call center wages.

To add to this, once a person develops PTSD after being subject to repeated traumatic imagery, it causes changes to the sympathetic nervous system, fight or flight response. After this happens, subsequent images can internally trigger the same feelings as if someone were lunging at you with a knife.
 
To add to this, once a person develops PTSD after being subject to repeated traumatic imagery, it causes changes to the sympathetic nervous system, fight or flight response. After this happens, subsequent images can internally trigger the same feelings as if someone were lunging at you with a knife.
Mirror neurons. We know quite a bit about them now, but admitting that what we watch affects our brains has uncomfortable implications for free will, free expression, etc.
 
At the trial, the company denied that there was a link between the employee’s mental illness and his work, arguing that he had seen a psychologist when he was 16 years old.
When he finished, she told him that his work was very important to society, that they were all heroes, that he should be stronger.
Psychologists are subhuman. Do not go to a psychologist. Do not send your child to a psychologist. (Anyone who willingly goes to a psychologist should lose rights, but the poor kid here was 16, the parents probably sent him, he had no choice.)

Do not say you're crazy UNLESS you're crazy enough, AND sane enough, to think surrendering to The Man is better for you. Especially do not say it because your insurance pays for muhmentalhealth and you "might as well" use it because it's "free".

---
So do they want to work in it or not? You can't complain that you have mental damage from a job and then complain that there are plans to automate the job. This kind of shit really highlights how those people want to feel like victims for the job of watching videos for eight hours and pushing a button, while already being coddled
This. It's good when risky and harmful jobs get automated. These people are grifters (although I don't mind them trying to milk Zuck, you do you, jannies).
 
Mirror neurons. We know quite a bit about them now, but admitting that what we watch affects our brains has uncomfortable implications for free will, free expression, etc.

I think you’re pointing out the function of a civilized society and why people in general feel it needs to exist for various reasons.

For example, I like waving my dick around in traffic.

Yet society says that’s “a crime.”

So I can come to this site and post videos of me waving my dick around to all of you and you’ll defend my right to free expression.

We all win, except for, specifically, all of you.

Of course the problem arises when another dude who likes offending society comes along and starts posting my dick waving videos in Facebook groups for rape victims.

To at least one person out there, who’s been the victim of a dick wave that gave them PTSD, my fun is their hell, even if I was just waving my dick around to all of you and had no intention of waving my dick on Facebook.

So all we are left to criticize is the guy who wants to offend unintended viewers.

Are that guy’s rights to show my dick waving videos to others more important than a person’s right not to look at my dick waving around?

Does freedom of expression give one person the right to traumatize another person and if we disregard the rights of that person then are we not just violators of other people’s rights, in the pursuit of our own?

Doesn’t that mean that basically everyone on both sides are just hypocrites?

As we all know, the worst thing is the hypocrisy.

I think that’s the whole debate here.

It’s not going to be solved.
 
Another thing worth mentioning is that there's a huge amount of animal abuse. I can watch a guy shoot himself. An animal being hurt, tortured, or killed? Absolutely not.
I watched hours of riots every night in 2020, including people getting shot dead and beaten until they were comatose, but the video from that year that haunts my nightmares is the one where the fucking chink coof police beat someone's corgi to death off camera.

That said, the system they have at Facebook still makes no sense to me. They have it down to automation for things that aren't really offensive- one of my friends quipped that his dog was "a very stupid boy" and it got instantly "hidden" as I guess misandrist rhetoric. That level of stuff isn't getting past the first level of automod, but snuff videos are?
 
They hired a bunch of normies for this. People that probably cry when someone makes a mean comment on their Instagram.
Yeah, I would imagine that most of the moderation team are communication majors fresh out of college. I don't blame them for not being ready for Daisy's Destruction type horror.
 
I am on the jannies side here.
Even the strongest person will break down if they have to see CSAM, rape, murder, etc. every day.
I'd imagine people working in slaughterhouses would buckle first, yet they don't. I'd also imagine they spend the vast majority of their time deleting "politically incorrect" posts.
 
You had to strictly follow Meta’s policy, and often watch the videos to the end, several times with different moderators.

This is insane. The first moment you see something not allowed, you should be able to say "nope" and shut it down, with a special "alert law enforcement" button for really bad shit.

Why did they want them watching all the way to the end? What is gained by such a thing?
 
This is insane. The first moment you see something not allowed, you should be able to say "nope" and shut it down, with a special "alert law enforcement" button for really bad shit.

Why did they want them watching all the way to the end? What is gained by such a thing?
I can sort of see a justification in some situations. Say you find that someone has uploaded a CSAM video. You start watching it, can see where it's going, stop, take a break to punch a wall, write up your report, it's even a known video that has a hash for the part that you saw, so you click off... only this one wasn't the same length as the known video. It's a bit longer, and after the part you stopped watching, they bring in another kid, and that kid is one that wasn't in the database, which would have been important information for investigators, but no one will know because no one watched all the way through. So there is potential evidentiary value in determining the full extent of the content.
 
In forensics we had to "desensitize" and look at pictures of crime scene shit and dead bodies. Some of them were brutal and I still can't get over it even though I was browsing the internet since 08 and seen a lot of shit I never asked for. (Yes I am a newfag fuck off.)
In a way I thanked the early internet for prepping me for some shit, but dear god some of it is still hard to digest. Worst part is, in another class (not related to forensics), I had the dumb idea to NOT get so desensitized that I'd be a robot; I wanted to still retain SOME form of empathy when looking at images or talking to people. It's hard. Sometimes I try to turn myself "off" like a switch but my brain can't do that. I still get sad if I come across animal abuse, or some innocent victim getting killed for no reason, etc etc.
I don't think it's for everyone; sometimes the money isn't worth it. But now I wanna sperg and think: would someone deprived of any empathy/disgust do a better job watching these kinds of things and not be bothered?
 
HAAHAHAHAHAH GET FUCKED JANNIES



 
That's not enough! I will not stop till they have a suicide net around their offices...
 
After having to repeatedly watch numerous videos of suicides, murders, dismemberment and rapes, he had a panic attack and asked for help.
Man, maybe it should have banned those things instead of banning people for wrong think?

Edit:
Seriously, they managed to shut down any and all contrary speech, but somehow they weren't able to shut this sort of thing down.
 
But constantly seeing gore and suicide is well understood to cause psychological damage.

You can see things you can't forget, and seeing them 40 hours a week for months or years on end does a number on the brain.
It's why I appreciate these types of jannies. Not like ResetEra mods or other reddit-style tranny jannies, but there is an ever-replenishing supply of people who post this shit wherever they can.

Give me the broom (and the six figure salary). I will sweep it up!
Well, you know where to apply, and could work overtime to get the extra salary. I look forward to hearing your success story!

How much gore, rape, etc gets posted on Facebook anyway?
Far more than you'd think, and it's not just the live videos of murders and suicides. Bots and hacked accounts are common, and they don't all want to sell you sunglasses. We don't see it because of jannies, and they are woefully undercompensated. Compartmentalising isn't easy, and also doesn't always work - not just cops and first responders, but even the high suicide rates of veterinarians, who spend far more time dealing with dead and dying animals than they do with happy puppies or cases where the patient pulls through and goes back to their loving owners.

The cavalier attitude, as if browsing WatchPeopleDie is the same as getting a constant flow of surprise child porn or gangland torture and zoosadism, and not being allowed to look away at the same time, is annoying but also wryly naive posturing. If you're getting paid to watch shit you'd watch anyway, it sounds like basically free money! You could even have fun marking posts and suspending accounts from whiners while you do it - double win! If you love what you do, you'll never work a day in your life, amirite? Your new career path awaits!
 