🐱 Former YouTube content moderator describes horrors of the job in new lawsuit

CatParty


A former YouTube moderator is suing YouTube, accusing it of failing to protect workers who have to catch and remove violent videos posted to the site.

The suit, filed Monday in California Superior Court in San Mateo, says the plaintiff was required to watch murders, abortions, child rape, animal mutilation and suicides. As part of moderator training, the company allegedly presented videos of a "smashed open skull with people eating from it," a woman who was kidnapped and beheaded by a cartel, and a person's head being run over by a tank.

YouTube parent company Google faces increasing pressure to control content spanning violence and misinformation — particularly ahead of the 2020 U.S. election and amid antitrust investigations from state attorneys general, the Department of Justice and Congress.

The plaintiff, who is referred to as Jane Doe, worked as a YouTube content moderator for staffing contracting firm Collabera from 2018 to 2019, her lawsuit said. She claims she experienced nightmares and panic attacks and became unable to be in crowded areas as a result of the violent content she viewed while working for the company.

YouTube’s “Wellness Coaches” weren’t available for people who worked evening shifts and were not licensed to provide professional medical guidance, the suit says. It also alleges moderators had to pay for their own medical treatment when they sought professional help.

Neither YouTube nor Collabera responded to requests for comment.

The suit says many content moderators remain in their positions for less than a year and that the company is “chronically understaffed,” so moderators end up working overtime and exceeding the company’s recommended four-hour daily viewing limit. Despite the demands of the job, moderators had little margin for error, the suit said.

The company expects each moderator to review 100 to 300 pieces of video content each day with an "error rate" of 2% to 5%, the suit said. The companies also control and monitor how the videos are displayed to moderators: whether in full screen or as thumbnails, whether they are blurred, and how quickly moderators watch them in sequence.

The suit comes as moderators for social media companies speak out about the toll the job takes on their mental health. YouTube has thousands of content moderators, most of whom work for third-party vendors including Collabera, Vaco and Accenture. The San Francisco-based Joseph Saveri Law Firm, which is representing the plaintiff, filed a similar lawsuit against Facebook that resulted in a $52 million settlement in May.

The suit suggests YouTube may need to provide more resources for the people who remove videos that violate its rules. YouTube has reportedly reverted to relying on humans to find and delete content after using automated systems to sift through videos during the coronavirus pandemic; it switched back because the computers were censoring too many videos that didn't violate the rules.
 
Wrongthink must be purged because it's an existential threat to Our Democracy™️. But snuff vids? A-okay.

Edit: why are they conflating beheadings and animal cruelty with abortion? Abortion is a stunning and brave sacrament of women's bodily autonomy. And child sexual abuse? That doesn't happen ever, stop with the QAnon conspiracy theories.
 
YouTube being both understaffed and overworking existing employees, with coaches who aren't qualified in the slightest to provide them help? Imagine my shock. If they can't even handle simple copyright claims, imagine them trying to sit through these videos lmfao. It's been a fact for a while that the inner workings of YT are horrendous, but on the bright side, this lawsuit might result in less generic "family friendly" and celebrity channels being worshipped.
 
If she wins, this case sets the precedent that any moderation of anything is a risk because the moderators might sue you. So as SJW as she sounds, she's actually doing a great service for those of us against censorship.
Or moderation will only ever be left to automated "machine learning" bots and the most (openly) loyal of the company's employees.
 
If she wins, this case sets the precedent that any moderation of anything is a risk because the moderators might sue you. So as SJW as she sounds, she's actually doing a great service for those of us against censorship.
There already was a precedent set; from the article above:


The San Francisco-based Joseph Saveri Law Firm, which is representing the plaintiff, filed a similar lawsuit against Facebook that resulted in $52 million settlement in May.
 
I think there were other articles in the past that implied content moderation on large websites in general meant dealing with sanity-warping amounts of depravity.

Just look at law enforcement agencies that deal with shit like child abuse. If officers with a hardened mindset can only handle that work for a couple of years, how can the average computer nerd cope?
 
I'm really curious about the smashed open skull video. What people? What was going on?
 
These "content moderators" are truly weak as fuck. It sounds like a pretty sweet gig IMO; you're basically getting paid to watch LiveLeak.

I would assume these contract firms hire fresh out of college, preferably those with liberal arts or gender studies wall hangings, and pay them barely above minimum wage.
Simple life fact: Snowflakes melt in the heat.
 
Part of the interview for that position should be seeing how many old-school shock websites the candidate can name off the top of their head. If it's more than three, they're automatically hired.

Either that or see if they can stomach scrolling through ED's Offended page.

 