YouTube will “protect free expression” by pulling back on content moderation - YouTube says it is still committed to preventing harm.

Credit: Getty Images

YouTube videos may be getting a bit more pernicious soon. Google's dominant video platform has spent years removing discriminatory and conspiracy content in accordance with its community guidelines, but the site is now reportedly adopting a lighter-touch approach to moderation. A higher bar for content removal will allow more potentially inflammatory content to remain up in the "public interest."

YouTube has previously attracted the ire of conservatives for its removal of QAnon and anti-vaccine content. According to The New York Times, YouTube's content moderators have been provided with new guidelines and training on how to handle the deluge of provocative content on the platform. The changes urge reviewers to pull back on removing certain videos, a continuation of a trend not just at YouTube, but on numerous platforms that host user-created content.

Beginning late last year, YouTube began informing moderators that they should err on the side of leaving up videos deemed to be in the public interest. That includes user uploads discussing issues like elections, race, gender, sexuality, abortion, immigration, and censorship. Previously, YouTube's policy told moderators to remove videos if one-quarter or more of the content violated its policies; that cutoff has now been raised to half. In addition, staff are told to escalate uncertain cases to managers rather than removing the content themselves.

"Recognizing that the definition of 'public interest' is always evolving, we update our guidance for these exceptions to reflect the new types of discussion we see on the platform today," YouTube's Nicole Bell told the Times. "Our goal remains the same: to protect free expression on YouTube while mitigating egregious harm."

The report includes several examples of videos that are now allowed under YouTube's policy, including one titled, "RFK Jr. Delivers SLEDGEHAMMER Blows to Gene-Altering JABS." This is, of course, an example of medical misinformation, as there is zero evidence that mRNA vaccines can alter a person's DNA. However, YouTube's moderation team was instructed to allow the video to remain up because the public interest "outweighs the harm risk."

Vaccine and health care conspiracies promoted by Robert F. Kennedy Jr. could get a boost with YouTube's new policies.
Credit: Getty Images | Washington Post

Policy change

There have been several notable shifts in how online platforms manage content since the reelection of Donald Trump last November. For several years, conservative voices have framed content moderation on sites like YouTube and Facebook as censorship. With MAGA Republicans in power, there is concern in the tech industry that such an outlook could become the basis for government action against websites. Not long after the election, Meta announced the end of its fact-checking system, and Twitter famously changed course immediately after being purchased by right-wing billionaire Elon Musk. Even before November, YouTube loosened restrictions on election-related content. Again, this was done in the "public interest."

YouTube contends that most of the videos it hosts are not affected by this change. "These exceptions apply to a small fraction of the videos on YouTube, but are vital for ensuring important content remains available," a YouTube spokesperson tells Ars. "This practice allows us to prevent, for example, an hours-long news podcast from being removed for showing one short clip of violence."

While it's unclear how often YouTube traps viewers in conspiracy spirals, the effect could escalate with this change. However small the affected fraction of videos is, many of the accounts most apt to publish controversial content are big names on the platform.

Also, does this mean YT will stop "demonetizing" the "content creators" for using words like "ass" or "rape" or "suicide" in vids now?

:optimistic:

(Such idiotic and censorious policy got "Zoomers" to use "Zoomer" slang substitutes like "ahh" and "grape" and "sewer slide" instead.)
>sewer slide

Cowabunga dude!

Content moderation is completely necessary and a thankless job, but the caliber and type of person generally attracted to such jobs (paid or unpaid) is abysmal. It becomes even worse in a corporatized environment where something is at stake: power-tripping, thin-skinned weenies trying to feel important, or convinced they're on some moral crusade to shape the world to their vision and police thoughts and behaviors they don't agree with, clinging to any authority or recognition they can get. The last people on earth who should hold such jobs are the ones who actually want them. Think of some of the decisions you've seen on various sites, then imagine what a little bitch someone has to be to have made that decision.

The quality of any community is dictated by its users and by the extent to which the moderators fit in rather than believing they're the main driving force crafting something to their vision. The more they try to make themselves the star of the show or pat themselves on the back, the more terrible the community, without fail.
 