🐱 YouTube’s algorithm is reportedly helping pedophiles find home videos of kids

CatParty

YouTube’s algorithm is reportedly facilitating pedophiles by recommending videos of young children to them. YouTube announced Monday that it’s changing its livestreaming policy amid news of the reports.

Someone who watches sexually themed content on the platform could slowly be presented with videos of younger and younger women before eventually landing on videos of children, according to a team of researchers at Harvard University’s Berkman Klein Center for Internet and Society.

A Brazilian woman was informed that an innocent home video of her 10-year-old daughter and a friend playing in a backyard pool had garnered more than 400,000 views after being recommended to users who had watched “videos of prepubescent, partially clothed children,” according to one of the researchers’ examples.

“It’s YouTube’s algorithm that connects these channels,” research co-author Jonas Kaiser wrote. “That’s the scary thing.”

After being alerted to the issue, the company banned minors aged 13 and younger from livestreaming without an adult present.

YouTube also removed several videos and appeared to alter its algorithm, according to the New York Times.

A YouTube spokesperson directed the Daily Dot to its blog post addressing the discovery. The post details the company’s new livestreaming policy and other work to “protect minors and families.”

In March, it disabled comments on videos of children after it was learned that pedophiles were sharing timestamps to sections of videos that showed nudity.

“The vast majority of videos featuring minors on YouTube, including those referenced in recent news reports, do not violate our policies and are innocently posted—a family creator providing educational tips, or a parent sharing a proud moment,” the company wrote.

The company also reduced recommendations of “borderline content,” including videos featuring minors in risky situations.

The researchers argue that YouTube refuses to disable its recommendation system altogether on videos of children because doing so would hurt its independent creators.

The issue is not the first related to minors for the website.

The discovery of a “softcore pedophile ring” on the platform by a YouTuber in February led Disney to stop purchasing ads on the site.

A spokesperson told the Daily Dot at the time that it hired child safety experts who are dedicated to catching those who wish to harm children. The company also said it terminates thousands of underage accounts each week to help combat the problem.
 
The algorithm is just doing its job, that's the thing. What we're seeing here is akin to the movies where the machines decide that it is most logical to kill humanity, because they are dumb machines. They have to be told what not to do.
 
There was skeevy shit like "13 year old first kiss" and thumbnails of precariously young girls in tank tops.
Open an incognito tab and try it yourself; it's pretty fucked how easily it comes up from something as mainstream as a movie clip. Makes you wonder how often kids get those kinds of recommendations.

Why they don't just program their idiot robot to NOT categorize videos with things like "half-naked children kissing", I don't know.
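For what it's worth, the crude version of that idea is just a term blocklist run over titles and tags before anything reaches the recommender. A toy sketch in Python, with made-up terms, names, and data (nothing here is how YouTube actually works):

```python
# Toy sketch of a keyword guardrail -- not YouTube's actual system.
# Blocklist terms, function names, and candidate data are invented for illustration.

BLOCKLIST = {"13 year old", "first kiss", "prepubescent"}  # example terms only

def passes_guardrail(title: str, tags: list[str]) -> bool:
    """Return False if the title or any tag contains a blocklisted term."""
    text = " ".join([title, *tags]).lower()
    return not any(term in text for term in BLOCKLIST)

candidates = [
    ("backyard pool fun", ["family", "summer"]),
    ("13 year old first kiss", ["cute", "romance"]),
]
# Filter candidates before they ever reach the ranking step.
print([title for title, tags in candidates if passes_guardrail(title, tags)])
```

A real system obviously needs far more than string matching, but even this much would keep the most blatant stuff out of the recommendation pool.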

I think it's because kids are obviously sluts for not completely covering their skin and making these YouTube porn videos. I think we should put a burka on all of them and censor any media depicting children; that is totally going to stop any real child from being molested ever again.
 
Anyone who gives their kid a phone that has a camera and an internet connection is naive and stupid. Kids are clever enough to bypass age restrictions, but too stupid to be cognisant of what they are posting online or who they are posting it to. A lot of them are also clever enough to know that what they are doing is wrong and make damn well sure, within their ability, that their parents don't find out. It honestly baffles me that kids are allowed to have camera phones; so much exploitative content would stop being produced once they weren't able to produce it themselves at the beck and call of silver-tongued predators.
Shit, adults themselves are terrible at how they handle the internet; I certainly don't expect a kid to do too well either, tbh.
 
So did the researchers play it like a Price Is Right game? Like the mountain climber game, but they just keep going until they get too young to give them a boner?
 
If you know how Google's algorithms work and you understand both programming & the English language to a certain degree, you can repurpose their searches to do all sorts of shit. That's the way they were designed.

A good expert system always mirrors the desires of its users, whether they're looking for PC Engine ROMs or sating a dubious prurient interest. Trying to imply these programs have some sort of Agenda is beyond exceptional and the sort of magical thinking bullshit I'd expect from someone who doesn't understand how society functions but thinks they know how it ought to.
 
Too many people think coding is wizardry and think computer "AI" programs have any semblance of human intelligence instead of being extremely literal toddlers operating at lightspeed.

So pedos have successfully molested even the toddler-aged stage of our future AI overlords? That Terminator remakquel is gonna be depressing when they get to it.
 
The only way to completely solve this is to ban vids of children from the platform. Pedophiles will jerk it to ANY video of a kid no matter how innocent. Bare feet? Jerk it. Visible arms? Jerk it. Swimsuits? Fucking insta-nut. You have to adopt scorched-earth policies to defeat pedos, because they'll use ANYTHING.

Also, YouTube's algorithm prioritizes watch time, so it'll send you down a wormhole of what it thinks you want to see. The biggest lie is that the algorithm is "broken." Nope. It's working precisely as intended.
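That "working as intended" point is easy to show with a toy example: if the only thing a ranker scores is predicted watch time, it will surface whatever keeps a given user watching, with no concept of what it shouldn't recommend. A rough Python sketch, with invented names and numbers rather than YouTube's real system:

```python
# Toy sketch of a watch-time-optimized ranker -- NOT YouTube's actual code.
# All names and scores here are made up for illustration.

from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_seconds: float  # model's guess at how long THIS user will watch

def rank(candidates: list[Candidate], top_n: int = 5) -> list[Candidate]:
    # The only objective is expected watch time, so whatever a user lingers on
    # is exactly what gets surfaced more -- there is no notion of "should".
    return sorted(candidates, key=lambda c: c.predicted_watch_seconds, reverse=True)[:top_n]

# Example: the ranker happily feeds the wormhole if that's what the model predicts.
feed = rank([
    Candidate("official_movie_clip", 40.0),
    Candidate("reaction_video", 75.0),
    Candidate("borderline_upload", 180.0),  # longest predicted watch time wins
])
print([c.video_id for c in feed])
```

Nothing in that objective is "broken"; it just optimizes exactly what it was told to.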
 
I wonder how much money it would cost to buy the customer info on everyone searching for pedobait...
 
You would think YouTube would ban pro-pedo channels or registered sex offenders from having an account, but no.
 
The only way to completely solve this is to ban vids of children from the platform. Pedophiles will jerk it to ANY video of a kid no matter how innocent. Bare feet? Jerk it. Visible arms? Jerk it. Swimsuits? Fucking insta-nut. You have to adopt scorched-earth policies to defeat pedos, because they'll use ANYTHING.

Also, YouTube's algorithm prioritizes watch time, so it'll send you down a wormhole of what it thinks you want to see. The biggest lie is that the algorithm is "broken." Nope. It's working precisely as intended.
Hahaha, back when it was just some conservatives getting banned, any suggestion that there was a pattern to it was shouted down with "Haha, stupid conservatives believing conspiracy theories, you idiots just can't follow the rules."

Some random LGBT video gets temporarily demonetized and suddenly it's "The algorithm is making those 'mistakes' on purpose! Alex Jones was right about everything! YouTube is run by Nazis!"
 