🐱 YouTube’s algorithm is reportedly helping pedophiles find home videos of kids

CatParty

YouTube’s algorithm is reportedly facilitating pedophiles by recommending videos of young children to them. YouTube announced Monday that it’s changing its livestreaming policy amid news of the reports.

Someone who watches sexually themed content on the platform could slowly be presented with videos of younger and younger women before eventually landing on videos of children, according to a team of researchers at Harvard University’s Berkman Klein Center for Internet and Society.

A Brazilian woman learned that an innocent home video of her 10-year-old daughter and a friend playing in a backyard pool had suddenly garnered more than 400,000 views after being recommended to those who had watched “videos of prepubescent, partially clothed children,” according to one of the researchers’ examples.

“It’s YouTube’s algorithm that connects these channels,” research co-author Jonas Kaiser wrote. “That’s the scary thing.”

After being alerted to the issue, the company banned minors age 13 and younger from livestreaming without an adult present.

YouTube also removed several videos and appeared to alter its algorithm, according to the New York Times.

A YouTube spokesperson directed the Daily Dot to its blog post addressing the discovery. The post details the company’s new livestreaming policy and other work to “protect minors and families.”

In March, YouTube disabled comments on videos of children after it was learned that pedophiles were sharing time stamps pointing to sections of videos that showed nudity.

“The vast majority of videos featuring minors on YouTube, including those referenced in recent news reports, do not violate our policies and are innocently posted—a family creator providing educational tips, or a parent sharing a proud moment,” the company wrote.

The company also reduced recommendations of “borderline content,” including videos featuring minors in risky situations.

The researchers argue that YouTube refuses to disable its recommendation system altogether on videos of children because doing so would hurt its independent creators.

The issue is not the first related to minors for the website.

The discovery of a “softcore pedophile ring” on the platform by a YouTuber in February led Disney to stop purchasing ads on the site.

A spokesperson told the Daily Dot at the time that YouTube had hired child safety experts dedicated to catching those who wish to harm children. The company also said it terminates thousands of underage accounts each week to help combat the problem.
 
This shit is much closer to the surface than you'd think. Did a book report on The Blue Lagoon, so I watched a couple clips from the movie, and instantly in the recommendations there was skeevy shit like "13 year old first kiss" and thumbnails of precariously young girls in tanktops.
Open an incognito tab and try it yourself, it's pretty fucked how easily it comes up from something as mainstream as a movie clip. Makes you wonder how often kids get those kinds of recommendations.

Why they don't just program their idiot robot to NOT categorize videos with things like "half naked children kissing", I don't know.
 
Anybody who posts videos or pictures of their children online is naive or stupid.
I miss how the internet used to be: never post pictures of yourself, don't give out your real name, and don't tell people where you live. In other words, don't dox yourself. And especially don't dox your kids.

Then came social media and "Web 2.0."
 
Why they don't just program their idiot robot to NOT categorize videos with things like "half naked children kissing", I don't know.
The poor sap who probably programmed it was naive enough to think someone wouldn't use it for heinous shit. I don't think "better put in a special case here to stop someone from using this to find softcore kiddie porn" crossed their mind. So much for optimism.
 
Younger and younger women? So if you're exceptional enough to use YouTube as your porn site, it will push your limits on how young is too young?
 
Anybody who posts videos or pictures of their children online is naive or stupid.
Anyone who gives their kid a phone that has a camera and an internet connection is naive and stupid. Kids are clever enough to bypass age restrictions, but too stupid to be cognisant of what they are posting online or who they are posting it to. A lot of them are also clever enough to know that what they are doing is wrong, and they make damn well sure within their ability that their parents don't find out. It honestly baffles me that kids are allowed to have camera phones; so much exploitative content would stop being produced once they weren't able to produce it themselves at the beck and call of silver-tongued predators.