Culture: YouTube is gonna get tougher with content control, working with organizations such as the ADL. Pepe is gonna be banned.

https://youtube.googleblog.com/2017/08/an-update-on-our-commitment-to-fight.html

A little over a month ago, we told you about the four new steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.

We wanted to give you an update on these commitments:

Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:
  • Speed and efficiency: Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag.
  • Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
  • Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we’ve taken this kind of content down.
We are encouraged by these improvements, and will continue to develop our technology in order to make even more progress. We are also hiring more people to help review and enforce our policies, and will continue to invest in technical resources to keep pace with these issues and address them responsibly.
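The detection flow described above (ML flags most videos before any human report arrives, with humans reviewing borderline cases) can be sketched roughly like this. This is a hypothetical illustration, not YouTube's actual system: the thresholds, field names, and `triage` function are all made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    ml_score: float       # classifier's extremism probability, 0.0 to 1.0 (illustrative)
    human_flags: int = 0  # user reports received so far

def triage(video: Video, remove_at: float = 0.95, review_at: float = 0.6) -> str:
    """Route a video based on its ML score and any human flags."""
    if video.ml_score >= remove_at:
        return "remove"        # high-confidence automated takedown, no human flag needed
    if video.ml_score >= review_at or video.human_flags > 0:
        return "human_review"  # borderline score or user report: send to a reviewer
    return "no_action"

# Removed before receiving a single human flag, as the post describes:
print(triage(Video("a1", ml_score=0.97)))
# Borderline score, or flagged by users: goes to human review either way.
print(triage(Video("b2", ml_score=0.7)))
print(triage(Video("c3", ml_score=0.1, human_flags=2)))
```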

More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.

Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
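The "limited state" described above amounts to a bundle of feature switches: the video stays up behind an interstitial, but recommendation, monetization, comments, suggested videos, and likes are all turned off. A minimal sketch, assuming illustrative field names (this is not YouTube's real data model):

```python
# Restrictions applied to a video placed in the limited state, per the post.
LIMITED_STATE = {
    "visible": True,            # remains on YouTube...
    "interstitial": True,       # ...but behind a warning screen
    "recommended": False,
    "monetized": False,
    "comments_enabled": False,
    "suggested_videos": False,
    "likes_enabled": False,
}

def apply_limited_state(video_settings: dict) -> dict:
    """Return a copy of the video's settings with limited-state restrictions applied."""
    return {**video_settings, **LIMITED_STATE}

limited = apply_limited_state({"video_id": "abc123", "monetized": True})
print(limited["monetized"], limited["visible"])
```

The key design point the post makes is that this is not removal: `visible` stays true, and only discovery and engagement features are disabled.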

Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.
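The Redirect Method behavior described above is essentially a lookup: a search for a sensitive keyword is answered with a curated counter-messaging playlist instead of organic results. A hypothetical sketch; the keyword set and playlist ID are placeholders, not real values from Jigsaw or YouTube:

```python
# Placeholder data: in the real system, experts curate the keyword list
# and the counter-messaging playlist.
SENSITIVE_KEYWORDS = {"example-extremist-term", "another-flagged-query"}
COUNTER_PLAYLIST_ID = "PL_counter_messaging_demo"

def search(query: str) -> dict:
    """Redirect sensitive queries to the curated playlist; otherwise search normally."""
    if query.strip().lower() in SENSITIVE_KEYWORDS:
        return {"redirect": True, "playlist": COUNTER_PLAYLIST_ID}
    return {"redirect": False, "results": f"organic results for {query!r}"}

print(search("example-extremist-term"))
print(search("cat videos"))
```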

And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.

Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.

The YouTube Team

This gonna be gud.
 
I hear from a lot of youtubers that ad revenue across the board is just dropping like a rock, regardless. People will stop uploading to youtube every week if it stops earning them money. I don't think Google realizes that they won't get the same number of dedicated video creators putting out kwality kontent every week if they don't get paid to do it.

This is exactly why the entire thing is doomed to fail. They're either going to wind up resigning themselves to the fact that they can't excise the shitlords, or they'll destroy the delicate balance and then be shocked when this thing bleeds money at Twitter levels. In Twitter's case, they fucked up until the normies couldn't ignore it any longer, and if YouTube pushes things too hard, the same will happen here. It doesn't even have to cause a big migration, Digg-style; just enough to get noticed by the suits in the accounting department, and that's when they'll start scrambling to compensate. Pewdiepie is already on the side of the shitlords, and frankly, so are most of the biggest content creators on Youtube.

Again, this problem is self-correcting; I have zero confidence that Google will do the right thing because it should and makes sense. What I do have absolute confidence in is self fucking preservation, and any company retarded enough to try to mutate into how Twitter curates things deserves to follow it right into the fucking gutter.
 
When will Youtube learn that removing a Moon Man song just makes two mirrors get put up in its place?
Hail Hydra.
 
Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:
  • Speed and efficiency: Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag.
  • Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
  • Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we’ve taken this kind of content down.

The dumbest part about this is that Youtube is currently the epicenter of online discussion. This proverbial monkey wrench will just shuffle jihadi recruiters off into dimmer corners where the police have to spend more time and money to find them, when before they were more than happy to publicly advertise themselves to the open world. It's like getting mad at criminals for incriminating themselves by livestreaming their crimes on Facebook. Why this is even a concern for advertisers boggles my mind. Doesn't Youtube let you advertise to specific groups anyway? It really seems like this investigative candy could be ignored by everybody except the feds, whose jobs are made easier by Youtube giving jihadists a platform in the first place.
 
If YouTube wants to commit financial suicide, more power to them. You can just mirror or post your videos on a competing site to avoid all that bullshit. If they don't get the message via their pocketbook, there really is no hope left for them.

Let them drown in SJW bullshit as they go bankrupt at this point. No use in reasoning with an unreasonable person.
 
I'd say that in the end big companies will catch on to the fact that the only people watching their ads are dumb babies left with an iPad as a babysitter, but official music videos are probably appealing enough to make them not care.

I personally feel that in 10 years the landscape won't have changed much, aside from even fewer traditional content creators and more of the spammy stuff listed above. It will be TV 2.0 or something very close; I can bet on it. Also, no real "big" competitor will ever arise, only "decent" alternatives that will come and go.

When Youtube was first bought out and Dailymotion first hit the scene, Youtubers like Blunty3000 immediately told their subscribers about Youtube's plans to become the new television, and they started pimping Dailymotion as an alternative. Blunty3000 gave the first heads-up that Google was going to implement pop-up ads over the bottom 20% of the video that you could click away. I'm surprised nobody has brought this up, and how quickly this factoid was forgotten.
 
There are unconfirmed rumors from anons inside Google that Anita Sarkeesian is directly working with the ADL to pick which content gets censored.

Supposedly they've been intentionally not mentioning her to avoid backlash.

If true, this is gonna be an even bigger shitstorm than we thought.


lolololololol
 
YouTube is so ubiquitous that I bet if they start making controversial content undiscoverable, creators will just start trading direct links around instead of jumping ship wholesale to a different platform. The hit to their viewership statistics is going to be minimal; political stuff is a small part of the overall Youtube space.

Unlisted video central, here we come!
 
When Youtube was first bought out and Dailymotion first hit the scene, Youtubers like Blunty3000 immediately told their subscribers about Youtube's plans to become the new television, and they started pimping Dailymotion as an alternative. Blunty3000 gave the first heads-up that Google was going to implement pop-up ads over the bottom 20% of the video that you could click away. I'm surprised nobody has brought this up, and how quickly this factoid was forgotten.


Because people would rather reee over a made-up SJW takeover than realize YouTube is a business that wants to make money. You really can't sell ads on some manchild on his webcam complaining about vidya.
 
There are unconfirmed rumors from anons inside Google that Anita Sarkeesian is directly working with the ADL to pick which content gets censored.

Supposedly they've been intentionally not mentioning her to avoid backlash.

If true, this is gonna be an even bigger shitstorm than we thought.

GoogleGate here we come!
 
There are unconfirmed rumors from anons inside Google that Anita Sarkeesian is directly working with the ADL to pick which content gets censored.

Supposedly they've been intentionally not mentioning her to avoid backlash.

If true, this is gonna be an even bigger shitstorm than we thought.

Well, I mean, it's plausible, since Google Ideas was created and staffed with many lolcows/ratkings from this site. They eventually rebranded to Jigsaw, then were moved under Alphabet, Google's parent company. Interestingly, Jigsaw isn't listed under Alphabet on LinkedIn, and the staff roster seems to be largely absent, so you can't even check whether she's still working for them or not.

 