Youtube gonna be tougher with content control, working with organizations such as the ADL - Pepe is gonna be banned.

https://youtube.googleblog.com/2017/08/an-update-on-our-commitment-to-fight.html

A little over a month ago, we told you about the four new steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.

We wanted to give you an update on these commitments:

Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:
  • Speed and efficiency: Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag.
  • Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
  • Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we’ve taken this kind of content down.
We are encouraged by these improvements, and will continue to develop our technology in order to make even more progress. We are also hiring more people to help review and enforce our policies, and will continue to invest in technical resources to keep pace with these issues and address them responsibly.

More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.

Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.

Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.

And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.

Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.

The YouTube Team

This gonna be gud.
 
did he sign a non disparity contract? because if so he is in breach and in the wrong. the majority of corporations will have this in your employee contract.
I'd say that depends on whether Google would have that or not. But I'm sure they'd fire him regardless of that clause.

He did the right thing, after all; they have to find some way to silence him so more people can't do the same.
 
I'd say that depends on whether Google would have that or not. But I'm sure they'd fire him regardless of that clause.

He did the right thing, after all; they have to find some way to silence him so more people can't do the same.

Like I said, most companies have that written into your contract, especially ones that are publicly traded.
 
Regardless, firing him immediately after he published his manifesto just proves his point. There is complete intolerance of alternate ideas. People who stayed home are children. I'd have fired them instead. If you can't tolerate ideas different from yours, how the fuck are you going to work under a boss?

'No, see what you are doing is wr-'

'OH, I SEE. IT'S WRONG BECAUSE I AM A WOMAN, RIGHT? THAT'S IT. IT'S BECAUSE YOU ARE AN INCREDIBLE SEXIST. GET ME HR! SEXUAL DISCRIMINATION! SEXUAL DISCRIMINATION!'

*meekly goes away while other male engineers have to constantly fix her shitty work because nobody wants to fire her because they'd get accused of sexism. She gets to be incompetent and do no work*

I am guessing this is why the diversity memo has so much support. People hired for diversity aren't stupid and wield that shit like a weapon. Everyone else has to pick up their slack, including, guess who, other women. You can easily tell them apart. The incompetent ones are always bitching about sexism and patriarchy holding them back. The competent ones don't say a word. And believe me, nobody likes cleaning up after an incompetent diversity hire that management is too afraid to fire.

I'm going to tell you a little secret: Women in STEM, who want to be in STEM, do not give a single shit about your genitals. They're like the men: they want results. And the last thing they want is to be thought of as having an easier time because of their reproductive plumbing. They'd be horrified at the thought. Eventually, this is going to bite companies in the ass. Competent people are going to flee to companies that hire based on talent. It's already basically destroyed Marvel Comics.

This is why there is a ton of resistance to fake geek girls, fake 'I LOVE SCIENCE' girls, and shit like that. Because it's an obvious ploy. And people aren't going to be super accepting when they were mocked for their nerdy hobbies like programming and gaming years ago, but now that it's profitable everyone pretends it's cool.
 
did he sign a non disparity contract? because if so he is in breach and in the wrong. the majority of corporations will have this in your employee contract.

Do you mean nondisparagement? That refers to public statements. He didn't make this as a public statement. Someone else leaked it.
 
Apparently streams covering problematic content can now be taken down before they've started.

[screenshots attached]
 
Since this was bumped back up...


I know this guy has a thread here, but he keeps his character in check for the interim, and he has some pretty interesting evidence that YT is actively targeting certain channels just for existing, regardless of what the content may be.
 
Earlier, @AnOminous, @Null, and several others (including myself) chimed in on how Google and friends doing this was going to inevitably result in legal ramifications, since they're essentially curating content on the website and ergo potentially ceding their safe harbor protections. In that same vein, more recently, several commentators, such as Mister Metokur, suggested that if we're going to lose any ability to say anything on the internet, we may as well weaponize Net Neutrality.

Well, apparently at least one major activist organization agrees:

The memo, written by Phil Kerpen of an activist organization called American Commitment, suggests that conservative lawmakers could reframe the battle over net neutrality into a larger battle over neutral speech on the internet.

"Social media (Facebook, Twitter) and search (Google) companies with dominant market position represent themselves as politically neutral while systematically promoting liberal views and limiting or even banning conservatives," begins the letter.

Right now, online platforms are protected under the Communications Decency Act from liability for material their users post. For instance, while a newspaper can be sued for libel if it prints something untrue written by a reporter, Facebook cannot be sued because a user posts something libelous.

Kerpen's memo suggests that the government should make internet companies promise that they're going to treat all content equally or lose this protection.

"Platforms that represent themselves to the public as neutral would be subject to enforcement actions if they violate those representations (of neutrality) through a consumer-protection framework."

The political calculus here seems to be that Google and Facebook would look like hypocrites if they oppose this kind of regulation while at the same time arguing for regulation of internet access providers.

"By simply proposing this framework, the Trump administration would make clear that the asymmetry of companies identified with conservative causes risking regulatory retaliation while companies identified with liberals are given a free pass is over."
Kerpen told Axios that the memo represented only "preliminary thoughts."


Additionally, at least one channel that has contacted Youtube asking why their videos were demonetized has been told that they need to "make their content age-appropriate for a five-year-old" to be suitable for monetization.
 
Forgotten Weapons is an autistic YouTube channel about (primarily old) guns. YT told him specifically to censor out swastikas in thumbnails, even when they're part of a historically accurate flag motif. This of course pissed off a bunch of history lovers, and he even made a video about it.
[image attached]

Today he uncensored the flag:
[image attached]


I don't know where I should have posted this. I mean, GhostGunner is a firearm-related thing, so thanks for putting me in the right thread; it's hard to keep track.

Found this in the comments:
[image attached]
 
Earlier, @AnOminous, @Null, and several others (including myself) chimed in on how Google and friends doing this was going to inevitably result in legal ramifications, since they're essentially curating content on the website and ergo potentially ceding their safe harbor protections. In that same vein, more recently, several commentators, such as Mister Metokur, suggested that if we're going to lose any ability to say anything on the internet, we may as well weaponize Net Neutrality.

Well, apparently at least one major activist organization agrees:




Additionally, at least one channel that has contacted Youtube asking why their videos were demonetized has been told that they need to "make their content age-appropriate for a five-year-old" to be suitable for monetization.

Five-year-olds shouldn't be watching Youtube. But Youtube has replaced TV for many people. So plopping lil' Billy in front of Youtube all day is a thing that happens a lot. So many dumb parents give their kid the phone or tablet and there you go. Babysitting.

Five-year-olds aren't in the market for products. They don't have any money. Advertisers aren't aiming directly at them. However, they may not want mommy and daddy to see bad videos being sponsored by Coke.

But how many people actually see an ad being run on something that offends them and automatically assume the sponsor supports the offending idea? I don't. Because I think of ads as annoyance clusters and usually don't pay attention to them. Maybe more conservative types that think anything not sponsored by Jesus himself is the devil incarnate.

Maybe the fact that so many kids are walking around glued to Youtube all day has raised the alarm. And we've got to go through the same motions television did. But it does feel a bit more sinister than bleeping out "damn" on the Late Movie because some kids might still be awake.

It sure does feel like the internet dark ages are upon us. I don't want to sperg out over it. But I'm not very happy about all of this either. You give an inch and they take a mile. All of these people whining about wanting the internet to be a safe space with a big "No Bullying" sign plastered over it don't realise that once they remove what is triggering them, the powers that be are coming for them next. I've said before that people like this are blind. They are in "woke" comas. Censoring the internet has been a huge desire since its inception. And SJW crybabies are nothing more than tools to pave the way.

The people that are happy with the policies at places like Twitter and Youtube are just behind everybody else on the chopping block. But stay woke guys.
 
Nuke YouTube. We really need to start "demonetizing" this entire fucking entity, and Google itself. Anything that deprives these shit companies of income is a good thing. I don't know if anyone has a good plan for this, but it's long overdue.

I already aggressively adblock YouTube and won't even watch something with an obligatory ad, but it's time to take this approach to Google itself.

Not one thin dime to these cocksuckers.
 
Five-year-olds shouldn't be watching Youtube. But Youtube has replaced TV for many people. So plopping lil' Billy in front of Youtube all day is a thing that happens a lot. So many dumb parents give their kid the phone or tablet and there you go. Babysitting.

Five-year-olds aren't in the market for products. They don't have any money. Advertisers aren't aiming directly at them. However, they may not want mommy and daddy to see bad videos being sponsored by Coke.

But how many people actually see an ad being run on something that offends them and automatically assume the sponsor supports the offending idea? I don't. Because I think of ads as annoyance clusters and usually don't pay attention to them. Maybe more conservative types that think anything not sponsored by Jesus himself is the devil incarnate.

Maybe the fact that so many kids are walking around glued to Youtube all day has raised the alarm. And we've got to go through the same motions television did. But it does feel a bit more sinister than bleeping out "damn" on the Late Movie because some kids might still be awake.

It sure does feel like the internet dark ages are upon us. I don't want to sperg out over it. But I'm not very happy about all of this either. You give an inch and they take a mile. All of these people whining about wanting the internet to be a safe space with a big "No Bullying" sign plastered over it don't realise that once they remove what is triggering them, the powers that be are coming for them next. I've said before that people like this are blind. They are in "woke" comas. Censoring the internet has been a huge desire since its inception. And SJW crybabies are nothing more than tools to pave the way.

The people that are happy with the policies at places like Twitter and Youtube are just behind everybody else on the chopping block. But stay woke guys.
See, you understand this perfectly.
 
Additionally, at least one channel that has contacted Youtube asking why their videos were demonetized has been told that they need to "make their content age-appropriate for a five-year-old" to be suitable for monetization.


Yes, that is how companies are run. That is what I have been saying. It's lame, yes, but not evil. Next time listen to me instead of filling your head with SJW conspiracies.
 
Yes, that is how companies are run. That is what I have been saying. It's lame, yes, but not evil. Next time listen to me instead of filling your head with SJW conspiracies.
SJWs are at the forefront of pushing puritanism and pearl-clutching in front of corporations; no one cares until it becomes a problem, or at least until someone virtue-signals about it.
 