YouTube is gonna get tougher with content control, working with organizations such as the ADL - Pepe is gonna be banned.

https://youtube.googleblog.com/2017/08/an-update-on-our-commitment-to-fight.html

A little over a month ago, we told you about the four new steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.

We wanted to give you an update on these commitments:

Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:
  • Speed and efficiency: Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag.
  • Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
  • Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we’ve taken this kind of content down.
We are encouraged by these improvements, and will continue to develop our technology in order to make even more progress. We are also hiring more people to help review and enforce our policies, and will continue to invest in technical resources to keep pace with these issues and address them responsibly.
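The review flow the update describes (a machine learning model catches most videos before any human flag, humans handle the rest) can be sketched roughly like this. To be clear, the threshold, function name, and routing labels below are all illustrative assumptions of mine, not anything YouTube has published:

```python
# Hypothetical sketch of the detection pipeline described in the post:
# an ML classifier scores each video, and high-confidence hits are queued
# for removal review before any human flag arrives. The 0.9 cutoff and
# all names here are assumptions for illustration only.

AUTO_REVIEW_THRESHOLD = 0.9  # assumed model-confidence cutoff

def route_video(ml_score: float, human_flags: int) -> str:
    """Decide how a video is handled given a model score and a flag count."""
    if ml_score >= AUTO_REVIEW_THRESHOLD:
        # This path corresponds to the "75 percent removed before a single
        # human flag" figure: the model acts before any user reports it.
        return "queued_for_removal_review"
    if human_flags > 0:
        return "human_review"  # the traditional flag-driven path
    return "no_action"
```

The point of the sketch is just that the model and the human flags are two independent triggers into the same review queue, which is why the machine can "more than double" removals without replacing reviewers.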

More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.

Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
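As a rough illustration of the "limited state" described above (the names and structure here are my own, not YouTube's), it amounts to keeping the video visible while stripping every engagement feature around it:

```python
# Illustrative model (not YouTube's actual code) of the "limited state":
# the video remains on the site behind an interstitial, but loses
# recommendations, monetization, comments, suggested videos, and likes.

from dataclasses import dataclass

@dataclass
class VideoFeatures:
    visible: bool = True
    interstitial: bool = False
    recommended: bool = True
    monetized: bool = True
    comments: bool = True
    suggested_videos: bool = True
    likes: bool = True

def apply_limited_state(v: VideoFeatures) -> VideoFeatures:
    """Strip engagement features while leaving the video watchable."""
    v.interstitial = True
    v.recommended = False
    v.monetized = False
    v.comments = False
    v.suggested_videos = False
    v.likes = False
    return v
```

Note that `visible` stays `True` throughout: the policy is explicitly not removal, it is isolation of the video from everything that drives views and revenue.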

Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.

And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.

Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.

The YouTube Team

This gonna be gud.
 
Unless they start losing money (more than they already do by running YT at loss), Google won't give a shit.

True, if I recall correctly they don't even actually interact with anyone who isn't a ~super popular YouTuber~ or whatever, which shows how much they care.

I think the problem at this point (for the userbase) is that there isn't really a replacement. You can say the other sites like VidMe exist all you want but a mass exodus seems difficult to achieve. And as mentioned in this thread normies will still use YouTube anyway.
 
True, if I recall correctly they don't even actually interact with anyone who isn't a ~super popular YouTuber~ or whatever, which shows how much they care.

I think the problem at this point (for the userbase) is that there isn't really a replacement. You can say the other sites like VidMe exist all you want but a mass exodus seems difficult to achieve. And as mentioned in this thread normies will still use YouTube anyway.
And even being a "Super Popular" YouTuber doesn't guarantee a response. The problem is that Google made an SJW version of Skynet and it won't stop until it REEEEEEEEEEs everyone/thing off the internet.

Even if the normies stay, who's to say they won't be the next ones targeted?
 
True, if I recall correctly they don't even actually interact with anyone who isn't a ~super popular YouTuber~ or whatever, which shows how much they care.

I think the problem at this point (for the userbase) is that there isn't really a replacement. You can say the other sites like VidMe exist all you want but a mass exodus seems difficult to achieve. And as mentioned in this thread normies will still use YouTube anyway.

There are alternative sites, it's just that none of them have gotten big enough yet.

Operative word being yet.

We've seen more than a few times over the last decade that no amount of monopolization can save you if you fuck up your platform hard enough. This right here, if it does turn out to be as bad as the reports make it sound (and we may luck out, with Google deciding "fuck it" in the face of massive blowback; sure as shit wouldn't be the first time), could very well be what begins the inevitable. Google's handling of the Adpocalypse essentially paved the way for a lot of content creators to swap to alternative funding models (crowdfunding or direct sponsorships), which means a lot of the influence Google would otherwise have in this situation, it simply doesn't. Even more so, that incident already killed any goodwill Google might have built up over time.

But it actually gets worse than the obvious incompetence and the corresponding massive loss of revenue this will cause (see also: Twitter; Youtube's run at a loss for a while, but if it starts facing mass defections of its biggest and best, it will eventually face the economic consequences). I mentioned it earlier in this thread, but this is virtually identical to something LiveJournal tried previously, something that almost killed the site.

How did this happen?
By essentially destroying Livejournal's own safe harbor restrictions.

See, early in 2017, the 9th U.S. Circuit Court of Appeals ruled that moderators on websites can be deemed agents of the website if they potentially have enough requisite knowledge of infringements to undercut an internet service provider's safe harbors. In this particular case, by curating a specific community under these new policies, Google is essentially taking editorial ownership of the site as a whole, making itself liable for any copyright infringement that then takes place on its curated platform.

With this in mind, it's not a matter of if this policy is going to fuck up, but when. Odds are better than break-even that this is going to end in one of four ways, presented in order of ascending likelihood:

1. Google goes through with this, causes a mass defection of users to other platforms, is forced to recant the entire fucking policy lest it lose more and more of its userbase to a competitor.

2. Google goes through with it, gets the shit sued out of it, is forced to recant the entire fucking policy lest it get Livejournal'd.

3. Google pre-emptively aborts the whole fucking thing and pretends it never had this idea. Again, not a first for Google.

4. Google edits the ever-loving shit out of the policy to be less-obviously shit to the point of undermining why they were instituting it in the first place, inevitably putting out a toothless policy that accomplishes fuck all but posturing.
 
Honestly I feel like Google's still going to go through with this because they're idiots and I have no faith in them.
I mentioned it earlier in this thread, but this is virtually identical to something LiveJournal tried previously, that almost killed the site.

How did this happen?
By essentially destroying Livejournal's own safe harbor restrictions.

I don't actually know what happened with LiveJournal except that it got bought by Russia or something, which makes me wonder if YouTube will ever be sold.
 
I don't actually know what happened with LiveJournal except that it got bought by Russia or something, which makes me wonder if YouTube will ever be sold.

Short version, near as I understand it: Livejournal set up almost exactly what Youtube is trying with the new rule changes, with a team of volunteer moderators attempting to enforce rule compliance site-wide. A particularly litigious paparazzi agency, Mavrix Photographs, targeted Livejournal because of its photos appearing on a community page, and Mavrix skipped the usual steps and went right for suing Livejournal. Livejournal responded with safe harbor, to which Mavrix countered by pointing out that Livejournal had recently instituted a policy of volunteer moderators curating posts to make sure they were complying with Livejournal's new rules, ergo meaning they had to have knowingly green-lighted the placement of copyrighted material.

Near as I can tell (Kiwi Lawfags, help a brother out) the 9th circuit argued that because of how Livejournal had handled this policy revision, it had taken possession of the site and thus was liable. Suffice to say, the MPAA and RIAA immediately began salivating at the chance of getting a nibble of Livejournal, and this very quickly brewed into a very, very costly mistake for Livejournal as a whole (as they were bought out by the Russians later).
 
Near as I can tell (Kiwi Lawfags, help a brother out) the 9th circuit argued that because of how Livejournal had handled this policy revision, it had taken possession of the site and thus was liable. Suffice to say, the MPAA and RIAA immediately began salivating at the chance of getting a nibble of Livejournal, and this very quickly brewed into a very, very costly mistake for Livejournal as a whole (as they were bought out by the Russians later).
Correct. Editorializing content in any way, especially when changing the meaning of the message, makes you liable for it. I'm more familiar with defamation than copyright in this regard. We have forum subtitles but mods are not allowed to edit those to include any statements different from OP's unless they are absolutely sure, and those changes are logged to the moderator making the action. The only time I edit titles/subtitles is for grammatical issues, though sometimes I remove statements that I do not believe are substantiated in the thread. This is both a legal precaution and one of general principle; both against outright defamation and against lying to my audience.

It's worth noting that something like 92% of 9th Circuit cases that make it to the USSC get overturned. The 9th Circuit is a circus.
 
I really wish there weren't so many monopolies online (let's be honest here, google/youtube/facebook/twitter/etc have no real competitors).

When a company moves away from being apolitical, it SHOULD hurt them a lot more than it does.
Nobody ever calls them out on these problems and it's difficult for any new upstart to try to muscle into an already large pond.
 
If they do this with political shit and it favors one candidate over another, and you know it will, I hope they're ready for the FEC to get up in their shit for illegal campaign contributions.

This reminds me of a general potential problem with the concept of net neutrality now that entities like Google are acting as political entities on a global scale and are, arguably, themselves a greater threat to freedom than the government itself.

If entities like this are going to get carte-blanche restrictions on liability and things that are generally part of economic competition (like net neutrality could in theory be), they should have to act like the utilities they're pretending to be. They shouldn't be allowed to interfere with the free speech of people using them, just like the electric company can't shut off electricity to businesses owned by Republicans or Democrats or even Nazis just for their political views.

They're either something akin to a utility, publicly regulated and required to provide services to everyone equally, or they're something else, in which case they should act like a business and be treated like one: if, for instance, you're an infrastructure provider in competition with them, you should be able to cut off their Internet just like they can kick you off because they disagree with your opinions.

You don't have the phone company listening into your calls to decide whether you're allowed to say whatever you're saying to your SO or whatever.
 
If they do this with political shit and it favors one candidate over another, and you know it will, I hope they're ready for the FEC to get up in their shit for illegal campaign contributions.

This reminds me of a general potential problem with the concept of net neutrality now that entities like Google are acting as political entities on a global scale and are, arguably, themselves a greater threat to freedom than the government itself.

If entities like this are going to get carte-blanche restrictions on liability and things that are generally part of economic competition (like net neutrality could in theory be), they should have to act like the utilities they're pretending to be. They shouldn't be allowed to interfere with the free speech of people using them, just like the electric company can't shut off electricity to businesses owned by Republicans or Democrats or even Nazis just for their political views.

They're either something akin to a utility, publicly regulated and required to provide services to everyone equally, or they're something else, in which case they should act like a business and be treated like one: if, for instance, you're an infrastructure provider in competition with them, you should be able to cut off their Internet just like they can kick you off because they disagree with your opinions.

You don't have the phone company listening into your calls to decide whether you're allowed to say whatever you're saying to your SO or whatever.
Net neutrality has to do with the regulation of ISPs as common carriers, not the actual customers of those ISPs such as YouTube.
 
I think you guys overestimate the importance of traditional content creators on youtube.

If they all left right now, I bet the dent in the hundreds of millions of views that the toy reviewers, Minecraft let's players, "safe normie comedy" creators, VEVO and Co., creepy baby videos with Elsa and Spiderman, and so on pull in would be minimal, at best. Even PewDiePie barely manages to get a fraction of all his subscribers to watch any given video, and most of them probably skip/adblock the ads, while stuff aimed at kids and normies always pays off in the end (full ads viewed and repeated multiple times).

In fact, I bet Youtube execs can't wait for controversial content creators to go away, so that big companies can pay for their ads without worries.

I'd say that in the end big companies would catch on to the fact that the only people watching their ads are dumb babies left with an iPad as a babysitter, but official music videos are probably appealing enough to make them not care.

I personally feel that in 10 years, the landscape won't have changed much, aside from even fewer traditional content creators and more of the spammy stuff listed above. It will be TV 2.0 or something very close, I can bet on it. Also, no real "big" competitor will ever arise, only "decent" alternatives that will come and go.

I'd be more optimistic if Youtube were still an independent company, but Google's money will keep it afloat and "anti-controversy" basically forever (or until the Internet is replaced by Brainternet or something). This is my opinion, anyway.
 
So, what should we do? YouTube has no serious competition, and all the alternatives are much smaller and unlikely to make a dent in YouTube's profits.

The only things I can think of are putting an AdBlock on YouTube so that they don't receive money, and finding a video site that respects content creators who make "controversial" videos. Expressing outrage at YouTube alone will not do anything.
 
Yeah, make no mistake, all of this pisses me off to a quantifiable degree, but in no way am I surprised, nor am I going to be flailing around about it.

Google is a company, and therefore it wants to make money. It'll cater to you as long as it doesn't negatively affect its bottom line. And unless I've missed something recently, Youtube has always been a giant money and resource hog for Google. It's probably going to continue to be so, so of course Google will want something that at least keeps it afloat, or at least makes it reasonable to keep around. And sorry to break it to you, but unless your favorite channels are either directly connected to big companies like Sony, NBC, etc. - or are essentially tangential advertising channels (i.e. WatchMojo or What Culture Top 10 lists, game reviewing channels, or let's plays) - then Google isn't really interested in helping anyone who doesn't make a return on their investment.

This isn't to say "just give up," but realize this isn't shocking that the powers-that-be are once again trying to stifle free thought in entertainment and popular media. Time to head back to the underground where we can actually make some difference.
 
I'm trying desperately to find some kind of silver lining in all this, but it's just blackpilling as shit. I tried using vid.me to get a feel for what to expect in the future, but in all honesty it's kind of clunky and not as smooth as YouTube. I found this article that at first gave me a glimmer of hope, but after reading it they seem to equate terrorism with anyone that has a differing opinion or tries to break The Narrative.

http://fortune.com/2017/08/01/youtube-machine-learning-terror-content/

Don't get me wrong, I've read some pretty reassuring responses to this shit from you guys, but let's be real. Things will never be the same after this. YouTube will end up being just another Jewish Golem pumping out shit for normies that will be bland and unfunny. Unless there's an alternative that everyone can agree on, I can't foresee the skeptic community (For all their flaws, they're a bastion of free speech) or alternative media being all on one site together.
 