YouTube is gonna be tougher with content control, working with organizations such as the ADL - Pepe is gonna be banned.

https://youtube.googleblog.com/2017/08/an-update-on-our-commitment-to-fight.html

A little over a month ago, we told you about the four new steps we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.

We wanted to give you an update on these commitments:

Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:
  • Speed and efficiency: Our machine learning systems are faster and more effective than ever before. Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag.
  • Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
  • Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we’ve taken this kind of content down.
We are encouraged by these improvements, and will continue to develop our technology in order to make even more progress. We are also hiring more people to help review and enforce our policies, and will continue to invest in technical resources to keep pace with these issues and address them responsibly.
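Purely as an illustration of what a mix of automated scoring and human flagging might look like, here is a minimal sketch; the classifier, thresholds, and field names are all assumptions, not anything YouTube has published.

```python
# Hypothetical sketch only: none of these names, thresholds, or fields come
# from YouTube; they just illustrate combining an automated score with
# human flags, as the announcement describes.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    model_score: float    # assumed 0.0-1.0 output of some extremism classifier
    human_flags: int = 0  # user reports received so far

AUTO_REMOVE = 0.95   # assumed: high-confidence cases removed before any human flag
NEEDS_REVIEW = 0.60  # assumed: borderline cases routed to human reviewers

def triage(video: Video) -> str:
    """Route a video based on the model score and any human flags."""
    if video.model_score >= AUTO_REMOVE:
        return "remove"        # taken down with zero human flags
    if video.model_score >= NEEDS_REVIEW or video.human_flags > 0:
        return "human_review"  # surfaced by the model or by users
    return "no_action"

for v in [Video("a", 0.97), Video("b", 0.70), Video("c", 0.10, human_flags=3)]:
    print(v.video_id, triage(v))
```

Under a framing like this, the "over 75 percent removed before a single human flag" figure would simply be the share of removals that came through the automated branch.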

More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.

Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
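To make the "limited state" concrete, here is a rough sketch of it as a set of per-video feature flags; the field names and this way of modelling it are my own assumptions, not anything YouTube has documented.

```python
# Illustrative only: one possible way to represent the "limited state"
# described above. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class VideoFeatures:
    playable: bool = True             # the video itself stays up
    behind_interstitial: bool = False
    recommendable: bool = True
    monetized: bool = True
    comments_enabled: bool = True
    suggested_videos: bool = True
    likes_enabled: bool = True

def apply_limited_state(f: VideoFeatures) -> VideoFeatures:
    """Keep the video watchable but strip engagement and monetization features."""
    f.behind_interstitial = True
    f.recommendable = False
    f.monetized = False
    f.comments_enabled = False
    f.suggested_videos = False
    f.likes_enabled = False
    return f
```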

Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.

And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.
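Going back to the Redirect Method mentioned above, the basic idea can be sketched as a keyword-to-playlist lookup; the keywords and playlist ID below are placeholders, and the real targeting is presumably far more sophisticated than simple token matching.

```python
# Toy sketch of the redirect idea, not Jigsaw's actual implementation:
# if a search query matches a sensitive-keyword list, surface a curated
# counter-narrative playlist. Keywords and playlist ID are placeholders.
from typing import Optional

SENSITIVE_KEYWORDS = {"example_recruitment_phrase", "example_propaganda_term"}
COUNTER_PLAYLIST = "PL_PLACEHOLDER_CURATED_PLAYLIST"

def handle_search(query: str) -> Optional[str]:
    tokens = set(query.lower().split())
    if tokens & SENSITIVE_KEYWORDS:
        return COUNTER_PLAYLIST  # redirect to curated debunking videos
    return None                  # fall through to normal search results
```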

Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.

The YouTube Team

This gonna be gud.
 
If we've learned nothing else this decade, it's that nothing is too big or powerful to fail. Ultimately, there's a limit to how often one can screw up, and Youtube is constantly pushing further and further into territory where an alternative is not only going to become more and more viable, but indeed inevitable. All that needs to happen is for them to keep fucking up the way they have and then doubling down.

I hate to say this, but nothing could get as big as Youtube unless they had resources equal to Google's, and even then they'd be hemorrhaging money. DivX was around pre-Youtube and did it better, but ended up closing shop because they couldn't afford the costs of supporting HD videos.
 

They don't have to be as big as Youtube to become a potential competitor.

They don't even need to be close. They just need to do a similar service better, and the rest will, given time, write itself. It's a matter of the fucking market - if someone does the service you're fucking up better, then you can't be surprised when people check out the alternative.

Bear in mind, Google's been in and out of hot water over the last year alone for advertising fraud, so if anything the problems endemic to the platform lately are probably significantly worse than Youtube is actually letting on. Given how much money they blew on boondoggles like Youtube Red, I'm not fucking surprised.
 

Some tech companies that were once giants have already fallen, in a manner of speaking. They're not completely dead, but they're mostly irrelevant compared to how powerful they once were. AOL was basically THE INTERNET back in the 90s for most people; they changed the way people got online. These days, people still use them for dial-up access (according to searches they still have 2 million subscribers), but their website looks like a ghost town.
Yahoo is not in good shape either. They're pretty far behind in actual search volume, and though they've acquired over 100 companies, none have made them much money.

Myspace and Friendster still exist. Friendster is just a social media platform for web-based games these days, and Myspace just sort of rots the same way GeoCities did before Yahoo killed it.

If Youtube/Google does die, it won't simply vanish one day. Instead, you'll see some other company slowly overtake one of its divisions. Maybe DuckDuckGo will become a preferred search engine over Google. Maybe Google will decide to further sanitize its search engine to the point where SafeSearch is no longer optional; instead it'll always be on with no way to turn it off. Either that, or they might make a few bad business decisions: buy a few companies that really aren't going anywhere, or perhaps just lose focus on what made them money. After all, Blockbuster wasn't killed by Netflix; Blockbuster killed itself by neglecting its video selection and focusing on being some kind of gas station that happened to sell movies and video games.

If Google does continue to dominate the internet through their services, I suspect the more controversial sites will end up on the "darknet" (unindexed sites) or some sort of alternate index, as opposed to the "clearnet", which will be reserved for business-related sites and other 'safe' sites.
 
Well, this is an interesting development: Apparently, the Trump Administration is preparing to regulate the shit out of Google, Twitter, and Facebook due to their being too big and too influential politically. According to the article, a lot of this has to do with all three platforms insisting that the RUSSIA CAUSED TRUMP angle was legitimate, including people within Google choosing to push this idiocy:

This week, the Alphabet Inc. unit disclosed new information that could further roil the regulatory picture: revelations that Russian-linked accounts used its advertising network to interfere with the 2016 presidential election. The news put Google in the company of Facebook Inc. and Twitter Inc., both of which are embroiled in the controversy surrounding Russia’s involvement in last year’s U.S. elections. Executives at all three companies are scrambling to respond.

...which comes directly after these same groups cooperated with the Anti-Defamation League to create a "cyberhate problem-solving lab."

 

Considering all three of those corporations could easily just move out of the country entirely (or, at this point, even form their own nuclear-defended micronation), I welcome the Armageddon that a bunch of fucknuggets who all suck would cause by fighting each other to the death.
 
Still, it's nice to know how long the salt shall pour because of this.
 