Google knows there’s a lot of extremist and hate-filled content on YouTube, and it’s now doing more to stop those videos from gaining traction. In a blog post yesterday, Google laid out four new steps it will take against extremist videos on YouTube, most of which expand on systems the company already has in place to identify, flag, demonetize, and essentially hide hate-filled videos.
The most nebulous of the four measures is the third listed in the blog post, which states that Google and YouTube will take a “tougher stance” on videos that don’t clearly violate YouTube’s policies. The blog post describes these videos as containing “inflammatory religious or supremacist content”; such videos may not fall under YouTube’s definition of hate speech, but they’ll now be targeted in a similar way.
“In the future these will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements,” writes Kent Walker, Google’s general counsel, in the post. “That means these videos will have less engagement and be harder to find. We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”