YouTube says it will “remove content denying that well-documented violent events took place.” Content promoting fascism, supremacism or Holocaust denial will no longer be hosted.
In what is being called a significant change to its policies on hate speech and misleading content, YouTube announced that it will ban Holocaust-denial videos and other content denying well-documented events.
According to The Guardian, YouTube has decided to ban content that promotes Nazi ideology. The company also confirmed that it would no longer host videos that glorified fascist views. This move reportedly follows years of criticism over its role in spreading hate and conspiracy theories.
This @Gizmodo editor's headline game is 💯. I simply can't wait for the inevitable tweets from @TeamYouTube explaining how various videos aren't promoting Nazi views but merely "debating" them! 🙄 #NoPrideInYT https://t.co/wlSUE1U1El— Ashleigh R (@Technicalleigh) June 5, 2019
The video-sharing website, which is owned by Google, said on Wednesday in an official blog post it would ban any videos “alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status”.
The company noted that “it will take time for our systems to fully ramp up.”
YouTube bans videos promoting Nazi ideology https://t.co/VFUt1gw90C— Alexander Hili (@alex_hili_) June 6, 2019
YouTube said its new policy would aim to “prevent our platform from being used to incite hatred, harassment, discrimination, and violence.”
YouTube will also begin to change the videos it recommends alongside "borderline content" that does not violate the company's policies.
#YouTube bans videos promoting Nazi ideology | #Technology | Content promoting fascism, supremacism or Holocaust denial will no longer be hosted https://t.co/N7VWRhPxxr #SocialMedia #HateSpeech pic.twitter.com/YK3EhgHVSj— George Roussos (@baphometx) June 5, 2019
"In January, we piloted an update of our systems in the U.S. to limit recommendations of borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness, or claiming the earth is flat," the company wrote. "We’re looking to bring this updated system to more countries by the end of 2019."
So, I have pretty thick skin when it comes to online harassment, but something has been really bothering me.— Carlos Maza (@gaywonk) May 31, 2019
Much of the criticism has been aimed at YouTube’s algorithm-driven recommendation system, which helps keep people on the site by suggesting new videos they might be interested in. Critics have said the system leads viewers to ever more extreme and conspiratorial videos, which in turn encourages creators to produce similar content in a bid to drive up views and ad revenue.
(2/4) Our teams spent the last few days conducting an in-depth review of the videos flagged to us, and while we found language that was clearly hurtful, the videos as posted don’t violate our policies. We’ve included more info below to explain this decision:— TeamYouTube (@TeamYouTube) June 4, 2019
On Tuesday, however, YouTube said videos mocking Carlos Maza, a video producer for the website Vox, for his sexual orientation did not violate its policies. A series of videos posted by a right-wing commentator included calling Maza "an angry little queer."
YouTube to "prohibit videos “alleging that a group is superior in order to justify discrimination, segregation or exclusion”, including videos promoting or glorifying Nazi ideology"— Rasmus Kleis Nielsen (@rasmus_kleis) June 6, 2019
Wonder how a ban on such supremacist content will be enforced in, e.g., India?
YouTube's ban on supremacist content comes a few months after Facebook said it was banning white nationalist content from its platform. That ban came two weeks after the suspect in the terror attack at two New Zealand mosques streamed part of the massacre live on the platform.