YouTube has unveiled a more rigorous plan to curb hate speech, extremist views, and false content on its platform amid mounting criticism over its handling of potentially harmful videos.
In a blog post published Wednesday, the firm said it will be taking further steps to eliminate videos that promote violence and extremism, such as Nazi glorification and white supremacy.
It will also remove hoax videos that deny well-documented tragedies, such as the Sandy Hook shooting and the Holocaust, ever took place.
The action is expected to result in the removal of thousands of channels and videos that violate its newly established policies around supremacist content.
However, YouTube didn’t name specific content or accounts that would be affected.
The Google-owned video sharing site says the new policy will go into effect today, though it could take several months for its systems to ‘fully ramp up.’
YouTube added that it expects to expand the categories covered by the policy ‘over the next several months.’
‘Today, we’re taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status,’ the company wrote in a post to its site.
‘This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory.
‘Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place,’ YouTube added.
The firm added that some videos could remain on the site if they concern topics such as pending legislation, ‘aim to condemn or expose hate, or provide analysis of current events.’
YouTube has increasingly come under fire for allowing extremist content and viral hoax videos to remain on its platform.
But Wednesday’s announcement marks YouTube’s most aggressive action yet to curb the spread of hate speech and misinformation on the site, which boasts more than 2 billion users.
Previously, YouTube joined a long list of tech giants in banning Infowars’ creator Alex Jones from its platform for posting content that violated its policies around hate speech and violence, as well as hoax content promoting Sandy Hook conspiracy theories.
It follows Facebook’s recent move to ban far-right and anti-Semitic leaders from its platform, including Alex Jones and his controversial site Infowars, right-wing personalities Milo Yiannopoulos, Paul Joseph Watson and Laura Loomer, as well as Nation of Islam leader Louis Farrakhan and Paul Nehlen, a white nationalist who ran for Congress in 2018.
Moving forward, YouTube said it’s also working to get a handle on content that comes close to violating its policies.
It’s partnering with lawmakers and experts to better manage content that doesn’t necessarily violate its policies, but could be used to spread ‘harmful misinformation.’
This builds on a feature YouTube announced in January, which aimed to stop the spread of borderline content, like videos promoting ‘a phony miracle cure for a serious illness, or claiming the earth is flat.’
YouTube said it intends to bring this system to more countries by the end of this year.
The firm added that, so far, the changes have resulted in fewer views of harmful or false content. The videos also haven’t appeared in YouTube’s recommendation section as often.
‘Our systems are also getting smarter about what types of videos should get this treatment, and we’ll be able to apply it to even more borderline videos moving forward,’ YouTube explained.
YouTube said it’s also tightening its monetization policies by suspending channels that ‘repeatedly brush up against our hate speech policies’ from being able to run ads on their videos.
It will also bar these channels from using other monetization features like SuperChat, which allows viewers to pay creators to have their messages highlighted during live streams.
A BuzzFeed News investigation recently revealed how SuperChat was being exploited by some users to promote hate speech and extremist ideas.
The move comes as YouTube on Wednesday faced criticism over its decision to leave up videos mocking Vox reporter Carlos Maza for being gay, even though the content appeared to violate its harassment policies.