In a blog post today, Neal Mohan, YouTube's Chief Product Officer, laid out the platform's approach to tackling misinformation. He noted that misinformation is prevalent in "every facet of society, sometimes tearing through communities with blistering speed."
He stated that the platform actively takes down content that violates its community guidelines, revealing that roughly ten million videos are removed each quarter and that most of them never reach ten views.
Mohan said that out of the billions of videos on YouTube, only about 0.16 to 0.18 percent violate its policies. All videos that encourage or could cause real-world harm are removed. Since the outbreak of COVID-19, YouTube has removed more than one million videos related to the topic, mostly ones promoting false cures or claims that the pandemic is a hoax.
To clearly identify bad content, YouTube relies on an established set of facts, which it gets from health organizations such as the CDC and WHO in the case of the coronavirus. In other cases, misinformation is less clear-cut, as there is often no such authoritative source to establish what is correct. Mohan remarked that his conviction is not to remove such content.
Mohan also referred to the 2020 U.S. presidential election: YouTube let videos making widespread-fraud claims stay up until states certified their election results, while recommending the content it deemed most trustworthy to viewers. Since then, thousands of videos violating election-specific policies have been taken down, over 77 percent of them before they reached 100 views.