In a statement issued on Friday, YouTube told BuzzFeed that it would stop showing advertisements on channels that promote anti-vaccination content, saying such content clearly violates the company's policy prohibiting "the monetization of videos with dangerous and harmful content". Earlier this week, BuzzFeed reported on a similar issue, prompting many companies to pull their advertisements from misinformative or inappropriate videos, or from YouTube altogether.
Previously, YouTube had also been recommending vaccine conspiracy content to viewers watching legitimate health-related videos. Although information panels did appear on anti-vaccination videos, they provided only basic information about the diseases vaccines prevent, with links to Wikipedia for further details. To address this, YouTube has promised to alter its recommendation algorithm to prevent such suggestions. It has also begun displaying a new, more informative panel about vaccines on health videos, as well as a more specific information panel about "vaccine hesitancy", linked to a Wikipedia page, which appears under most of the platform's anti-vaccination videos advocating false medical information.
Recently, many of the biggest social media companies have come under fire for promoting false news and inaccurate information. Following a report by The Guardian that anti-vaccination supporters spread misinformation on Facebook via private groups, Rep. Adam Schiff wrote to YouTube's owner, Google, and to Facebook, questioning them about their policies on anti-vaccination theories and the spread of inaccurate medical information on their platforms.
Perhaps the importance of controlling the flow of misinformation became most evident after the 2016 US elections, when the Facebook–Cambridge Analytica scandal came to light. Cambridge Analytica allegedly influenced the outcome of the election, and the state of Washington subsequently sued Facebook and Google. After two long days of intense grilling before the US Congress, Facebook also faced investigations by the Federal Trade Commission, the EU, and the British government.
Since then, many media companies have taken steps to prevent foreign influence in domestic political and social issues through the circulation of misinformation and fake news. Facebook-owned messaging application WhatsApp, for example, set a limit on message forwarding. Pinterest, an image-based social media platform, blocked the word "vaccines" from its search engine so that the query returns no results at all.