To fight the ‘infodemic’ of fake news that has raged during the Covid19 pandemic and endangered lives, YouTube recently announced it will remove all videos related to coronavirus vaccines that contradict statements from respected, legitimate health institutions, like the World Health Organization or the U.K.’s National Health Service. Specifically, videos that allege vaccines kill people, that vaccines insert microchips in human bodies, or that vaccines cause infertility, along with other false claims, will be removed.
“A Covid19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a Covid19 vaccine from the platform,” a YouTube spokesperson told the BBC. The move also counters the tide of anti-vaccine sentiment that has risen during the pandemic and has scientists worried about people’s safety even after a safe, effective, and accessible vaccine becomes available.
YouTube’s move comes on the heels of Facebook reiterating its commitment to tackling vaccine misinformation in ads on its platform, a goal it has pursued for the past year and, arguably, failed to achieve. Facebook’s latest announcement, however, leaves a loophole that permits ads advocating against vaccines as long as they carry a political message. Ads “for or against legislation or government policies around vaccines, including a Covid19 vaccine, are still allowed,” the company said, a caveat that has drawn criticism.
This raises the issue of how Covid19 misinformation, beyond vaccine-related fake news, is packaged. According to Facebook, any political ads discouraging people from getting vaccinated, or claiming vaccines shouldn’t be mandatory, will be allowed to remain on the platform, even if they contain misinformation. This could push anti-vaxxers to couch false claims in political messaging, essentially getting the same message across: that vaccines are dangerous.
Both YouTube’s and Facebook’s recent announcements signal a step in the right direction, but neither is without issues. The Covid19 pandemic has not just revealed flaws in the way social media platforms handle misinformation; it has also shed light on how medical institutions around the world define misinformation, with organizations such as the WHO having to backtrack on several Covid19 claims as study after study reveals differing, often contradictory, nuances in how the novel coronavirus operates. In light of this uncertainty, YouTube’s earlier pandemic policy of removing videos with “medically unsubstantiated claims” also proves thorny, with Covid19 claims being proven and disproven week after week.
In the past, platforms like Facebook have been hesitant to enforce aggressive censorship, even of false or misleading claims, out of concern that bans would alienate groups such as anti-vaxxers and push them further into their own beliefs. As a way around this problem, social media platforms have introduced measures to provide corrections, warn users against posting misleading information, and direct them to legitimate news websites, leaning into a corrective, rather than a censorial, approach.
These new policies do, however, signal an acknowledgment of the infodemic. Which policies will actually work in the long term remains to be seen.