At least one in four of the most popular English-language YouTube videos about Covid19 contained misleading content, according to a study published in BMJ Global Health. The researchers defined misinformation as any video containing false information about the transmission, symptoms, prevention, or treatment of Covid19.
For the study, researchers from the University of Ottawa analyzed 69 of the most-viewed English-language videos and found that 19 of them, with a combined total of more than 62 million views, contained non-factual information. They found videos by government agencies to be the most accurate; however, these garnered relatively few views.
Among the 19 misleading videos, around a third came from entertainment news sources and the rest from network and internet news sources. They contained inaccurate information such as claims that pharmaceutical companies already have a cure but refuse to sell it, inappropriate prevention techniques, racist and discriminatory remarks, and conspiracy theories.
“This is particularly alarming, when considering the immense viewership of these videos,” the researchers wrote in the paper.
Given YouTube’s size and continued growth, the researchers also said misinformation about Covid19 has reached more people than it did during previous health crises such as the H1N1 and Ebola outbreaks. “Evidently, while the power of social media lies in the sheer volume and diversity of information being generated and spread, it has significant potential for harm,” they added.
In mid-February, Tedros Adhanom Ghebreyesus, director-general of the World Health Organization, said, “We’re not just fighting an epidemic; we’re fighting an infodemic,” referring to fake news that likely spreads “faster and more easily than this virus.”
In response to this ‘infodemic,’ the WHO created a team that works with media companies including Facebook, Google, Pinterest, Tencent, Twitter, TikTok, and YouTube to counter the spread of rumors, The Lancet reported.
Earlier in April, YouTube announced it would reduce the amount of content spreading conspiracy theories and remove videos that breach its policies. It also said content based on conspiracy theories that doesn’t mention the coronavirus would still be allowed on the site, The Guardian reported. Such videos, the platform said, may be considered “borderline content” and face suppression such as loss of advertising revenue.
These platforms may bear the responsibility of doing the best they can to fight misinformation, but the onus also lies with social media users themselves to consume critically and share carefully.