‘Pre‑bunking’ Fake News on YouTube Helped Curb Misinformation, Study Shows
Picture this: 90-second cartoon clips, with snippets from The Simpsons and Star Wars, describing in detail how truth and news are manipulated online. All of it is presented in simple, non-partisan language to educate people about how frequently, and in what ways, they are misinformed. When played as advertisements on YouTube (a platform that remains a major “conduit of fake news”) right before someone watched a video, these cartoon clips helped reduce people’s susceptibility to misinformation in the wilderness of the internet.
This is a form of “pre-bunking,” which has the potential to revolutionize how we digest and process information online. Essentially, people are told what to watch out for, pre-emptively debunking misinformation before they encounter it.
The peer-reviewed study was published in the journal Science Advances. It responds to the “post-truth” age, characterized by an alarming rise in misinformation and disinformation campaigns, a growing distrust of narratives, and a general disdain for truth and facts. The researchers, a group of psychologists at the universities of Cambridge and Bristol, likened the mechanism of pre-bunking to vaccinating people against misinformation. The videos work as “micro-doses” that inform people about manipulation — an immunization technique that prevents them from succumbing to falsehoods in the future. Pre-bunking misinformation can be a way to rebuild media literacy, a prospective armor for people against the misinformation they see on the internet. This is also what’s called the “inoculation theory” — the psychological idea that people’s beliefs can be protected against manipulative influence in the same way a body builds resilience to disease by being exposed to threats.
Think of pre-bunking as a social psychology tool. “We can in a very apolitical way help people gain resistance to manipulation online,” said Beth Goldberg, one of the co-authors of the study.
In their experiment, the team of psychologists and experts put together five short educational animations for YouTube. They are available globally on the project website, Inoculation Science. One of them, about ad hominem attacks (when someone attacks the person making the argument rather than the argument itself), illustrates the point with an example from The Simpsons. There are others about scapegoating, incoherence, false dichotomies, and emotional language — effectively introducing people to an entire genre of manipulation, purely with the purpose of priming them about how these tactics play out. “Propaganda, lies, and misdirections are nearly always created from the same playbook. We developed the videos by analyzing the rhetoric of demagogues, who deal in scapegoating and false dichotomies,” said co-author Prof Stephan Lewandowsky from the University of Bristol.
The researchers played the video as YouTube ads to about 30,000 participants. When asked the next day to identify the manipulation technique, people’s ability to recognize the form of misinformation increased by about 5% on average. They grew more conscious of misinformation in general.
“Harmful misinformation takes many forms, but the manipulative tactics and narratives are often repeated and can therefore be predicted,” said Goldberg. “Teaching people about techniques like ad hominem attacks that set out to manipulate them can help build resilience to believing and spreading misinformation in the future.”
Overall, there was the promise of reducing people’s vulnerability to believing misleading claims. “The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types. This is the basis of a general inoculation against misinformation,” lead author Jon Roozenbeek from the University of Cambridge said.
Part of the reason the videos work is the information they present is “source-agnostic.” “Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated,” added Roozenbeek. Moreover, none of them used any real-life figures, only fictional characters illustrating the point.
So far, the crisis of truth has shaped debunking as the standard tactic against misinformation. While debunking remains effective, some research shows that correcting misinformation in retrospect does not always undo the damage done. Hearing a false claim and then its correction still results in people internalizing the claim, what is called the “continued influence effect.” For instance, several reports have debunked the idea that going out with wet hair causes a cold. Yet, once heard, people tend to internalize it and shape their thoughts accordingly — irrespective of how thoroughly these falsehoods are disproven later. Research also shows that false news spreads six times faster than the truth.
Pre-bunking misinformation is not entirely an antithesis to debunking it though. Both work to help people recognize false claims and even become averse to the entirety of the playbook of misinformation. If debunking is akin to playing a game of whack-a-mole, pre-bunking is about unplugging the machine itself.
This is arguably not an entirely novel tactic in the fight against misinformation. Inoculation videos are similar to other brands of pre-emptive fact-checking: Meta and Snapchat previously put voting resources online before the 2020 elections, and most efforts to address Covid misinformation online have taken the aid of pre-bunking.
Studies in the past, however, have shown people’s willingness to excuse lies that may eventually come true. Research has also illustrated how easily people ignore deliberate lies and falsehoods out of religious or ideological partisanship.
Naturally, then, there remain some limitations to the present study and to the larger psychological tactic. For one, while the approach may be scalable, the durability of the “inoculation” is unclear. How long will someone remember that a “false dichotomy” is a manipulation technique, and how reliably will they identify it?
Two, experts have previously expressed concern that pre-bunking may sow the seeds of false beliefs or misinformation even before people are exposed to them. Fact-checking is itself politicized: research has shown that when people try to prove the falsity of a belief, others may become more rigid in their stance, however incorrect it may be. Despite efforts to inform people about vaccine myths, there remains a robust community of anti-vaxxers online.
Three, it might not be wholly effective against far-right influencers who thrive on disinformation, according to Shannon McGregor, a senior researcher in communication at the University of North Carolina, Chapel Hill, who was not involved in the study.
“In the end, the authors propose that those worried about disinformation on social media (including YouTube) should spend more money on those platforms to run ads to protect against disinformation. In many ways, that is wholly unsatisfying for basically all stakeholders except the platforms,” McGregor noted. The present study was funded by Jigsaw, a Google subsidiary that studies misinformation.
Arguably, all of this goes toward building a “toolbox” for fighting misinformation. Pre-bunking could be a torch, debunking a screwdriver. There are many ways to understand how lies spread, and knowing even one of them leaves people the wiser. It’s the same advice the pre-bunking video on false dichotomies, featuring a Star Wars clip, offers: “Only a Sith deals in absolutes.”