Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation

Mick West

Staff member

How do you debunk conspiracy theories effectively? Based on what I've seen over the years, what works for people is a combination of three things: finding out that things they believed to be true were actually false, seeing people they used as sources of information give out wrong information, and being exposed to new information that gives them a more realistic context in which they can figure things out for themselves. Those are the things I focus on.

Some of this ad-hoc approach is validated by the study "Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation," published in Psychological Science.

As the researchers reported: “A detailed debunking message correlated positively with the debunking effect. Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.”

However, Albarracín said the analysis also showed that debunking is more effective – and misinformation is less persistent – when an audience develops an explanation for the corrected information.

“What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect,” she said.

For news outlets, involving an audience in correcting information could mean encouraging commentary, asking questions, or offering moderated reader chats – in short, mechanisms to promote thoughtful participation.

The researchers made three recommendations for debunking misinformation:

  • Reduce arguments that support misinformation: News accounts about misinformation should not inadvertently repeat or belabor “detailed thoughts in support of the misinformation.”
  • Engage audiences in scrutiny and counterarguing of information: Educational institutions should promote a state of healthy skepticism. When trying to correct misinformation, it is beneficial to have the audience involved in generating counterarguments.
  • Introduce new information as part of the debunking message: People are less likely to accept debunking when the initial message is just labeled as wrong rather than countered with new evidence.
Content from External Source

The paper itself goes into a little more detail on these points:

Recommendations for debunking
Our results have practical implications for editorial practices and public opinion.

Recommendation 1: reduce the generation of arguments in line with the misinformation. Our findings suggested that elaboration in line with the misinformation reduces the acceptance of the debunking message, which makes it difficult to eliminate false beliefs. Elaborating on the reasons for a particular event allows recipients to form a mental model that can later bias processing of new information and make undercutting the initial belief difficult (Hart et al., 2009). Therefore, the media and policymakers should report about an incident of misinformation (e.g., a retraction report) in ways that reduce detailed thoughts in support of the misinformation.

Recommendation 2: create conditions that facilitate scrutiny and counterarguing of misinformation. Our findings highlight the conclusion that counterarguing the misinformation enhances the power of corrective efforts. Therefore, public mechanisms and educational initiatives should induce a state of healthy skepticism. Furthermore, when retractions or corrections are issued, facilitating understanding and generation of detailed counterarguments should yield optimal acceptance of the debunking message.

Recommendation 3: correct misinformation with new detailed information but keep expectations low. The moderator analyses indicated that recipients of misinformation are less likely to accept the debunking messages when the countermessages simply label the misinformation as wrong than when they debunk the misinformation with new details (e.g., Thorson, 2013). A caveat is that the ultimate persistence of the misinformation depends on how it is initially perceived, and detailed debunking may not always function as expected.
Content from External Source
The research brings up the backfire effect, where simply debunking something can actually reinforce the false belief. This is a real problem, but it can be addressed by the overall strategy of providing new (true) information and gently encouraging people to "do their own research" in a more rigorous and reality-based framework.

"Keep expectations low" is a key point. Don't get frustrated and angry when a debunking fails to take. Give it time, let it sink in, and keep lines of communication open. Effective debunking requires time for people to come around to actual reality, and in the case of those deep down the rabbit hole this time may be considerable.

Sarah T.

New Member
I searched to see if his name has been mentioned already in these forums and was surprised to see it wasn't. James Atherton has written quite a bit (in a pretty heavy academic style) on the psychology of learning, especially the barriers to what he calls "supplantive learning": cases where the lesson is not merely going to fill a void, but where new information has to replace something that is already accepted. He describes a destabilization process that most people try hard to avoid, letting go of the old beliefs to make room for the new ones: Learning as Loss.

Supplantive learning in its most general and weakest sense entails a degree of loss of competence, and perhaps of confidence, because a previous skill (or item of knowledge) has to be abandoned or rejected, while the learner is still on the learning curve with the new understanding or skill. Represented graphically, the curve of the old skill and that of the new produce a "learning trough". Frustration at this loss of competence can itself be sufficient to abort the learning process.

Read more: Learning as Loss
Under Creative Commons License: Attribution Non-Commercial No Derivatives
Content from External Source

Critical Thinker

Senior Member.
From : The ‘Liar’s Dividend’ is dangerous for journalists. Here’s how to fight it.

Here’s the concept in a nutshell: debunking fake or manipulated material such as videos, audio recordings, or documents could ultimately stoke belief in the fakery. As a result, even after the fake is exposed, it will be harder for the public to trust any information on that particular topic.

This is a bigger problem than the Oxygen Theory, which argues that by debunking a falsehood, journalists give the claim a longer life. The Liar’s Dividend suggests that in addition to fueling the flames of falsehoods, the debunking efforts actually legitimize the debate over the veracity. This creates smoke and fans suspicions among at least some in the audience that there might well be something true about the claim. That’s the “dividend” paid to the perpetrator of the lie.

To dig back into ancient history for an example: in 2010, after robust reporting by almost every American news outlet had established Barack Obama’s Hawaiian birth as certain, the intense debunking could not erase the doubt in the minds of a significant segment of the American public. At that point, 25% of Americans still thought it was likely or probable that Obama was born overseas. Well under half, only 42% of poll respondents, believed the facts as they had been conclusively demonstrated: that Obama was certainly born in the U.S. Another 29% said they believed the president was probably born in the U.S. Certainly, political predisposition contributes to the existence of the Liar’s Dividend; in a polarized society, it can’t be minimized.

This is problematic for reporters and fact-checkers, and it buoys purveyors of misinformation. As NPR’s Media Correspondent David Folkenflik suggested at the symposium, “The idea is there’s just enough chum in water, it distracts people, nobody knows which to believe and they move on.”
Content from External Source

Open up the reporting process to expose the fakery
Last month, when fabricated reports about Democratic presidential hopeful Pete Buttigieg emerged, several news organizations were quick to reveal the phony tips they’d been receiving. Among them was the Daily Beast, which pulled back the curtain for its audience to reveal a surreptitious recording made by one source and its interactions with a college student dragged into the plot as an alleged accuser.

It was slightly reminiscent of The Washington Post’s 2017 story on the attempt by Project Veritas to trick The Post into reporting fake allegations against Senate candidate Roy Moore.

By showing that good reporting starts with skepticism and isn’t predisposed to believe a tip until it can be verified, the news organizations defused the fake news of both those stories, mitigating any Liar’s Dividend.
Content from External Source


Senior Member.
By showing that good reporting starts with skepticism and isn’t predisposed to believe a tip until it can verify it

pfft. like the kids from Coventry story? If newspapers would stop being so agenda-driven/biased and be more careful to mark opinion pieces as "opinion", then maybe the general public wouldn't distrust the source so much.

and what was that other doozy the Washington Post printed? PropOrNot.

Proper journalistic standards won't help the extreme believers of course, but they would help keep bunk from spreading among the more reasonable general public. Of course, from a business viewpoint I'm not sure any news outlet with proper journalism standards could survive nowadays in this "click bait driven" society.

edit add: I just read the full article, and everything I just said is what the article is actually saying. A good example of how you can misunderstand intent with quotes out of context.


Active Member
Having issues with the QUOTE function on my tablet, but trying to reply to @Dierdre post directly above.

Strongly agree that both flagging opinion pieces and practicing proper journalism would solve a lot of the problems. A couple of years ago, I was fed up with the amount of complete garbage in the news articles in my various feeds, both aggregated and direct from creators like MSNBC, CBS, FOX, etc., and decided to create a web crawler that would rate them by journalistic standards. My first step was completely manual.

I had a list of 6 sites (the top 6 from a web search for most popular news sites) that I visited at a specific time each day, reading and rating the first “N” articles on each site. They were scored by whether the article covered the basics: who, what, when, where, why, and how. The idea was to score one point for each “hit”, which would tell you how well they covered their topic. The scores were supposed to roll up to a site score for reliability.

After a week, I had to rethink the testing methodology because NOT ONE of the articles contained more than 3 or 4 of the basic, required elements that I learned in journalism class. The new categories added scores for things like: thinly veiled advertisements masquerading as news, stubs (an intro paragraph with a link to find out more), clickbait, person-bashing, content copied/repeated from another article, no links to sources of claims, and on and on and on.

The research stopped after the second week because none of my news sources had produced a single news article that met my primary journalistic rating criteria. In other words, not a single article met the criteria I had to meet when I wrote articles for the school newspaper ‘way back in 19~~. The project to identify adherence to journalistic integrity was pointless, and trying to identify clickbait through programmatic means was beyond my scope.

My conclusion is that, while there are sites with journalistic integrity, they are not those that are most popular. The click-bait approach to journalism has dramatically lowered the amount and quality of information in our online media. Correspondingly, countering the need for clicks with facts is a challenging problem. Metabunk and @Mick West's interview outreach are excellent. Google search results often drive people to this site, which makes every attempt to focus on facts. A concerted effort on the part of the Metabunk community to comment on online articles and link to Metabunk threads will surely, gradually, increase the knowledge base of the media, but it’s a long, arduous journey.

To tie this back to the topic, referral to, and referencing of, threads here is probably the best definition of “expectations low”. If this site, or any others like it, makes anyone think twice about bunk, then it has been worthwhile.