Study: When Debunking Scientific Myths Fails (and When It Does Not)

mrfintoil

Senior Member.
But bursting mythical bubbles can also backfire. The first problem is that people are easily persuaded by things they hear more often. “The mere repetition of a myth leads people to believe it to be more true,”
[...]
unfortunately, our brains don’t remember myths in a very helpful way. “There’s a lot of research that tells us people have a hard time remembering negations,” says Stephan Lewandowsky, a cognitive scientist at the University of Bristol in England. We remember myths not as myths, but rather as statements that are additionally tagged as “false.” So instead of remembering “cheese is nothing like crack,” our brains remember “cheese is like crack (false).” As our memories fade, the qualifier on the statement may fade too
[...]
Peter says the results suggest that when presenting readers with new information, “try to avoid repeating false information,” since that may be what remains in people’s minds. And in some situations, Peter says, asking readers for their opinion or getting them to form an opinion as they read might help them distinguish between what is truth and what is myth.
Content from External Source


Link to paper (paywalled, unfortunately):
http://scx.sagepub.com/content/early/2015/10/23/1075547015613523.abstract

Paper summarized here:
https://www.sciencenews.org/blog/scicurious/sometimes-busting-myths-can-backfire
 
Interesting. This would suggest it is very important (on a one-on-one basis) to get people to express some opinion on your debunking before moving on to the next point.

And if agreement is very far out of reach, then one should try to avoid the backfire effect by taking baby steps: start with something fairly close to common ground, and emphasize that accepting this idea will not invalidate their world view.

Example: "While we can't prove that all trails are just contrails, people were discussing contrails spreading out and covering the sky all the way back to the 1940s".
 
We remember myths not as myths, but rather as statements that are additionally tagged as “false.” So instead of remembering “cheese is nothing like crack,” our brains remember “cheese is like crack (false).” As our memories fade, the qualifier on the statement may fade too
Does that mean that making thread titles like "Debunked: something" may be counterproductive?
Or maybe "Debunked: something" is fine because it starts with the word "debunked", but "Something (debunked)" would be counterproductive?
 
Does that mean that making thread titles like "Debunked: something" may be counterproductive?
Or maybe "Debunked: something" is fine because it starts with the word "debunked", but "Something (debunked)" would be counterproductive?

Well, according to the study "Debunked: something" could be problematic:

instead of remembering “cheese is nothing like crack,” our brains remember “cheese is like crack (false).” As our memories fade, the qualifier on the statement may fade too
Content from External Source
But there is a lot more to debunking than thread titles. I don't think what the title says affects believers to any significant degree. It's also important to note that the study is about scientific myths, not conspiracy theories. Conspiracy believers don't keep believing because their memory fades; I think that part of the study applies more to the general public and to conventional misconceptions. You only use 10% of your brain. Carrots make you see better in the dark. Those kinds of things.

When it comes to conspiracy believers, it is, in my opinion, the clustering of illusory evidence and the underlying psychological aspects of the conspiracy mindset that trap believers.

Regarding the psychological aspects, this study shows a correlation between low self-esteem and a tendency toward conspiracy thinking:

- Does Self-Love or Self-Hate Predict Conspiracy Beliefs? Narcissism, Self-Esteem, and the Endorsement of Conspiracy Theories
http://spp.sagepub.com/content/7/2/157.abstract

My own observations reach the same conclusion: people with low self-esteem, or with narcissistic traits, are drawn to conspiracy theories because the theories provide feelings that both personality types need. Belief in conspiracy theories gives a sense of overview and control, of being part of the heroic minority that fights against evil while the dumb masses just bum around. Both feelings appeal to those with low self-esteem and to narcissists. To accept that the conspiracy theories are false, these individuals must also abandon the positive feelings that come with these ideas; thus they have developed an emotional dependency on what they believe.

There is also the problem of what I call the clustering of illusory evidence. Few believers hold just one conspiracy theory; very commonly, belief in one means belief in many. Conspiracy thinkers always have some other claim or theory to fall back on if the first one is rebutted. Because their understanding of each claim is often superficial, it's more about quantity than quality, so to speak. But superficial understanding can give bad claims the appearance of evidence, especially when the claims come in large numbers.

Did you know eyewitnesses heard bombs at the World Trade Center? That $2.3 trillion went missing from the Pentagon? Larry Silverstein said "pull it". They found nanothermite in the rubble. No plane hit the Pentagon. Israeli spies were dancing as the towers fell. Jet fuel can't melt steel beams. And the list goes on.

And that is just one topic. Add "chemtrails", the global warming hoax, false flags, GMO and vaccine dangers, secret societies ruling over us. I understand why conspiracy thinkers have such a hard time quitting their mindset.
 