Pete Tar
Senior Member.
This is a little depressing and makes this site's purpose seem like a doomed enterprise.
Two articles discussing the 'backfire effect' and 'motivated reasoning'.
And a longer article.
http://www.motherjones.com/blue-marble/2014/03/brendan-nyhan-backfire-effects-facts
...
presenting people with information confirming the safety of vaccines triggered a "backfire effect," in which people who already distrusted vaccines actually became less likely to say they would vaccinate their kids.
...
The study found that conservatives who read the correction were twice as likely to believe Bush's claim was true as were conservatives who did not read the correction.
...
Among survey respondents who were very pro-Palin and who had a high level of political knowledge, the correction actually made them more likely to wrongly embrace the false "death panels" theory.
...
Once again, the correction—uttered in this case by the president himself—often backfired in the study, making belief in the falsehood that Obama is a Muslim worse among certain study participants.
...
Despite these facts, only 1 out of 49 partisans changed his or her mind after the factual correction. Forty-one of the partisans "deflected" the information in a variety of ways, and seven actually denied holding the belief in the first place (although they clearly had).
...
but when Republicans read the article about the more distant farmers, their support for action on climate change decreased, a pattern that was stronger as their Republican partisanship increased.
...
Together, all of these studies support the theory of "motivated reasoning": The idea that our prior beliefs, commitments, and emotions drive our responses to new information, such that when we are faced with facts that deeply challenge these commitments, we fight back against them to defend our identities.
http://www.motherjones.com/politics/2011/03/denial-science-chris-mooney
What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. "They retrieve thoughts that are consistent with their previous beliefs," says Taber, "and that will lead them to build an argument and challenge what they're hearing."
In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers (PDF). Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.