Changing Conspiracy Beliefs through Rationality and Ridiculing

skephu

A recent paper suggests (somewhat unexpectedly) that rational arguments and ridiculing can be effective against conspiracy beliefs, but empathetic arguments are not.

Orosz, G., Krekó, P., Paskuj, B., Tóth-Király, I., Bőthe, B., & Roland-Lévy, C. (2016).
Changing Conspiracy Beliefs through Rationality and Ridiculing.
Frontiers in Psychology, 7, 1525.
http://journal.frontiersin.org/article/10.3389/fpsyg.2016.01525/full

Abstract:
Conspiracy theory (CT) beliefs can be harmful. How is it possible to reduce them effectively? Three reduction strategies were tested in an online experiment using general and well-known CT beliefs on a comprehensive randomly assigned Hungarian sample (N = 813): exposing rational counter CT arguments, ridiculing those who hold CT beliefs, and empathizing with the targets of CT beliefs. Several relevant individual differences were measured. Rational and ridiculing arguments were effective in reducing CT, whereas empathizing with the targets of CTs had no effect. Individual differences played no role in CT reduction, but the perceived intelligence and competence of the individual who conveyed the CT belief-reduction information contributed to the success of the CT belief reduction. Rational arguments targeting the link between the object of belief and its characteristics appear to be an effective tool in fighting conspiracy theory beliefs.
Content from External Source
 

Attachments

  • Changing Conspiracy Beliefs through Rationality and Ridiculing - data sheet 2.pdf
    114.7 KB
The three different approaches:

In the rational condition, the text tackled the claims made in the first recording in a logically plausible manner, using numbers to support the objections, and pointing out the discrepancy between high influence and concealment. This speech pointed out the logical flaws of the first speech and corrected it with in-depth arguments regarding the link between the beliefs' objects and attributes. The goal of this condition was to emphasize the logical inconsistencies and to create a more complex and coherent relationship between the objects of the belief and the attributes.

In the ridiculing condition, the script addressed the same logical flaws, but reasoned against them differently: instead of focusing on certain details, it derided the logical inconsistencies and concentrated on those who believe in the CTs, picturing them as evidently ridiculous (e.g., mentioning the believers of Lizard Men). This text intended to increase the distance between the respondents' self and those who believe in CTs.

The empathetic condition contested the original text's claim in a different manner: instead of focusing on content or those who believe in the content, it placed the objects of the CTs in the center, and compassionately called attention to the dangers of demonizing and scapegoating, while also pointing out the human character of the CT objects (i.e., Jews face similar conspiracy theories and persecution nowadays that the Early Christians faced). This condition intended to reduce the distance between the respondent and the objects of CTs and to raise empathy toward these groups.
Content from External Source
Ridiculing works when a person is not fully into the theory. People prefer not to be part of a group they know is considered ridiculous.

But I think the important thing here is that what is really working (in both the "rational" and "ridiculing" conditions) is pointing out the logical flaws and inconsistencies in the conspiracy theories (i.e. debunking). Ridicule can work for some people, but it can also backfire, and there's no suggestion that it's necessary.
 
Somewhat interesting, if very limited. Weird that the abstract says "N = 813" when in actuality N = 709 (out of 20,000 candidates).

709 participants (51.1% female), the average age was 46.43 (SD = 14.74),
Content from External Source
709 divided by 4 is about 177 people per group.
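As a rough check, assuming the 709 participants were split roughly evenly across the four conditions (rational, ridiculing, empathetic, control) - the paper may report slightly unequal group sizes:

\[ 709 \div 4 = 177.25 \approx 177 \ \text{participants per condition.} \]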

And I think it's important to note that the empathetic "objects" were very large (faceless) groups of people:

Jews, European Union, global financial system, or bankers, etc.), which is related to a given attribute (exploitative, hidden, and manipulative, etc.)
Content from External Source
I guess I will have to, at some point, find and listen to the presented "CT" and the empathetic challenge, to find out how you empathize with bankers :)

The present study is not without limitations. The effect sizes were not large. However, measuring the effectiveness of different reasoning or convincing strategies is not easy. In the present study, the number of arguments was balanced, but the length of the audio recordings was different in the different conditions. <<They listened to another speech with, either rational (3:36 min), ridiculing (3:28 min), or empathetic (2:54 min) arguments against CTs. In the control condition (3:15 min), they listened to a weather forecast.>>

Further studies should balance the number of arguments, their length and pretest the effectiveness of each argument. Needless to say that it is a time consuming task. If we consider the present study as an intervention, it can first be said that this is not a wise one, as direct and confronting strategies were used to convince individuals regarding CT reduction.

Second, this experiment did not have the very solid theoretical background that a good intervention requires. Third, this study only measured the short-term effects of different CT reduction strategies.

Fourth, it targeted a general population instead of a specific subgroup of individuals. Fifth, the timing of the experiment was not related to a big CT-related scandal, which could have influenced the effectiveness of the conditions.
Content from External Source
 
Ridiculing works when a person is not fully into the theory. People prefer not to be part of a group they know is considered ridiculous.

But I think the important thing here is that what is really working (in both the "rational" and "ridiculing" conditions) is pointing out the logical flaws and inconsistencies in the conspiracy theories (i.e. debunking). Ridicule can work for some people, but it can also backfire, and there's no suggestion that it's necessary.

I suspect, and I obviously have little evidence either way, that you are right with the ridiculing theory when you point out it can backfire - and the target simply "doubles down"

But often when you are actively engaged in an online discussion with a person who is committed enough to a viewpoint to actually post on it, they are already quite a long way down the road anyway - and would hence meet your "fully into the theory" condition.

I think the ridiculing approach is probably much more effective on the "lurkers"

When rebutting/debunking some well-worn claim or meme on a blog, in a way that can be considered "ridicule", I often point out that I am responding to the "lurkers" and not really the original poster; it is a tacit recognition of the above.
 
I think the ridiculing approach is probably much more effective on the "lurkers"
The other problem with ridicule is you never really know if the person or lurker has actually changed their views or if they just say they have. Basically it's "thought-shaming" (which in some situations is definitely called for, but I don't think it actually changes anyone's mind).

I think most of us have experienced situations where, depending on the crowd present, we either don't bring up, or don't engage if someone else brings up, a topic in which our input will cause 'negativity' either in the form of argument, hurt feelings or [for those sensitive to such things] ridicule to ourselves.

And basically, if the goal of ridicule is to change thought patterns, it is bullying sensitive people into changing their behavior/thoughts. Which is counterproductive in every way I can think of, especially with the type of CT described in this experiment. I think hurting a budding CTer's (or troll's) self-esteem would only exacerbate their feelings of powerlessness, which in turn would make them more paranoid and 'cranky' in the long run.

Another study found that human subjects who had been mistreated by their peers had abnormalities in the corpus callosum (involved in visual processing and memory). Their neurons were shown to have less myelin (a coating that speeds up communication between the cells). In an organ where milliseconds matter, cell communication is crucial, and these results indicate that the brains of bully victims lack normal brain cell activity.
https://nobullying.com/mocking/
Content from External Source
 


Yes, to a large extent I agree, and would not use it when there are two valid(ish) opinions, as in politics, economics or social policy (obviously mine are right!!! - but when they conflict with my wife's, she is :))

as it can come across as bullying and "thought police-ish".

I only use it when people are peddling absolute cr4p - cr4p that is demonstrably cr4p given a moment's actual thought.
 
In Hungary, the chemtrail theory became known about 4-5 years ago. In the beginning, it was popular on social media, and about 300 people attended a public protest. Then there were a number of smaller protests as well. But a group making fun of the chemtrail believers also formed at the same time, and they always showed up at the chemtrail protests in much larger numbers than the chemmies. Also, a number of articles appeared on the most popular news sites, also ridiculing the chemtrail theory. By now, it has become so embarrassing to be a chemtrail believer in Hungary that the believers are limited to a few small and mostly inactive social media groups with only a handful of hardcore believers. There hasn't been a single protest in several years now because nobody is interested. I think the knowledge that this is a ridiculous thing prevents many people from becoming a chemtrail believer.
 
I personally think ___-shaming works to some extent. But it is frowned upon in my country anyway. There would very likely be a lot fewer knocked-up teenagers if shaming were reintroduced en masse into the culture.

But I'm pretty sure it leaves lasting scars for those who can't live up to the expectations, for whatever reason.

This article linked below is somewhat interesting since it applies to the specific experiment in this study. Unfortunately I know nothing about Hungary or its politics, ergo not sure what it all means. (I also can't read the graphics in the survey.) @skephu, have you seen this?

June 2016: Although in Publicus Intézet's assessment, the typical paranoid who believes in conspiracy theories is someone with a low level of educational attainment who votes for either Jobbik or Fidesz, paranoid impulses are widespread in Hungarian society.

....

The above group thinks that Hungary is being run by domestic forces, just not the government. There is a second group that accuses foreigners of interference in Hungary’s internal affairs. Thirteen percent are convinced that the country is actually run by international financial circles (13%) which may be a code name for Jewish financiers and businessmen. Six percent believe that the strings are in the hands of the European Union while 4% blame the United States. Specific references to Jews were low (2%).

....
Finally, on another level, 25% of the population believe in the deliberate spraying of people with poisonous materials (chemtrails).
http://hungarianspectrum.org/2016/06/19/conspiracy-theories-in-hungary/
Content from External Source
 
Finally, on another level, 25% of the population believe in the deliberate spraying of people with poisonous materials (chemtrails).
Yes, I read that. It was quite surprising considering the fact that chemmies have practically zero presence on social media now. Anyway, this 25% does not mean active believers. People were asked to indicate whether they would rather agree or disagree with a number of statements including "people are deliberately sprayed with poisons from airplanes". Apparently 25% answered that they would rather agree, but that doesn't mean they actively pursue that belief, it's just an opinion on a statement that many of them may have heard for the first time. There are no organized believers.
 
I think the knowledge that this is a ridiculous thing prevents many people from becoming a chemtrail believer.

It probably does. There might be a cultural factor here, as well as other confounding factors - the study was of Hungarian people, average age 46, with above-average education, who trusted the scientific establishment enough to let it test them.

I think the most useful takeaway here is that debunking works. In both the "rational" and "ridicule" methods, what they did was point out the logical and factual inconsistencies of the CT. In the "rational" method they pointed out the logical flaw and then provided a more detailed explanation of events; in "ridicule" they pointed out the logical flaws and then emphasized how silly it was to believe a theory with such flaws.


In the rational condition, the text tackled the claims made in the first recording in a logically plausible manner, using numbers to support the objections, and pointing out the discrepancy between high influence and concealment. This speech pointed out the logical flaws of the first speech and corrected it with in-depth arguments regarding the link between the beliefs' objects and attributes. The goal of this condition was to emphasize the logical inconsistencies and to create a more complex and coherent relationship between the objects of the belief and the attributes.

In the ridiculing condition, the script addressed the same logical flaws, but reasoned against them differently: instead of focusing on certain details, it derided the logical inconsistencies and concentrated on those who believe in the CTs, picturing them as evidently ridiculous (e.g., mentioning the believers of Lizard Men). This text intended to increase the distance between the respondents' self and those who believe in CTs.
Content from External Source
The control group listened to the weather forecast. A much more useful control condition here would be simply to do what is common between the "rational" and "ridicule" - i.e. address the same logical flaws (and then listen to the weather forecast). It's possible that the simple addressing of the flaws is all that is needed to get this immediate effect.

And there's a problem with "immediate effect". This study measured only what happens immediately after hearing the rebuttal speeches. It says nothing about what happened a week later.
 
I've attached the "speeches" to the OP, as it was a little fiddly to find on the web site:
https://www.metabunk.org/attachment...nality-and-ridiculing-data-sheet-2-pdf.23403/

Opening paragraphs of the initial argument and the two relevant rebuttal arguments:

The operation of the world economy

In the second half of the 19th century, the frequently recurring crises were induced by the hidden international financial cartel and the interest-group that controlled the finances from the shadows. Hungary, cornered into the trap of debt, became the province of the only superpower, the global financial empire. The financial technocrats of the one-party state, the current servants of the global financial empire, set up and operate the financial pumps for the global elite. As a result, the Hungarian national revenue is flowing out of the country.

One of the main reasons behind the deficit of the Hungarian budget and the external trade is that the EU is continuously sacking Hungary instead of supporting it. The EU is the institute of the organized private power for the cheap and voluntary elimination of the nation-states standing in the way of the super-bankers. The EU became a redundant and overbureaucratized water-head. The advantages—the freer movement of manpower and products, the reduction of duty, closer cooperation among the states of Europe—could have been achieved more cheaply and more effectively without it. We should not have given up our national sovereignty, autonomy, the separate Hungarian jurisdiction. The EU is unnatural, because it is not a community like a nation or a family which is essential for the biological and societal reproduction. These two families are the necessary requisites of life. However, the EU is a parasitic formation without any function.
Content from External Source

Rational condition

The text is not logical and consistent on multiple occasions. First of all, the framing regarding the European Union is factually wrong: Hungary is one of the main beneficiaries of the EU membership. In 2013, the EU balance of Hungary was 4956.9 million euros that is Hungary received much more funds from the EU than the amount it deposited. After Poland and Greece, Hungary had the best deal nominally. When the balance is measured in proportion to the Gross National Income (GNI), the support of our country is even more outstanding. In 2013, the balance added up to 5.33% of the GNI; it is the highest among the 28 member states. If we logically think through that the EU is giving almost 5 billion euros each year to Hungary for development, it is hard to conceive of this as the “cheap and voluntary” elimination of a nation-state.

No direct or indirect proof supports that a secret society is controlling the world or Hungary. The most important political players of the world are those world-powers whose operations are public (for example, the United States of America). The economic power is transparent as well, the world's largest banks, enterprises are present on the stock market, their operations are overseen, their accounting is public, and anybody can acquire public information about their management. The invisible players behind the scenes cannot even exert their power. To have influence, it is necessary to have publicity, the world's most important political and economic players are right before our eyes, we just have to notice them.
Content from External Source

Ridiculing condition

The fight against the “global financial empire” and other invisible enemies is the hobby and craze of conspiracy theory believers. The important thing is to always have an evil with whom people can be scared, like the bogeyman crawling out from under the children's bed. It is the best to choose a suspicious group with a bad reputation as enemy: secret societies, the House of Rotschilds, the illuminati, the Cabalists, international financial capital, Jews, etc. There are people who, following this train of thought, think that Lizard people want to take control over us, and for example the American presidents are disguised Lizard people in reality. Believable, right? The Facebook page of the most famous Lizard people believer, David Icke, is followed by half a million people. According to a research, 2% of the Americans, more than 6 million people, believe in the Lizard people theory. Obviously, it is easier to scare with a bogeyman than to think logically. It has also been proven by research that logical thinking is not a strength of the conspiracy believers. According to a British study, people believing Osama Bin Landen to still be alive, despite the official version, also believe in that he was already dead when American soldiers found him. He lives and dies at the same time. Believable, right?
Content from External Source

I don't think they exactly address the same logical flaws.

Also, in the "ridicule" speech, it's important that they don't really ridicule the person. They even say you can use CT speech to pick up chicks; it's just that, really, it's kind of silly.
 
Zero presence on social media? Really? I see them all over.
Your online experience may vary. There are hardly two alike anymore. http://www.business2community.com/b...ats-big-deal-social-media-algorithms-01567174
But there’s another, possibly darker side to social media algorithms. Isn’t “relevancy” a bit subjective? Once the door is opened on a subjective social media experience, a bevy of new questions and challenges are introduced. Does a “like” actually mean I like something? Not always. Do I need to be shown something immediately just because a few high school acquaintances are engaging with it? Please god no. For executives and engineers at the big social platforms, their job becomes toeing that precarious line – between the dullness of chronology and the murky subjectivity of “relevancy.”
Content from External Source
People now see, mostly, what they want to see online.
 
Just my 2 cents. I'm quite firmly on the opposite side of most ideas around here so take it for what it's worth...
I've spent the better part of the past 20 years seeking out all the truth I can find while at the same time persuading everyone around me to treat even the simplest and commonly accepted facts with complete skepticism. :) It's quite empowering tbh. It's my belief that the most effective learning is in the teaching and the teaching is most easily done with a subject wanting to do the lifting. So this means quite a lot of focus on right brain and intuition in addition to rational argument and ridicule... You "let's stop all the CTs / debunker" guys get so hung up on the pure left brain stuff, it's kind of amusing really. :p
 
You "lets stop all the CTs/ debunker" guys get so hung up on the pure left brain stuff its kind of amusing really.
I don't think that's completely accurate for the majority of debunkers. Personally, it is much easier to de-program CTers :) using both right and left brain methods.
Metabunk, though, has a specific focus: examining 'specific claims of evidence'... rationally... minutely taking the bunk parts out of the full theories the whole brain is processing. MB isn't about discussing where and why your general thinking processes might be flawed.
So you have to bear that in mind when 'analyzing' MB debunkers ON MB.

PS: ridicule is right brain.

So... do you think ridicule can be effective in changing minds?
 
It's my belief that the most effective learning is in the teaching and the teaching is most easily done with a subject wanting to do the lifting. So this means quite a lot of focus on right brain and intuition
Why do you call that "lifting"? Intuition and "right-brain" stuff require no effort.
 
An interesting article came across my news feed: "Scientists claim they've developed a psychological 'vaccine' against fake news: Inoculating against misinformation."


Scientists say they’ve developed a psychological 'vaccine' for fake news, which can be used to inoculate the public against misinformation.

"Misinformation can be sticky, spreading and replicating like a virus," said lead researcher, Sander van der Linden from the University of Cambridge. "We wanted to see if we could find a 'vaccine' by pre-emptively exposing people to a small amount of the type of misinformation they might experience. A warning that helps preserve the facts."

The team amassed more than 2,000 national representative US residents, from various ages, genders, political leanings, and education levels.

They began by examining their thoughts on climate change - a politically charged issue often compromised with misinformation, despite having a solid grounding in facts and research.

The researchers presented the group with a number of scientifically sound climate change facts in the form of statements, such as "97 percent of climate scientists have concluded that human-caused global warming is happening".

They also presented the group with misinformation taken from an Oregon petition that is known to be fraudulent, which states that "31,000 American scientists state that there is no evidence that human-released CO2 causes climate change".

After hearing these statements, the participants were asked to estimate what they thought the current level of scientific agreement was on climate change to see how hearing different information might sway their personal opinions.

The team found that those who were shown the accurate information - in the form of a pie chart - described the scientific consensus to be "very high", and those shown only the misinformation reported it "very low", which makes sense, because these are the only facts they were going by.

But a rather disconcerting discovery came when the team showed the factually correct pie chart, followed by the misinformation to the same group.

It turned out that seeing both statements effectively cancelled each other out, putting people back to a state of indecision.

"It's uncomfortable to think that misinformation is so potent in our society," said van der Linden.
Content from External Source
Link to full study
 
It is difficult, but I have come to the conclusion that in some cases, there is no option but to firmly state that you will not engage in any further discussions on the person's pet topics. Maybe this applies more, the closer you actually are to the person. I'll lay out my reasoning:

  • If you engage them in debate, their only takeaway will be that their "ideas" are worth debating.
  • If you instead try to be polite and just endure listening to them, this is taken similarly -- they will take it as indicating their ideas are worth being heard.
  • If you ignore them with no explanation, this will surely be taken by them as their having won a "victory".

Which leaves only one option: to ignore them, but only after having made a clear and firm statement that this is what you are going to do. Naturally, the "victory" angle will still be in play, but it will not be left as a loose end. And as possibly concerns your own pride, it makes no difference, since you know very well that "victory" would have been declared just the same, in any case.

Beyond this, a phrase recently popped into my head, which I found to be perhaps interesting: "They trumpet their own ignorance, precisely in order that they might wear the resulting ridicule as a badge of courage."

The context is that I was contemplating how someone could be so blind to reason that it becomes nearly impossible to believe that they aren't actually aware on some level; the questions being, on what level, how, and why? When a person gets in with a group of like-minded people, there exist any number of ways to make an impression on, or raise status within the group. One way (especially with religion-tinged groups) would be in showing how much the person is "suffering" for the cause, and in this case, the more outrageous the claims, the more "suffering" produced, and the more status may be gained within the group.

The mark of such a group might be seen in whether it involves a certain camaraderie based both on its members subscribing to a particular core set of beliefs, and on the "persecution" of its members for their remaining true to those beliefs. For such a group, the more likely the beliefs are to provoke ridicule, the better.

For such a person, your responses will be all they are after, and I'd predict that they'll move to ever more inexplicable extremes to solicit more of them from you. Because it is not actually the debate they are after, but rather only the fact of being opposed; because in a twisted way, to be ridiculed is precisely the thing they are after.

And when this is the case, I think it presents a nearly perfect trap for any who would be inclined to engage them, since it might take someone a good long while to perceive that the person on the other side is not interested in the actual debate, at all.
 
A caveat here: it may also depend on what we're looking for.

Are we looking for signs that we've gotten through to them?

Are we looking for our own triumph and victory?

There's always the possibility that people go away and think about what we've said. That their minds will change later on, and that we may never see it. That this may take a day or a year or a lifetime, but we'll have played a part in it, however small.

It's happened to me, on both sides of the equation.
 
Via ScienceAlert on Facebook (Interesting comments) I came across an article published in The Conversation: Why people believe in conspiracy theories – and how to change their minds. I recommend reading the complete article.

Excerpts:

Why People believe:

Author:
Mark Lorch
Professor of Science Communication and Chemistry, University of Hull
It seems our need for structure and our pattern recognition skill can be rather overactive, causing a tendency to spot patterns – like constellations, clouds that looks like dogs and vaccines causing autism – where in fact there are none.
Content from External Source

Peer pressure
Another reason we are so keen to believe in conspiracy theories is that we are social animals and our status in that society is much more important (from an evolutionary standpoint) than being right. Consequently we constantly compare our actions and beliefs to those of our peers, and then alter them to fit in. This means that if our social group believes something, we are more likely to follow the herd.

This effect of social influence on behaviour was nicely demonstrated back in 1961 by the street corner experiment, conducted by the US social psychologist Stanley Milgram (better known for his work on obedience to authority figures) and colleagues. The experiment was simple (and fun) enough for you to replicate. Just pick a busy street corner and stare at the sky for 60 seconds.

Most likely very few folks will stop and check what you are looking at – in this situation Milgram found that about 4% of the passersby joined in. Now get some friends to join you with your lofty observations. As the group grows, more and more strangers will stop and stare aloft. By the time the group has grown to 15 sky gazers, about 40% of the by-passers will have stopped and craned their necks along with you. You have almost certainly seen the same effect in action at markets where you find yourself drawn to the stand with the crowd around it.

The principle applies just as powerfully to ideas. If more people believe a piece of information, then we are more likely to accept it as true. And so if, via our social group, we are overly exposed to a particular idea then it becomes embedded in our world view. In short social proof is a much more effective persuasion technique than purely evidence-based proof, which is of course why this sort of proof is so popular in advertising (“80% of mums agree”).

Social proof is just one of a host of logical fallacies that also cause us to overlook evidence. A related issue is the ever-present confirmation bias, that tendency for folks to seek out and believe the data that supports their views while discounting the stuff that doesn’t.
Content from External Source

Myth-busting mishaps
You might be tempted to take a lead from popular media by tackling misconceptions and conspiracy theories via the myth-busting approach. Naming the myth alongside the reality seems like a good way to compare the fact and falsehoods side by side so that the truth will emerge. But once again this turns out to be a bad approach, it appears to elicit something that has come to be known as the backfire effect, whereby the myth ends up becoming more memorable than the fact.

One of the most striking examples of this was seen in a study evaluating a “Myths and Facts” flyer about flu vaccines. Immediately after reading the flyer, participants accurately remembered the facts as facts and the myths as myths. But just 30 minutes later this had been completely turned on its head, with the myths being much more likely to be remembered as “facts”.
Content from External Source
To make matters worse, presenting corrective information to a group with firmly held beliefs can actually strengthen their view, despite the new information undermining it. New evidence creates inconsistencies in our beliefs and an associated emotional discomfort. But instead of modifying our belief we tend to invoke self-justification and even stronger dislike of opposing theories, which can make us more entrenched in our views.
Content from External Source
How to address the issue:


So if you can’t rely on the facts how do you get people to bin their conspiracy theories or other irrational ideas?

Scientific literacy will probably help in the long run. By this I don’t mean a familiarity with scientific facts, figures and techniques. Instead what is needed is literacy in the scientific method, such as analytical thinking. And indeed studies show that dismissing conspiracy theories is associated with more analytic thinking. Most people will never do science, but we do come across it and use it on a daily basis and so citizens need the skills to critically assess scientific claims.
Content from External Source

Meanwhile, to avoid the backfire effect, ignore the myths. Don’t even mention or acknowledge them. Just make the key points: vaccines are safe and reduce the chances of getting flu by between 50% and 60%, full stop. Don’t mention the misconceptions, as they tend to be better remembered.

Also, don’t get the opponents gander up by challenging their worldview. Instead offer explanations that chime with their preexisting beliefs. For example, conservative climate-change deniers are much more likely to shift their views if they are also presented with the pro-environment business opportunities.

One more suggestion. Use stories to make your point. People engage with narratives much more strongly than with argumentative or descriptive dialogues. Stories link cause and effect making the conclusions that you want to present seem almost inevitable.

All of this is not to say that the facts and a scientific consensus aren't important. They are critically so. But an awareness of the flaws in our thinking allows you to present your point in a far more convincing fashion.

It is vital that we challenge dogma, but instead of linking unconnected dots and coming up with a conspiracy theory we need to demand the evidence from decision makers. Ask for the data that might support a belief and hunt for the information that tests it. Part of that process means recognising our own biased instincts, limitations and logical fallacies.
Content from External Source
 
Perhaps one of the more effective rational approaches to changing conspiracy beliefs is to look at life as something about which there is very little we can be absolutely certain (IMHO).

From the article linked to below:

No matter how smart or educated you are, what you don’t know far surpasses anything you may know. Socrates taught us the virtue of recognizing our limitations. Wisdom, he said, requires possessing a type of humility manifested in an awareness of one’s own ignorance. Since then, the value of being aware of our ignorance has been a recurring theme in Western thought: René Descartes said it’s necessary to doubt all things to build a solid foundation for science; and Ludwig Wittgenstein, reflecting on the limits of language, said that “the difficulty in philosophy is to say no more than we know.”
Content from External Source

A worthwhile read in the NYTimes: Knowledge, Ignorance and Climate Change

Excerpts:


Knowledge, Ignorance and Climate Change
Philosophers have been talking about skepticism for a long time. Some of those insights can shed light on our public discourse regarding climate change.

By N. Ángel Pinillos

Dr. Pinillos is a professor of philosophy at Arizona State University.
Nov. 26, 2018
Content from External Source

As a philosopher, I have nothing to add to the scientific evidence of global warming, but I can tell you how it’s possible to get ourselves to sincerely doubt things, despite abundant evidence to the contrary. I also have suggestions about how to fix this.
Content from External Source

Philosophers have been studying skeptical pressure intensely for the past 50 years. Although there is no consensus about how it arises, a promising idea defended by the philosopher David Lewis is that skeptical pressure cases often involve focusing on the possibility of error. Once we start worrying and ruminating about this possibility, no matter how far-fetched, something in our brains causes us to doubt. The philosopher Jennifer Nagel aptly calls this type of effect “epistemic anxiety.”
Content from External Source


One way to counter the effects of skepticism is to stop talking about “knowledge” and switch to talking about probabilities. Instead of saying that you don’t know some claim, try to estimate the probability that it is true. As hedge fund managers, economists, policy researchers, doctors and bookmakers have long been aware, the way to make decisions while managing risk is through probabilities. Once we switch to this perspective, claims to “not know,” like those made by Trump, lose their force and we are pushed to think more carefully about the existing data and engage in cost-benefit analyses.

Interestingly, people in the grips of skepticism are often still willing to accept the objective probabilities. Think about the lottery case again. Although you find it hard to say you know the shopper will lose the lottery, you readily agree that it is still very probable that he will lose. What this suggests is that even climate skeptics could budge on their esteemed likelihood of climate change without renouncing their initial skepticism. It’s easy to say you don’t know, but it’s harder to commit to an actual low probability estimate in the face of overwhelming contrary evidence.
Content from External Source
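To make the quoted point about probabilities and cost-benefit analyses concrete, here is a minimal illustrative sketch (the symbols \(p\), \(C_{\text{act}}\) and \(C_{\text{damage}}\) are hypothetical placeholders, not figures from the article). Even someone who assigns only a modest probability \(p\) to a harm can find that acting is the cheaper bet:

\[
\mathbb{E}[\text{cost} \mid \text{act}] = C_{\text{act}},
\qquad
\mathbb{E}[\text{cost} \mid \text{do nothing}] = p \cdot C_{\text{damage}}
\]

Acting is favored whenever \( p > C_{\text{act}} / C_{\text{damage}} \). For example, with \( C_{\text{act}} = 1 \) and \( C_{\text{damage}} = 100 \) (arbitrary units), any probability estimate above 1% already tips the decision, which is why "I don't know" loses much of its force once the conversation is framed in probabilities.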
 
"You can ridicule a plan by explaining it's low feasibility, or you can ridicule a plan by insulting the person who invented the plan. But you can't ridicule a successful plan, even if it was invented by an idiot."

=quote, unknown=
;)
 