Vox article on debunking and avoiding the backfire effect

Pete Tar

Some good advice we should keep in mind.
Lewandowsky: The moment you get into situations that are emotionally charged, that are political, that are things that affect people’s fundamental beliefs — then you've got a serious problem. Because what might happen is that they’re going to dig in their heels and become more convinced of the information that is actually false. There are so-called backfire effects that can occur, and then the initial belief becomes more entrenched.
It’s very difficult. A lot of this stuff is about cultural identity and people's worldviews. And you've got to take that into account and gently nudge people out of their beliefs. But it’s a difficult process.

There’s some evidence that you can avoid that if you ask people to tell [you about] an occasion when [they] felt really good about their fundamental beliefs in free enterprise (or whatever is important to the person in question). Then they become more receptive to a corrective message. And the reason is that it’s less threatening in that context. Basically, I make myself feel good about the way I view the world, and then I can handle that because it’s not threatening my basic worldview.

Now, the trick appears to be that you’ve got to give people the opportunity to deal with information in great depth. If you have a situation like a classroom where people are forced to sit down and pay attention, that’s when more information is helpful. There's a lot of evidence of this in educational psychology.

Now the problem is in a sort of casual situation, people listening to the radio or having a superficial conversation — that's where the information deficit model doesn’t apply. And superficially just throwing information at people probably will make them tune out.
There are a couple of things I can suggest. The first thing is to have people affirm their beliefs. Affirm that they’re not idiots, that they're not dumb, that they’re not crazy — so that they don't feel attacked. And then try to present the information in a way that’s less conflicting with [their] worldview.

However, there's plenty of evidence that in a casual context — turning on the TV or whatever — you can dilute the message by putting too much information in it. This whole information-overload issue is more critical in a more casual context. And that's always important.
It also mentions The Debunking Handbook.