Long-debunked myths about health and medicine are widely — and stubbornly — held. Some can lead to harm, such as the false belief that childhood vaccinations can cause autism. Others usually just waste people’s time and money, such as the mistaken idea that taking vitamins or other supplements provides “added protection” against disease or that a “colon cleansing” eliminates toxins from the body.
But how do you persuade people that such beliefs are based on misinformation? That’s a huge challenge.
In his latest Neurohacks column for the BBC Future website, British psychologist Tom Stafford explains what two researchers, Stephan Lewandowsky and John Cook, discovered on this topic a few years ago when they were exploring how to counter misinformation about climate change.
Writes Stafford (with British spellings):
The first thing their review turned up is the importance of “backfire effects” — when telling people that they are wrong only strengthens their belief. In one experiment, for example, researchers gave people newspaper corrections that contradicted their views and politics, on topics ranging from tax reform to the existence of weapons of mass destruction. The corrections were not only ignored — they entrenched people’s pre-existing positions.
Backfire effects pick up strength when you have no particular reason to trust the person you are talking to. This perhaps explains why climate sceptics with more scientific education tend to be the most sceptical that humans are causing global warming.
The irony is that understanding backfire effects requires that we debunk a false understanding of our own. Too often, argue Lewandowsky and Cook, communicators assume a ‘deficit model’ in their interactions with the misinformed. This is the idea that we have the right information, and all we need to do to make people believe is to somehow “fill in” the deficit in other people’s understanding. Just telling people the evidence for the truth will be enough to replace their false beliefs. Beliefs don’t work like that.
What does work? How can people who are scientifically misinformed be persuaded that their beliefs are wrong? Research has shown, says Stafford, that the most important thing you can do is to offer a plausible alternative explanation, one that can take the place of the myth.
In their “Debunking Handbook,” Lewandowsky and Cook offer the following tips (again, with British spellings). The tips are aimed at written efforts to debunk misinformation, but with a little adaptation they can be used in conversations as well.
Core facts — a refutation should emphasise the facts, not the myth;
Explicit warnings — before any mention of a myth, text or visual cues should warn that the upcoming information is false;
Alternative explanation — any gaps left by the debunking need to be filled. This may be achieved by providing an alternative causal explanation for why the myth is wrong and, optionally, why the misinformers promoted the myth in the first place;
Graphics — core facts should be displayed graphically, if possible.
Be forewarned, however: Attempting to persuade someone that a long-held belief is false is not risk-free. “If you try and debunk a myth,” writes Stafford, “you may end up reinforcing that belief, strengthening the misinformation in people’s minds without making the correct information take hold.”
And don’t forget to take a close look at some of your own long-held notions of scientific truth, he adds. “This debunking advice is also worth bearing in mind if you find yourself clinging to your own beliefs in the face of contradictory facts,” Stafford writes. “You can’t be right all of the time, after all.”
You can read Stafford’s Neurohacks column on the BBC Future website. He also writes the very entertaining and informative MindHacks blog. Lewandowsky and Cook’s “Debunking Handbook” can be downloaded for free at skepticalscience.com.