Notes: The Debunking Handbook

The Debunking Handbook by John Cook and Stephan Lewandowsky

How does one avoid causing the Familiarity Backfire Effect? Ideally, avoid mentioning the myth altogether while correcting it. When seeking to counter misinformation, the best approach is to focus on the facts you wish to communicate.

Not mentioning the myth is sometimes not a practical option. In this case, the emphasis of the debunking should be on the facts. The often-seen technique of headlining your debunking with the myth in big, bold letters is the last thing you want to do. Instead, communicate your core fact in the headline. Your debunking should begin with emphasis on the facts, not the myth. Your goal is to increase people’s familiarity with the facts.

One principle that science communicators often fail to follow is making their content easy to process. That means easy to read, easy to understand and succinct. Information that is easy to process is more likely to be accepted as true. Merely enhancing the colour contrast of a printed font so it is easier to read, for example, can increase people's acceptance of the truth of a statement.

Common wisdom is that the more counter-arguments you provide, the more successful you'll be in debunking a myth. It turns out that the opposite can be true. When it comes to refuting misinformation, less can be more. Debunks that offered three arguments, for example, were more successful in reducing the influence of misinformation than debunks that offered twelve arguments, which ended up reinforcing the myth.

The Overkill Backfire Effect occurs because processing many arguments takes more effort than just considering a few. A simple myth is more cognitively attractive than an over-complicated correction. The solution is to keep your content lean, mean and easy to read. Making your content easy to process means using every tool available. Use simple language, short sentences, subheadings and paragraphs. Avoid dramatic language and derogatory comments that alienate people. Stick to the facts.

End on a strong and simple message that people will remember and tweet to their friends, such as "97 out of 100 climate scientists agree that humans are causing global warming" or "Study shows that MMR vaccines are safe." Use graphics wherever possible to illustrate your points.

Disconfirmation Bias is the flipside of Confirmation Bias: people spend significantly more time and thought actively arguing against opposing arguments.

The process of bringing supporting facts to the fore resulted in strengthening people's erroneous belief.

The Worldview Backfire Effect is strongest among those already fixed in their views. You therefore stand a greater chance of correcting misinformation among those not as firmly decided about hot-button issues. This suggests that outreach should be directed towards the undecided majority rather than the unswayable minority.

Second, messages can be presented in ways that reduce the usual psychological resistance. For example, when worldview-threatening messages are coupled with so-called self-affirmation, people become more balanced in considering pro and con information.

Self-affirmation can be achieved by asking people to write a few sentences about a time when they felt good about themselves because they acted on a value that was important to them. People then become more receptive to messages that might otherwise threaten their worldviews, compared to people who received no self-affirmation. Interestingly, the "self-affirmation effect" is strongest among those whose ideology is central to their sense of self-worth.

Another way in which information can be made more acceptable is by "framing" it in a way that is less threatening to a person's worldview. For example, Republicans are far more likely to accept an otherwise identical charge as a "carbon offset" than as a "tax", whereas the wording has little effect on Democrats or Independents, because their values are not challenged by the word "tax".

When people hear misinformation, they build a mental model, with the myth providing an explanation. When the myth is debunked, a gap is left in their mental model. To deal with this dilemma, people prefer an incorrect model over an incomplete model. In the absence of a better explanation, they opt for the wrong explanation.

The most effective way to reduce the effect of misinformation is to provide an alternative explanation for the events covered by the misinformation.

For the alternative to be accepted, it must be plausible and explain all observed features of the event. When you debunk a myth, you create a gap in the person's mind. To be effective, your debunking must fill that gap.

One gap that may require filling is explaining why the myth is wrong. This can be achieved by exposing the rhetorical techniques used to misinform. The techniques include cherry picking, conspiracy theories and fake experts.

Another alternative narrative might be to explain why the misinformer promoted the myth. Arousing suspicion of the source of misinformation has been shown to further reduce the influence of misinformation.

Another key element to effective rebuttal is using an explicit warning (“watch out, you might be misled”) before mentioning the myth. Experimentation with different rebuttal structures found the most effective combination included an alternative explanation and an explicit warning.

Graphics are also an important part of the debunker’s toolbox and are significantly more effective than text in reducing misconceptions. When people read a refutation that conflicts with their beliefs, they seize on ambiguities to construct an alternative interpretation. Graphics provide more clarity and less opportunity for misinterpretation.

If your content can be expressed visually, always opt for a graphic in your debunking.
