When Corrections Fail

> There are certain points when debating with someone where you sort of have to walk away. [...] 'cause you'll just basically end up getting a concussion. -Geo

We’ve all been there. Someone we know, perhaps a relative, is nattering on about something patently false: something so wrong it’s akin to claiming the Earth is flat. We try to correct the falsehood by presenting credible, empirical evidence for a spherical Earth. To our immense frustration, our relative angrily digs in and defends flat Earth. It can seem as if presenting more evidence only strengthens their dedication to the false belief.

Political scientist Brendan Nyhan has studied the persistence of false political beliefs, and the results are fascinating. He reported them in a paper he coauthored with Jason Reifler, *When Corrections Fail: The Persistence of Political Misperceptions*. If you don't have time to read the paper, there's a fantastic interview with Nyhan on the Point of Inquiry podcast. Here are the major points.

Misinformed vs. Uninformed

Nyhan and Reifler were targeting misinformation in particular. Misinformation is a falsehood that people *think* they know to be true. Misinformed test subjects tended to substitute political ideology for facts, and both liberals and conservatives resisted corrective information that conflicted with their political opinions.

Interestingly, there seems to be a negative correlation between how confident people are in a political belief and how likely that belief is to be accurate. Elites, who sometimes have strong incentives to misinform the public, lend false legitimacy and intensity to political misinformation.

In contrast to misinformation, Nyhan provides an example of simple ignorance of facts (I’m paraphrasing from memory):

> Someone asks you to name the Chief Justice of the Supreme Court, and you don't know but guess Scalia. When the person says, "No, actually it's Roberts," you are fairly likely to just accept the correction because it's not tied to a particular political ideology.

Backfire Effect

An utterly fascinating discovery from the paper is the "backfire effect":

> [...] individuals who receive unwelcome information may not simply resist challenges to their views. Instead, they may come to support their original opinion even more strongly.

Although both liberals and conservatives had trouble letting go of misinformation, in this particular study only conservatives exhibited the backfire effect.

More Detail

I emailed Nyhan with a few follow-up questions. Note that Nyhan speaks only for himself and not for his coauthor.

> 1.  Who are the elites?  Can you give me a couple of examples?  It seems to me that they hold sway both in originally providing misinformation to the public and later play an important role in publicly discrediting misinformation.  Who else in society might be a credible source of corrective information?

I'm not quite sure what you mean by "Who are the elites?" For the purposes of the article, we used "political elites" to refer to prominent political figures who were featured in the media, highly placed in political institutions and groups, etc. Elites often play a key role in getting political myths into circulation. It's certainly true that elites *could* also play a role in discrediting misinformation, but the problem is that it's usually one side's elites that try to do so, and their messages are often not persuasive to supporters of the other side. Ideally, we would like to have institutions and elites who are highly credible to both sides, but very few exist who are willing to try to arbitrate these sorts of disputes and set the record straight. In particular, the media is often reluctant to fact-check and is frequently accused of bias when it does so.

> 2.  Do you think the media's attempt to provide balanced reporting and presenting "both sides" of issues has the effect of implying false equivalency?  I'm thinking here of the role of the media in failing to discredit Andrew Wakefield and the shenanigans of the anti-vaccine movement.

Yes. I've been highly critical of the notion that the media has to be balanced with regard to the facts -- see in particular *All the President's Spin*, which is a book I co-wrote when I was co-editor of Spinsanity. A good critique of artificial balance is here. For an experimental study of its potentially harmful effects, see [this site].

> 3. How much motivated reasoning can be attributed to a basic human reaction to cognitive dissonance?  None of us wants to be wrong, but it seems to me that the current political climate prohibits the possibility of changing paradigms based on credible evidence.  If so, how can we change the climate to be more open to critical thinking?

It's a very good question. Cognitive dissonance is one mechanism for motivated reasoning, but there are others -- one possibility is that it's driven by emotional incongruence rather than contradictory cognitions (i.e. I like person X, and you say they're wrong). Changing the climate is difficult. We're working on understanding the conditions under which people will be more open to unwelcome information, but there are no easy answers. The main strategy I've advocated is to "name and shame" elites who promote misleading claims -- I discuss this issue more in an article on misinformation about health care reform and in a couple of interviews [here and here].

> 4. Given the potential harm of the backfire effect, do you recommend attempting to correct misperceptions IRL when you know going in that no amount of evidence will sway the proponent of misinformation?  What do you do in these situations?

Yes, I think it's still crucial to correct misperceptions in order to change the incentives for elites as I mentioned above, but we shouldn't mislead ourselves about the effectiveness of that strategy for the mass public. Corrections are most likely to be effective in dissuading elites from promoting misleading claims or stopping the spread of a myth early in its lifespan, not in changing the minds of people once a myth is circulating widely.

Conclusion

I’m hopeful that by discussing these and other barriers to critical thinking, we can all get a little closer to rationality ourselves and carry that out into the larger culture. I’m grateful to people like Brendan Nyhan who help us find the path to reason.

A version of this article was originally published by Nikki Stern at *Does This Make Sense*.