Wednesday, September 17, 2008

Why It’s Better To Fight Lies With Different Lies

I’m being a bit flip, but the point is that research continues to show me that if you can’t fight lies with “the truth,” then it’s better to start telling new lies. Or at least change the subject.

Shankar Vedantam’s Human Behavior column points to another fascinating study about what people think when they are told something isn’t true. Most of the time the correction doesn’t matter; the effect has already happened.
In experiments conducted by political scientist John Bullock at Yale University, volunteers were given various items of political misinformation from real life. One group of volunteers was shown a transcript of an ad created by NARAL Pro-Choice America that accused John G. Roberts Jr., President Bush's nominee to the Supreme Court at the time, of "supporting violent fringe groups and a convicted clinic bomber."

Bullock then showed volunteers a refutation of the ad by abortion-rights supporters. He also told the volunteers that the advocacy group had withdrawn the ad. Although 56 percent of Democrats had originally disapproved of Roberts before hearing the misinformation, 80 percent of Democrats disapproved of the Supreme Court nominee afterward. Upon hearing the refutation, Democratic disapproval of Roberts dropped only to 72 percent.
Basically, if you were already primed to dislike John Roberts, the information had the most effect on you, even after you were told it wasn’t true. If you weren’t primed to dislike him, it had less effect. Vedantam doesn’t mention what happened with the people who weren’t primed either way, but I would bet it still had some effect, perhaps even a lot, though less than on those who already disliked him.

I can’t find the original study, but I can speculate about a few reasons why it would work that way. If I’m already in an anti-John Roberts frame of mind, I think hearing “John Roberts supported a convicted clinic bomber” has the effect of reminding me why I don’t like him (his extreme positions on women’s rights), even when I find out later that this specific fact isn’t true. I remain in a slightly elevated state of John Roberts-hating despite the fact that the new cause of the hate is wrong. The new incorrect information merely reminds me of all the old correct information that I already know. (Just to be clear, I’m using pretty broad terms to discuss what are really more subtle emotions and thoughts. But being in a “John Roberts-slightly-elevated state of increased dislike” just doesn’t roll off the tongue.)

Another aspect of the study I would like to know more about is how the corrections were presented to the test subjects. The article says the subjects were shown a refutation of the ad "by abortion-rights supporters." I’m not sure I would trust a partisan group on either side to tell me the sky is blue. It’s possible that in this particular study the source of the refutation was the problem, and that’s why hearing it didn’t change the Democrats’ feelings about Roberts. However, if the refutation had been presented as coming from a more neutral source, say the Washington Post, the subjects might have found it more trustworthy and it might have had a bigger impact. Then again, Republicans might not have reacted the same way to that source.

Which leads to the second study Vedantam quotes.
Political scientists Brendan Nyhan and Jason Reifler provided two groups of volunteers with the Bush administration's prewar claims that Iraq had weapons of mass destruction. One group was given a refutation -- the comprehensive 2004 Duelfer report that concluded that Iraq did not have weapons of mass destruction before the United States invaded in 2003. Thirty-four percent of conservatives told only about the Bush administration's claims thought Iraq had hidden or destroyed its weapons before the U.S. invasion, but 64 percent of conservatives who heard both claim and refutation thought that Iraq really did have the weapons. The refutation, in other words, made the misinformation worse.

In a paper approaching publication, Nyhan, a PhD student at Duke University, and Reifler, at Georgia State University, suggest that Republicans might be especially prone to the backfire effect because conservatives may have more rigid views than liberals: Upon hearing a refutation, conservatives might "argue back" against the refutation in their minds, thereby strengthening their belief in the misinformation. Nyhan and Reifler did not see the same "backfire effect" when liberals were given misinformation and a refutation about the Bush administration's stance on stem cell research.
Again, I’m wondering if the source of the refutation matters. Republicans are more likely to distrust the so-called mainstream media outlets: your Washington Post, New York Times, NBC, CBS, ABC, 60 Minutes, Newsweek, Time, etc., etc. But if they heard that The National Journal had refuted the Bush administration's prewar claims that Iraq had weapons of mass destruction, would that change the results? Possibly not:
A similar "backfire effect" also influenced conservatives told about Bush administration assertions that tax cuts increase federal revenue. One group was offered a refutation by prominent economists that included current and former Bush administration officials. About 35 percent of conservatives told about the Bush claim believed it; 67 percent of those provided with both assertion and refutation believed that tax cuts increase revenue.
This is why, whenever I read about people pointing out that Sarah Palin is telling lies, I know it won’t faze Republican voters. They think it’s the media who are the liars.

But the other part of charge-countercharge that these studies can’t duplicate is that even when we hear a refutation, we can often find a contradictory opinion, especially one that supports something we already want to believe. Don’t like the refutation? Don’t worry. Someone on Newsmax has already explained why “the media” is just spinning lies.

I would rather live in a world where untruths can be countered by facts. But that doesn’t seem to be the world we live in. So rather than fighting fire with sand, it’s probably better to fight fire with fire. Because it doesn’t matter how much sand you throw on some lies, it never puts them out.

Cross-posted at NewsCat


habladora said...

It is horrifying to think that people are incapable of reevaluating false beliefs when confronted with facts that disprove the false information. What could the advantage be to clinging to false information?

Oh, and did you know that, according to top aides, John McCain drinks the blood of newborn puppies just before bed each night?

(The above information regarding John McCain's taste for puppy blood has been refuted. I withdraw any statements I've made asserting that John McCain drinks the blood of cute little innocent puppies.)

Amelia said...

Wow, excellent post.

haha! I totally believe Habladora's claim about McCain, despite the fact that she said it had been refuted. ;) ;)

I also have to wonder if perhaps the response to misinformation/refutations is different when it comes to politics than it would be if the information were, say, something being taught in school (like math, or history, etc.).


Casmall said...

Great post!

I've always considered this kind of reasoning as a sort of arrested development and it seems to be very much how teenagers think.

National Women's Editorial Forum said...

Actually, Amelia, I bet that all knowledge works this way. Medical news, historical knowledge, etc.

I think what might make political information different is that there is often a rebuttal available. But how often have you read some historical analysis that was wrong and didn't find out until years later that the book you read completely miscategorized events?

And did you hear that Sarah Palin at first demanded that Bristol get an abortion?

Kris-Stella said...

What a great post! And habladora's comment is priceless! :D

lindabeth said...

This is a really great post and says very well what I have been talking about with friends lately.

On top of this, of course, we have the segment of sound-bite voters who never even hear the refutation!

How depressing!