Cognitive Problems

HUMAN COGNITIVE DIFFICULTIES AND
DEALING WITH MISINFORMATION

Concise Recommendations

These four recommendations summarize the points made in the following text.

Problem: Continued Influence Effect – Despite retractions or corrections, people continue to rely on misinformation
Solution: Alternative Account – Alternative explanation fills gap left by retracting misinformation
Good Practice: Repeated Retraction – Strengthen retraction through repetition (without reinforcing myth)

Problem: Familiarity Backfire Effect – Repeating the myth increases familiarity, reinforcing it
Solution: Emphasis on Facts – Avoid repetition of the myth; reinforce the correct facts instead
Good Practice: Pre-exposure Warning – Warn upfront that misleading information is coming

Problem: Overkill Backfire Effect – Simple myths are more cognitively attractive than complicated refutations
Solution: Simple, Brief Rebuttal – Use fewer arguments in refuting the myth, less is more
Good Practice: Foster Healthy Skepticism – Skepticism about information source reduces influence of misinformation

Problem: Worldview Backfire Effect – Evidence that threatens worldview can strengthen initially held beliefs
Solution: Affirm Worldview – Frame evidence in worldview affirming manner by endorsing values of audience
Good Practice: Affirm Identity – Self-affirmation of personal values increases receptivity to evidence


**************************************

The material in this document comes almost entirely from the study cited at the bottom:
Lewandowsky, Stephan, Ecker, Ullrich K.H., Seifert, Colleen M., Schwarz, Norbert & Cook, John (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing.

Reiterating what we said on our “Human Evolution and Propaganda” page, some psychological researchers suggest that “…to comprehend a statement, people must at least temporarily accept it as true. On this view, belief is an inevitable consequence of – or, indeed, precursor to – comprehension.”

Suspension of belief during listening is possible, but requires at least one of the following:

  • A high degree of attention
  • Considerable implausibility of the message
  • High levels of distrust while listening

The deck is stacked towards acceptance, not rejection, unless other factors suggest the speaker’s untrustworthiness. Going beyond this acceptance default requires additional motivation and cognitive resources.

Assessing the Truth of a Statement: A Listener’s Strategies

When evaluating the truth of information, the features we assess are limited:

  • Is this information compatible with other things I believe to be true?
  • Is the information internally coherent? Does it form a plausible story?
  • Does it come from a credible source?
  • Do other people believe it?

Is the information compatible with what I believe?
Compatibility with our preexisting beliefs is crucial: the information should contain nothing that contradicts our current knowledge (worldview), it must be easy to read and process, and it must “feel right.”

Our worldview includes what we assume to be true. When we hear something new, we automatically – often unconsciously – assess the new in terms of the old. If they are compatible, the new information is accepted and integrated. Once accepted, it becomes highly resistant to change, and the more so the larger the compatible knowledge base is. This resistance derives from two factors:

  • Judgment Perspective: Large body of evidence supporting the worldview is now applied to the new information
  • Cognitive-consistency Perspective: Rejection of the new information would create numerous inconsistencies in the current worldview

When a message is consistent with one’s worldview, it “feels right” – it feels familiar and comprehension is high. When it is inconsistent, it “feels wrong,” comprehension is reduced, and the message receives closer scrutiny. Even superficial characteristics of a presentation may cause it to “feel right” and heighten comprehension, such as:

  • Message is in high color contrast rather than low color contrast
  • Message is in rhyming form rather than non-rhyming form
  • Message is in an easy-to-read font, which can mask misleading questions and statements

Is the story coherent?
A coherent, easy-to-understand story is both highly compelling and resistant to change.

A story is compelling to the extent that it organizes information without internal contradictions in a way that fits with our assumptions about human motivation and behavior. Good stories are easily remembered, and any gaps in the story are filled in by the listener’s mind. Once a coherent story is formed, it is highly resistant to change:

  • Each element of the story is supported by its good fit with the other elements
  • Altering any element will cause “downstream inconsistencies”; the story will no longer feel as “right”
  • Coherent stories are easier to understand and remember
  • People use preexisting processing experience when judging a story’s coherence; this favors easy-to-process messages

Is the story from a credible source?
The persuasiveness of a message increases with the communicator’s perceived credibility and expertise. However, even untrustworthy sources are often influential due to:

  • Listeners’ insensitivity to contextual cues bearing on the source’s credibility
  • Listeners’ tendency to focus on features of the speaker rather than on features of the situation
  • Biased or paid speakers are often given the same weight as unbiased or unpaid speakers
  • The gist of the message often more memorable than its source
  • An engaging coherent story from an untrustworthy source may be remembered and accepted long after the source has been forgotten

Do others believe this information?
“Where there’s smoke, there’s fire.” The more often a message is repeated, the more it is accepted. The more people who accept it, the more likely it is that still others will come to believe it.

Studies show that the strongest predictor of belief in wartime rumors was simple repetition. Repetition can create the perception of social consensus, a “secondary reality test,” even when no such consensus exists. Because we hear widely shared beliefs more often than weird idiosyncratic beliefs, the familiarity of a belief is often a valid indicator of a true social consensus. But familiarity can be unearned:

  • Repeated statements from the same speaker create false familiarity and an over-estimation of consensus
  • Repetition in a social-media network – an “echo chamber” – can create pluralistic ignorance (a divergence between the actual and perceived prevalence of a belief) or a false-consensus effect
  • False-consensus effects can solidify and maintain belief in misinformation
  • Corrections of misinformation involve competition between perceived truth value of correct and incorrect information

The Continued Influence Effect: Retractions Fail to Eliminate the Influence of Misinformation

Retractions rarely, if ever, have the intended effect of eliminating reliance on misinformation, even when people believe, understand and later remember the retraction. Additional clarifications tend to make people rely on the initial misinformation even more. Various explanations exist for this phenomenon.

Mental models
One explanation is that people create mental models of unfolding events: “Factor A led to factor B; factor B in conjunction with factor C caused outcome X.” If factor B is later proved wrong, this leaves a gap in the model, which no longer “makes sense” unless the false assertion is retained.

Studies of false memory show that people tend to fill gaps in episodic memory with inaccurate but compatible information when it is readily available. It may be that people are more comfortable with an incorrect model than with an incomplete model. The cognitive dissonance created by having an easily-available plausible answer yet knowing that it is wrong is most easily resolved by simply ignoring the fact that it is wrong.

Memory Retrieval failure
With source confusion or mis-attribution, one can attribute information to a final report rather than to a preliminary report, or attribute it to an expert rather than to an uninformed bystander. A second possibility is that when we recall plausible but false information, our strategic monitoring processes – which must determine the validity of any automatically retrieved memory – may fail, and we believe the information to be valid and pertinent. A third possibility is that a retraction is processed like a “negation tag” hung onto a memory – “I can fly by flapping my arms – NOT.” The memory may continue to be activated, but the tag is lost.

Fluency and familiarity
Coherent stories that “feel right” are processed with greater fluency than stories with corrections appended to them. With corrected or retracted facts, we must first remember the original story, then recall the correction. Each repetition of the original story makes it more familiar and more believable. Thus the correction, even when remembered, is always playing “catch-up.” In studies of “myth vs. fact” handouts, within 30 minutes readers of the handout identify more “myths” as “facts” than do people who never received the handout in the first place.

Another example: many companies pay large fees to be officially associated with the Olympic Games. Others use “ambush marketing” and pretend they are associated without paying any fee. Not only is this tactic usually successful, but when attempts are made to expose the fraud, many people remember the false “association” even better than before it was exposed.

Reactance
People don’t like to be told what to think and do, and may reject particularly authoritative retractions or corrections.  Studies show that when jurors are asked to disregard tainted evidence as inadmissible, the request is disregarded more often when accompanied by an extensive legal explanation from the judge than when the inadmissibility is left unexplained.

Reducing the Impact of Misinformation

Pre-exposure warnings
Explicitly warning people that they are about to be given misinformation reduces misinformation effects, but the warning must specifically explain the continuing effects of misinformation rather than merely mention the presence of misinformation. This should be used when one must restate misinformation in order to debunk it. It is also widely applicable in court settings and any situation where people are likely to encounter misinformation, as in advertising and fiction. It is most effective if used before the first time that people encounter the misinformation, but a propagandist or advertiser will never do that.

Repeated retractions
Retraction success can sometimes be enhanced through repetition: misinformation effects can be alleviated, but not eliminated. Studies show that even when misinformation is encountered only once, it persists equally well whether one retraction or three retractions are given. Additionally, repeating a retraction may bring on the “protests-too-much” effect, causing people to doubt the retraction.

Filling the gap: Providing an alternative narrative
When a retraction or correction causes a coherence gap, leaving a piece missing from an otherwise satisfactory story, filling the gap with an alternative causal explanation can eliminate misinformation’s continuing effect. The alternative explanation must be plausible, account for the important causal qualities in the initial story, and – ideally – explain why the misinformation was initially thought to be correct. It helps if it also accounts for why the initial misinformation was given at all.

Alternative explanations must be as simple and coherent as the initial misinformation, or people will forget them. When politicians offer alternative explanations, they are often ignored or forgotten because people suspect that unrevealed motivations underlie their explanations and behaviors.

In summary, three established techniques can reduce the misinformation’s continuing effect:

  • Warnings about potentially misleading nature of forthcoming information before it is presented
  • Repetition of corrections
  • Corrections accompanied by alternative explanations, thus preventing causal gaps in the story

Corrections in the Face of Existing Belief Systems: Worldview and Skepticism

Worldview
The more you care, the less you’ll change. The more a piece of information fits into your worldview and cherished beliefs, the less likely you’ll accept a correction to that information.

People more readily accept statements that are consistent with their beliefs, so it should surprise no one that one’s worldview is important in the persistence of misinformation. More Republicans than Democrats are “birthers” and believe that WMDs were present in Iraq. Liberals are less accurate than conservatives when judging the consequences of higher oil prices and “peak oil.”

Misinformation consistent with one’s worldview “feels right” and is easier to comprehend; corrections inconsistent with that worldview “feel wrong” and are more difficult to comprehend. Studies have not yet determined whether corrections simply don’t work on people for whom the correction conflicts with their worldview, or whether they work equally well among all people, with post-correction differences simply mirroring pre-correction differences.

Self-esteem also jumps into the fray. When people who feel highly connected to a favorite brand hear negative information about it, their self-esteem declines but their brand loyalty stays high. Those who do not feel connected to the brand experience no such reduction in self-esteem.

Making things worse: Backfire effects
When people encounter information that challenges their worldview, they either actively counter-argue against the information or become unmovable – “I guess I have the right to my opinion whatever you say.”

In motivated skepticism, people are highly skeptical of opposing arguments or facts and actively use counter-arguments to deride or invalidate the worldview-incongruent information. This is especially true in politics and religion.

Belief polarization occurs when presentation of the same information increases attitudinal divergence between people with opposing views on a subject. For example, when both Christians and nonbelievers were exposed to a fictitious report disproving the Biblical account of the resurrection, belief increased among believers but nonbelievers became even more skeptical of the Biblical account. Even high levels of education don’t protect against worldview-based rejection of information. Higher education makes Democrats more likely to accept climate change as a fact, and Republicans less likely to accept it. The higher the education level of a Republican, the more likely they’ll believe that Obama is a Muslim. Few Democrats, whatever their level of education, believe this.

In summary, personal beliefs can facilitate the acquisition of attitude-consonant misinformation, increase reliance on misinformation, and prevent the correction of false beliefs. While emotion is a factor, it does not seem to increase resistance to correction. Information which challenges worldview is likely to elicit an emotional response, but emotion by itself is not sufficient to alter people’s resistance to correction.

Taming worldview by affirming it
Corrections and retractions must be tailored to their specific audience, preferably by ensuring that the correction fits into their worldview. People who are “eco-freaks” may be less likely to reject new technology when it is presented as part of an effort to protect the environment. In polls, Republicans reject “Obamacare” at a much higher rate than they do the “Affordable Care Act.”

Christians may fear aliens and undocumented workers less when one appeals to Christian virtues such as: love of neighbor, caring for the helpless stranger, charity, hard work, and honesty. Giving them the opportunity to affirm their basic values – such as writing or talking about a time they felt especially good about themselves because they acted on a value important to them – can make messages of correction more palatable. Studies have shown that distancing oneself from a self-focused perspective (“Don’t think only of yourself”) promotes wise reasoning.

Skepticism: A key to accuracy
It’s better to develop a general attitude of skepticism, rather than try to rouse it after the message is heard. The skeptical attitude wakes up your ability to think “laterally” and deal with non-routine problems. One should think a questioning “hmm” rather than an agreeable “uh-huh.”

Skepticism can reduce susceptibility to misinformation effects if it prompts people to question the origins of information that may later turn out to be false. For example, people who initially doubted that America’s 2003 invasion of Iraq had anything to do with finding Weapons of Mass Destruction were more accurate – whether doubting or accepting – in their processing of war-related information in general.

Distrust often has a positive function. One study showed that when participants were shown a face rated as “untrustworthy,” they were more likely to be able to solve non-routine problems on a subsequent, completely unrelated task. In contrast, participants in whom trust was elicited performed much better on routine (but not non-routine) problems. This suggests that distrust causes people to be more careful when exploring their environment, sensitizing them to the existence of non-routine contingencies. Yet another study showed that priming people to be distrustful enhances their creativity in certain circumstances.

These results suggest that a healthy skepticism or induced distrust is of great help in avoiding the traps of misinformation. The benefit seems to come from the ability of skepticism and distrust to prime non-routine “lateral” information processing. This works best at the first exposure to misinformation and less well when applied after exposure. This remains true whether the information was intentionally or unintentionally misleading.

Using misinformation to inform
Studies show that a careful and prolonged dissection of incorrect arguments may facilitate the acquisition of correct information. Such direct refutation of the misinformation is more successful in reducing misconceptions than non-refutational supply of the same information, as in the “myth-versus-fact” approach. Such dissection is a form of argument, and students who engage in argumentation show improvements in conceptual learning over those who don’t.

This also works in politics. An analysis of over 40 opinion polls showed that to win a policy debate, politicians should selectively highlight issues that mobilize public opinion in favor of their position and not engage an opponent in dialogue. Taking the argumentation and refutation approach to an extreme, some have suggested that even explicit misinformation can be used as an effective teaching tool. In one case study, students learned about climate science by studying “denialist” literature – they acquired actual knowledge by analyzing material that contained misinformation in depth and by developing the skills required to detect the flaws in the material. In-depth discussions of misinformation and its correction may assist people in working through inconsistencies in their understanding and promote the acceptance of corrections.

“Debiasing” In An Open Society
In Rwanda, a year-long field experiment showed that a radio soap opera built around messages of reducing intergroup prejudice, violence, and survivors’ trauma altered listeners’ perceptions of social norms and their behavior – albeit not their beliefs – in comparison with a control group exposed to a health-focused soap opera.

SOURCE
[1] Lewandowsky, Stephan, Ecker, Ullrich K.H., Seifert, Colleen M., Schwarz, Norbert & Cook, John (2012). Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychological Science in the Public Interest, Vol. 13(3), 106-131