Etherless Learning

learning everywhere, all the time

Posts Tagged ‘Backfire effect’

Less identity, more ideas

Posted by Ming Ling on November 10, 2012

Once again, “we would all benefit from more meaningful interaction and less labeling… along any dimension by which we divide humanity.”

From Tom Jacobs’s “America’s Increasingly Tribal Electorate”, describing political scientist Lilliana Mason’s research:

“behavioral polarization”—anger at the other side, activism for one’s own side, and a tendency to look at political arguments through a biased lens—is driven much more strongly by that sense of team spirit, as opposed to one’s views on public policy.

According to her:

the only way to reduce the anger and bias would be “to reduce the strength or alignment of political identities.”

Yet I remain hopeful that, in spite of the dangers of the backfire effect, we can find ways to separate ideas from identities, and to share knowledge dispassionately and compassionately at the same time. As before: “Most of all, we should put wrongness back in its place – linked to the idea, not the person,” or the identity.

Posted in Reasoning | Tagged: , , | Leave a Comment »

Distinguishing science from pseudoscience

Posted by Ming Ling on November 15, 2010

Here’s another excellent reminder of the importance of responding to others’ different beliefs gently, in “The 10 Commandments of Helping Students Distinguish Science from Pseudoscience in Psychology”:

Gently challenge students’ beliefs with sympathy and compassion. Students who are emotionally committed to paranormal beliefs will find these beliefs difficult to question, let alone relinquish. Ridiculing these beliefs can produce reactance and reinforce students’ stereotypes of science teachers as closed-minded and dismissive.

Summary of commandments:

  1. Delineate the features that distinguish science from pseudoscience.
  2. Distinguish skepticism from cynicism.
  3. Distinguish methodological skepticism from philosophical skepticism.
  4. Distinguish pseudoscientific claims from claims that are merely false.
  5. Distinguish science from scientists.
  6. Explain the cognitive underpinnings of pseudoscientific beliefs.
  7. Remember that pseudoscientific beliefs serve important motivational functions.
  8. Expose students to examples of good science as well as to examples of pseudoscience.
  9. Be consistent in one’s intellectual standards.
  10. Distinguish pseudoscientific claims from purely metaphysical religious claims.

I think the implications of these guidelines extend well beyond psychology into the nature of science more generally, and into methods for helping the broader public evaluate the connection between belief and evidence more critically. Guidelines #6 and #7 are especially valuable for describing how to do this respectfully and kindly.

When discussing risk backfires

Posted by Ming Ling on November 4, 2010

On “More Talk, Less Agreement: Risk Discussion Can Hurt Consensus-Building on Science/Technology”:

When it comes to issues pertaining to science and technology, “talking it out” doesn’t seem to work. A new study shows that the more people discuss the risks and benefits of scientific endeavors, the more entrenched they become in their viewpoint, and the less likely they are to see the merits of opposing views.

Still more evidence on how people become more entrenched in their views upon actively considering contradictory information and perspectives. We really need to learn more about how emotion and identity influence these discussions, and develop better techniques for listening and communicating.


Andrew R. Binder, Dietram A. Scheufele, Dominique Brossard, and Albert C. Gunther. “Interpersonal Amplification of Risk? Citizen Discussions and Their Impact on Perceptions of Risks and Benefits of a Biological Research Facility.” Risk Analysis, 29 Oct 2010. DOI: 10.1111/j.1539-6924.2010.01516.x

Dealing with the “scientific impotence” excuse

Posted by Ming Ling on October 31, 2010

On “Five minutes with the discoverer of the Scientific Impotence Excuse”:

When people are faced with scientific research that clashes with their personal view, they invoke a range of strategies to discount the findings. They will often judge that the topic at hand is not amenable to scientific enquiry [and embrace] the general idea that some topics are beyond the reach of science.

Anyone who seeks to educate, inform, or influence, take note of these techniques to avoid backfire or unwarranted discounting:

  1. Affirm people’s values first.
  2. Frame findings to be consistent with their values.
  3. Present findings in non-threatening ways.
  4. Speak with humility.
  5. Say “discover” instead of “disagree”.
  6. Decrease in-group/out-group salience.
  7. Provide an alternate target for negative emotions.
  8. Teach critical thinking and metacognition in safe settings.

What I really appreciated was the research-based guidance for addressing this resistance to scientific evidence, summarized above from the second section of the interview. Misunderstanding the distinction between evidence and belief contributes to the problem, but it may not be obvious how to highlight that distinction productively. As Munro pointed out, Cohen, Aronson, and Steele’s (2000) research demonstrates one way to resolve this tension, as does some of his own research (which unfortunately wasn’t cited directly in the interview). I think this is an extremely important set of findings, because it’s so tempting to come down hard on those who “just don’t understand,” lecturing authoritatively, conveying frustration, or even attacking their perspectives. Unfortunately, that can backfire. This research shows that a gentler approach can actually be more effective. I take heart in that.

Difficulties of accommodating discrepant information

Posted by Ming Ling on August 24, 2010

On “The Wrong Stuff – Reasonable Doubt: Innocence Project Co-Founder Peter Neufeld on Being Wrong”:

I think generally speaking it’s difficult for people to admit they’re wrong, and the higher the stakes, the more difficult it becomes. So what you really want to do is educate people that it’s OK to be wrong. It doesn’t mean you’re a fool. It’s not going to be the end of your life.

There are high social costs to being wrong, and creating a culture that values thoughtfulness and humility rather than tenacity may alleviate this phenomenon. (Ironically, one might expect this to be worse in a collectivist culture, where there could be more shame, surprise, or negative attention attached to retracting publicly stated beliefs. In contrast, individualistic cultures that celebrate different ideas might be more tolerant of changing one’s mind.)

But I think there are high cognitive and metacognitive costs to being wrong as well. Part of it could be a consequence of generating a hypothesis or belief, akin to the dangers of convincing oneself of the correctness of a guess (e.g., when taking a pretest). The more a person articulates or mentally rehearses an idea, the more s/he becomes committed to it (i.e., strengthens the memory trace, elaborates on potential explanations, draws connections to prior knowledge).

Further, someone whose self-concept is strongly linked to having the right answers might feel more threatened by realizing s/he made an error. And someone who thinks that intelligence is knowing facts rather than exercising good reasoning would probably be more disturbed by having to acknowledge getting the facts wrong.

So what does this suggest? Perhaps we should encourage more tentativeness and skepticism, an appreciation of the probabilistic nature of knowledge, and comfort with staking cautious claims. Maybe we should ask people to propose multiple conditional hypotheses instead of single predictions. And most of all, we should put wrongness back in its place – linked to the idea, not the person.
