Katherine Thompson Gives GCRI Public Lecture On Psychology Of Uncertainty

In GCRI’s first public lecture (26 November 2012), Katherine Thompson spoke on the psychology of uncertainty in a talk titled “What We Think About When We Think About Probability: How Our Experience Affects the Way We Perceive the Risk of Rare Events”. Katherine is a PhD student in Psychology at Columbia University and a researcher with Columbia’s Center for Research on Environmental Decisions, a group I’m also affiliated with. She has been working on, among other things, the psychology of disaster preparedness [1].

Global catastrophic risk is fundamentally about uncertainty. We do not know whether there will be a nuclear war, or a pandemic, or some other catastrophe. We do not know how bad the catastrophes would be if they occurred. And we do not know how effective our actions would be at reducing the risks. Our ability to characterize and communicate this uncertainty goes a long way toward determining how to respond to the risks. This holds for risks of all sizes, not just the GCRs, but GCRs pose some extra challenges because they are generally rare and outside our past experience.

Psychologists like Katherine and her colleagues have been studying for many years how people think about uncertainty. A lot of the results are not encouraging: people tend to make lots of basic mistakes when estimating the probability of something happening or when interpreting other people’s estimates. For example, people tend to overreact when given descriptions of possible-but-rare events, but underreact to the same possible events when deciding based on their own experience with them [2].
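One intuition for the experience side of this gap can be shown with a minimal simulation sketch. The numbers here are assumptions chosen purely for illustration (a 5% annual event probability and ten years of personal history, not figures from the research): with a rare event, most short personal histories contain zero occurrences, so judgments built on experience alone will tend to underweight the event.

```python
import random

random.seed(0)

P_EVENT = 0.05   # assumed per-"year" probability of the rare event (illustrative)
N_YEARS = 10     # length of each simulated person's experience
N_PEOPLE = 100_000

# Fraction of simulated people whose entire experience contains zero occurrences.
never_saw_it = sum(
    all(random.random() >= P_EVENT for _ in range(N_YEARS))
    for _ in range(N_PEOPLE)
) / N_PEOPLE

print(f"Share who never experienced the event: {never_saw_it:.2f}")
# Analytically, (1 - 0.05)**10 is about 0.60: a majority's experienced
# frequency of the event is exactly 0%.
```

So even though the event is a genuine 5%-per-year risk, a majority of people in this toy world have personally experienced a frequency of zero, which is one mechanism behind experience-based underreaction.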

The struggle with probability holds for both laypeople and experts. For example, Katherine’s research has looked at the probability of earthquakes in California, finding that seismologists and geophysicists make the same mistakes as laypeople. Specifically, both groups overestimate the probability of a big earthquake happening within the next few weeks (a very low probability) and underestimate the probability of one happening within the next few decades (an almost certain outcome). This is a humbling finding for us researchers, who like to think of ourselves as being able to avoid these types of mistakes!
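The weeks-versus-decades pattern follows from simple compounding. As a hedged illustration (the per-week probability below is a made-up number for the sketch, not a seismological estimate), assuming independent weeks:

```python
# How a tiny per-interval probability compounds over time.
# The per-week probability is hypothetical, chosen only for illustration.
P_WEEK = 0.002             # assumed chance of a big quake in any given week
WEEKS_FEW = 4              # "the next few weeks"
WEEKS_DECADES = 30 * 52    # roughly three decades

def prob_within(n_weeks, p=P_WEEK):
    """Probability of at least one event in n_weeks, assuming independence."""
    return 1 - (1 - p) ** n_weeks

print(f"Within a month:  {prob_within(WEEKS_FEW):.3f}")
print(f"Within 30 years: {prob_within(WEEKS_DECADES):.2f}")
```

With these assumed numbers, the one-month probability is under 1% while the 30-year probability exceeds 95%, so intuitions that blur the time horizon can be badly wrong in both directions at once.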

One research result that intrigues me comes from construal level theory (CLT). CLT says that we construe things as being closer or farther away in psychological distance. Things are construed as being more distant if they are farther away in space or time, more uncertain, or more abstract. Katherine gave us the example that “studying for a midterm” is more abstract than “re-reading class notes”. People are more likely to act on things that are low-construal, i.e., psychologically near and concrete.

This presents a big challenge for GCRI, given our mission of mobilizing the world’s intellectual and professional resources to meet humanity’s gravest threats. Global catastrophic risks are pretty high-construal things: they cover broad ranges of space and time and they’re uncertain. Furthermore, “address global catastrophic risk” is a very abstract endeavor. I think a core goal for GCRI is to translate “address global catastrophic risk” into less abstract, more concrete actions like “turn your lights off when you leave the room”, “create a system for rapid flu vaccine development”, “subscribe to our newsletter”, or “donate to us”. I guess we have our work cut out for us…

Here’s the full abstract for the talk:

Psychology research has identified numerous ways that people misinterpret, misuse, distort, or neglect probabilities when making decisions under uncertainty. Prospect Theory predicts that very rare events will be over-weighted: that very low probabilities will be treated as higher than they actually are. This effect would seem to predict that people who hear about the probabilities of extremely low-probability, catastrophic events should overreact, over-prepare, and evacuate early. Although field reports do support some of these predictions—see, e.g., the evacuation “halo” effect—on the whole it seems that the opposite bias is true: that people under-prepare for catastrophic hazards, and tend to evacuate late if at all. But the predictions of Prospect Theory hold true only when probabilities are described, or given explicitly; when people learn about probabilities through their own experience, they tend to under-weight rare events. The last decade has seen much attention to this “Description-Experience (DE) Gap” effect in the laboratory, but only recently have researchers begun to test the DE Gap in more real-world-like situations. My current research is working to push the external validity of the DE Gap farther into the real world, and to better understand how people make decisions in complex situations in which a mix of description and experience is available to draw upon.
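For readers curious what the over-weighting of described low probabilities looks like quantitatively, here is a sketch of one standard probability weighting function from the Prospect Theory literature (the Tversky–Kahneman 1992 form, with their estimated parameter of about 0.61; the talk itself did not commit to any particular functional form):

```python
def weight(p, gamma=0.61):
    """Probability weighting function from cumulative Prospect Theory
    (Tversky & Kahneman, 1992): low probabilities are over-weighted,
    moderate-to-high probabilities under-weighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.001, 0.01, 0.1, 0.5):
    print(f"stated p = {p:<5}  decision weight = {weight(p):.3f}")
```

With this parameterization, a stated 1-in-1000 risk carries a decision weight over ten times larger than its stated probability, while a 50/50 gamble is treated as somewhat less than even, which is the over-weighting pattern the abstract describes for described (as opposed to experienced) probabilities.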

The presentation was hosted online via Skype. There were nine people in the audience, including Alexandra Witze, a Boulder, CO-based science writer developing a book on the 1783 Laki eruption, and GCRI’s Jianhua Xu, who was joining us from Beijing. Other attendees were GCRI’s Arden Rowell, Tony Barrett, Grant Wilson, Mark Fusco, Tim Maher, Jacob Haqq-Misra, and myself, making for a productive, interdisciplinary conversation.

For the previous GCRI talk (which was private), see Wei Luo Talks To GCRI About Geo-Social Visual Analytics And Pandemics.

[1] See her syllabus for a psychology of disaster preparedness course: http://www.columbia.edu/cu/psychology/courses/3285/3285_Z12.pdf

[2] Hertwig, R., & Erev, I. (2009). The description–experience gap in risky choice. Trends in Cognitive Sciences, 13, 517-523. http://ie.technion.ac.il/Home/Users/erev/Hertwig_Erev_2009.pdf

This post was written by Seth Baum, Executive Director of the Global Catastrophic Risk Institute.