The Risks of Mental Shortcuts about Risks
Readers — This is a fascinating piece for us to chew on this weekend by Steven Horwitz, the Charles A. Dana Professor and Chair of the Department of Economics at St. Lawrence University in Canton, NY. Chew on! – L
One of the great things that the Free-Range Kids movement has done is to remind parents that their perceptions of what is dangerous to their kids are often at odds with the statistical facts. The things that we often think are dangerous, like Halloween candy and stranger abduction, are really not meaningful dangers at all. Misperceiving risk is hardly limited to thinking about parenting issues, and misperceiving statistical frequency is a common human problem. For example: what’s the most dangerous part of air travel? Answer: the drive to the airport. You are far more likely to be killed on that drive than on the plane, yet our beliefs about which is riskier are often at odds with that fact.
Cognitive psychologists and behavioral economists have a number of explanations for why we are so bad at thinking like statisticians. These explanations are often termed cognitive “heuristics,” which is just a fancy word for a mental shortcut. We use heuristics all the time when our information is incomplete. For example, we infer that a person who is smiling at us is trustworthy. That inference isn’t always correct, but it’s right often enough to be useful. Using these sorts of heuristics, however, introduces biases into our thinking. The smiling example might lead us to be wrong about a particular person’s trustworthiness, as anyone who has dealt with a car salesperson’s big toothy grin can attest.
One of the most important of these biases for the issues facing parents is the “availability bias.” A common shortcut we use when our information is incomplete is to rely on the information that is currently available to us. This is particularly the case when information that is relevant to the question at hand has been in our face recently. So when a friend says “Does it seem to you like everyone’s getting divorced these days?” you immediately think about the two or three divorces that you just heard about and quickly nod your head. But the fact that those examples are easily available to you doesn’t mean that the divorce rate is, in fact, any higher than it used to be. (Statistically, by the way, the divorce rate has fallen slightly in the last 25 years.) Our brains are biased toward the available and we make statistical inferences from those examples, not the total set of relevant information.
For example, if you ask people to come up with 12 examples of their own assertiveness and then rate themselves on how assertive they are, they will likely rate themselves lower than if they are asked to provide only 6 examples. Why? Because the first few examples come easily, but then it gets harder, and when we go to self-rate, we remember the struggle to find those later examples, and that struggle biases our self-rating downward. With only 6, we struggle less, so that experience is less available and does not bias our judgment.
If we turn to judging things like the risk of a stranger abduction or other childhood dangers, one of the main sources of available information is media coverage. If you don’t know the real statistics, you will grab for information that is available, and that’s often that story you saw on TV about the little girl who was kidnapped from her bedroom (or that plane that crashed last week, or that story about some rare disease). As Lenore points out early in Free-Range Kids, the media, both through the “news” and dramas like CSI, focus on the extreme and on the rare, and because our experience of the media dominates our own store of recent information, it feeds right into the availability heuristic/bias. Looking to make a judgment about our children’s safety, we are ripe for falling for that cognitive bias.