A Collection of Strange Beliefs, Amusing Deceptions, and Dangerous Delusions


availability error

Most important human judgments are made under conditions of uncertainty. We use heuristics, or rules of thumb, to guide us in such instances, as we try to determine which belief or action is most likely to be correct in a given situation. These rules of thumb are often instinctive and irrational. Psychologists such as Thomas Gilovich, Daniel Kahneman, and Amos Tversky have studied several important heuristics and the errors associated with their use. One of these is the availability heuristic: judging probability "by the ease with which relevant examples come to mind" (Groopman 2007: p. 64) or "by the first thing that comes to mind" (Sutherland 1992: p. 11).

The problem with the availability heuristic is that what is available at any given time is often determined by factors that lead to an irrational or erroneous decision. Dr. Jerome Groopman gives the example of a doctor who, over a period of several weeks, had treated "scores of patients" with "a nasty virus" causing viral pneumonia. Then a patient presented with similar symptoms, except that her chest x-ray "did not show the characteristic white streaks of viral pneumonia." The doctor diagnosed her as being in the early stages of the illness. He was wrong. Another doctor correctly diagnosed her as suffering from aspirin toxicity. The diagnosis of viral pneumonia was available because of the recent experience of many cases of the illness. Had his recent experience not included so many cases of viral pneumonia, the doctor would likely have made the right diagnosis. After he realized his mistake, he said, "it was an absolutely classic case--the rapid breathing, the shift in her blood electrolytes--and I missed it. I got cavalier."

Groopman provides another example of the availability heuristic leading to misdiagnosis and error, this time in the emergency room. A physician in the emergency room must make many quick life-or-death decisions, but "being quick and shooting from the hip are indications of ... availability" (Groopman 2007: p. 75). If a physician seems to make a snap judgment, Groopman advises us to ask the ER physician, "What's the worst thing this can be?" This, he says, "can slow down the doctor's pace and help him think more broadly."

The availability error explains, in part, the irrational behavior of those who, after 9/11, assaulted anyone they thought looked Middle Eastern. It explains, in part, the current rash of attacks on Mexicans and Mexican Americans in the U.S. Some politicians and some journalists have made immigration a hot-button issue and made ethnicity available as a reason for venting frustration at the economic situation in the country.

Lotteries do not try to sell tickets by emphasizing the statistical odds any ticket has of winning. Those who advertise lotteries do not want the first thing that comes to a potential ticket-buyer's mind to be the thought that he has a one-in-40-million chance of winning. Instead, they publicize recent winners in the hope that what will come to mind when the chance to buy a ticket arrives is the happy winner. A person is more likely to buy a ticket if the first thing that comes to mind is winning rather than losing. (You've probably got a better chance of being killed in a car accident driving to buy your lottery ticket than you do of winning the lottery.)
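The parenthetical claim above is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, with everything except the one-in-40-million figure being an assumption supplied for illustration (the fatality rate is roughly the oft-cited U.S. figure of about 1.1 deaths per 100 million vehicle-miles; the trip length is invented):

```python
# Rough comparison; all inputs except p_win are illustrative assumptions.
p_win = 1 / 40_000_000        # jackpot odds quoted in the text
deaths_per_mile = 1.1e-8      # ASSUMED ~1.1 deaths per 100M vehicle-miles
round_trip_miles = 5          # ASSUMED length of the drive to buy the ticket

p_killed = deaths_per_mile * round_trip_miles

print(f"P(win jackpot)      = {p_win:.1e}")
print(f"P(killed en route)  = {p_killed:.1e}")
```

With these made-up but plausible numbers, the drive is about twice as risky as the ticket is lucky; a shorter trip or better jackpot odds would narrow or flip the comparison, which is exactly why such claims should be hedged.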

No matter how much knowledge one has, one's experiences can undermine that knowledge when it comes time to apply it in a concrete situation. Experiences with deep emotional impact will affect one's judgment and trump one's knowledge. Your brother was killed in a plane crash, so you decide never to fly. But you drive thousands of miles a year to do concerts, even though the odds of being killed in a car crash are significantly greater than the odds of being killed in an airplane crash.

Anything that creates a vivid image is likely to override other, perhaps more rational, choices one might make. Advertisers don't worry that many of their words make no sense when looked at carefully. What matters are the images. Stuart Sutherland claims there are studies that show people can remember thousands of photographs a week after seeing them just once (1992: p. 15). How many words would we remember from a list of thousands a week later? Images stick in the mind, whereas words are often quickly forgotten. Seeing the Rodney King beating over and over again or seeing the Humane Society undercover film of cattle being tormented in a meat-processing plant can affect a viewer's judgment profoundly. But are the films representative of a general problem with police brutality and animal abuse, or were they aberrations? Even if the films completely misrepresent the truth, the images will overpower any words that try to make that point.

For most people, concrete data is more available than abstract data. Some think this is why the rate of correct solutions to the Wason selection task goes up when the problem is put in concrete rather than abstract terms.

The stock market is another place where the availability error shows itself. Most people wouldn't think of buying a stock that has recently fallen in value, yet a good way to make money in the market is to buy low and sell high. (It's not the only way, of course. You can make money by earning dividends and holding on to a good stock for a long time, or you can do it the way Martha Stewart did, with insider information.) Yet most people will only consider buying a stock if it's doing well, i.e., trading at a high price. Some people, apparently, buy stock on the advice of their hairdresser or of a stranger who sent them an email. The advice is concrete and readily available, but probably wrong.

As a teacher, I experienced a kind of reverse availability problem. After the fourth or fifth student during a term had told me that they missed an exam because a grandparent had died, I became suspicious. I usually had about 150 students a semester and some of them were old enough to be grandparents themselves. My guess is that they couldn't have had more than 400 grandparents among the lot of them. What are the odds that four or five of 400 grandparents of students in my classes in the Sacramento area would die within a 16-week period? I had no idea, so I said nothing. But it did give me pause.
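The question the author leaves open can be sketched with a simple binomial model. The 400 grandparents and the 16-week term come from the paragraph above; the 2% annual mortality rate for the grandparent age bracket is an assumption supplied here, not a figure from the text:

```python
from math import comb

# Back-of-envelope sketch. Only n and the 16-week term are from the article;
# the mortality rate is an ASSUMED round number for the grandparent age bracket.
n = 400                             # grandparents among ~150 students (author's guess)
annual_mortality = 0.02             # ASSUMED ~2% per year
p = annual_mortality * 16 / 52      # chance one grandparent dies during the term

def prob_at_least(k_min, n, p):
    """P(at least k_min deaths) under a binomial(n, p) model."""
    return 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min))

print(f"P(4 or more of {n} die in 16 weeks) = {prob_at_least(4, n, p):.2f}")
```

Under these admittedly rough assumptions the probability comes out to roughly one in four, so four such deaths in a term is not, by itself, wildly improbable; the real ground for suspicion is that the excuses clustered so neatly around exam dates.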

One of the things that is disturbing about the availability error is the ease with which we can be manipulated by writers, filmmakers, pollsters, or anybody who presents us with a stream of words or images over whose sequence they have control. The order in which ideas, words, and images are presented affects our judgment. Sutherland notes experiments done by Solomon Asch demonstrating that earlier items in a sequence influence judgment more than later items. Earlier items are more available to our minds than later items. It has been known for some time that you get different results when you reorder questions in a poll: earlier answers influence later ones. In my critical thinking text, I advise that when evaluating extended arguments one try to read the argument at least once without making any judgments about the claims made. The reason is that once you start classifying or categorizing items, it will affect how you understand and evaluate later items. In short, you will bias your judgment if you start making judgments too early. This is true of any kind of investigation. If you make an early assessment, it will color your later evaluation of items, and you will often find that your seemingly brilliant work was simply confirmation bias.

First impressions make lasting impressions.

See also affect bias, anchoring effect, representativeness error, and the hidden persuaders. My Unnatural Acts blog also has an entry on the availability bias.

further reading

Ariely, Dan. 2008. Predictably Irrational: The Hidden Forces That Shape Our Decisions. HarperCollins.

Gardner, Daniel. 2008. The Science of Fear: Why We Fear the Things We Shouldn't--and Put Ourselves in Greater Danger. Dutton.

Gilovich, Thomas. 1993. How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. Free Press.

Gilovich, Thomas, Dale Griffin, and Daniel Kahneman, eds. 2002. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press.

Groopman, Jerome, M.D. 2007. How Doctors Think. Houghton Mifflin.

Kahneman, Daniel, Paul Slovic, and Amos Tversky, eds. 1982. Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press.

Kida, Thomas. 2006. Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Prometheus.

Levine, Robert. 2003. The Power of Persuasion: How We're Bought and Sold. John Wiley & Sons.

Slovic, Paul. 2000. The Perception of Risk. Earthscan.

Sutherland, Stuart. 1992. Irrationality. Rev. 2nd ed. Pinter and Martin.

Last updated 30-Dec-2013

© Copyright 1994-2012 Robert T. Carroll