Monday, November 3, 2014

Deadly Cows and Living Inside of Whales

Humans aren’t good at estimating probabilities. My sense for how likely something is usually arises from a kind of intuitive hunch about it that comes from the comfort level of the idea. If it feels cognitively uncomfortable, like if someone says, “Eggplants usually grow to weigh more than 20 lbs.,” I’m skeptical. But if it feels familiar or comfortable in my head, like, “Justin Bieber actually can’t play any musical instruments,” I’m inclined to accept it.

Behavioral economists like Daniel Kahneman and Amos Tversky have identified a bias that lives here. It’s called Availability Bias. Formally, the idea is that we are prone to mistake the ease with which something is called to mind, or the subjective comfort we have with an idea, for its objective probability. In one famous experiment, subjects were asked to estimate whether there are more English words that begin with “r” or that have “r” as the third letter. Since calling words to mind that start with “r” is easy, most people answer that those are more common. But being able to readily recall words that start with a given letter is a quirky feature of the human brain. We’re good at that, just like we’re pretty good at memorizing 7-digit strings of numbers like phone numbers, while 13-digit strings are much harder. Our brains are not well-equipped to do a systematic search for words with “r” in the third position, and we mistake that subjective difficulty for those words’ real frequency in the world.

The mistake comes out in lots of places. During Shark Week on the Discovery Channel, people are more afraid of shark attacks and estimate their likelihood as higher. Cows, it turns out, are the real menace to society. You are more than 20 times more likely to be killed by cows than by a shark. People think that zombies and vampires are more plausible now than they did several decades ago when they weren’t such a large part of pop culture. We all drive more conscientiously for the weeks after someone we know has been in a serious car wreck, and we watch our salt intake more closely when a relative has recently had a heart attack.

Now I’m going to suggest something that may offend. Why is it that so many people who are otherwise quite reasonable, and who would never believe similar claims out of context, claim to actually believe outlandish stories like Jonah and the Whale, the Genesis Creation story, the claim that Noah lived to be 900 years old, the story of Joseph Smith being visited by an angel who gave him the Book of Mormon, the story of Mohammed being visited by an angel who gave him the Koran, and Paul having a seizure and hearing the voice of God? If comparable stories were offered with different, unfamiliar details, I think most Americans would be skeptical at the very least. In fact, if you’re the typical mainstream American Christian, you are skeptical about the Mohammed story and the Joseph Smith story. But tens of millions, perhaps hundreds of millions of Americans take these claims to be true. I take it as pretty obvious that they are not. And I think it’s availability bias that has artificially elevated their plausibility in people’s minds. How did these stories become so available to so many people? When people hear such stories over and over, hundreds or thousands of times through childhood and into adulthood, and when the stories are treated as momentous, or even historical, it has the same effect on our probability judgments as Shark Week. When we see portrayals, reread the stories, and hear them repeated and treated with reverence thousands or tens of thousands of times, they come to feel familiar, vivid, and, ultimately, probable. The readiness with which an idea can be called to mind gets mistaken for its reality. I don’t think it is too much to suggest that our widespread acceptance of the story of Jesus’ return from the dead can be attributed, at least in part, to availability bias too.

I’m not offering an argument here for why I think these claims are false. I think you’ll already agree with me in the abstract that it is exceedingly unlikely that a person could survive for several days after being eaten by a whale. And humans do not, as far as we know, ever live much past 100 years, even with all of the benefits of modern medicine and health research. And the religious examples aside, you’ve never heard a single plausible report of a human actually being biologically dead and then returning to normal living function after three days. You already share my skepticism about other claims like these. In other contexts, you already agree that such claims are outrageous. So why is it that the religious examples don’t feel more cognitively uncomfortable? Availability bias.

Here’s a meta-level way to think about your own disposition to make this mistake (and lots of others). We all possess nervous systems that do a pretty good job of solving some problems. But we know that when we put them in certain environments, or frequently expose them to certain kinds of stimuli, like mythological stories, those exposures skew the system’s capacity to sort true from false, probable from improbable, reasonable from unreasonable. And knowing that we are built that way facilitates our efforts to compensate and correct for the bias. 

Matt McCormick
Department of Philosophy
Sacramento State


  1. Matt. Yes, availability bias must be a big part of the story for why otherwise sensible people accept such clearly false claims as Jonah living after being eaten by a whale. I myself was once eaten by a whale, and I did not survive as a physical body, but I am now writing this sentence from heaven, and am sending it down to my avatar that teaches at Sac State and can post remarks in the Dance of Reason.

  2. Matt, this is very interesting, thanks. Let me try to get something straight by making this personal. Are you saying that stories like the resurrection, because of the cognitive ease induced by frequent repetition, even have that effect on you, an arch atheist? It seems to me that the answer to this must be yes. (I suspect you will even allow that your massive exposure to science fiction requires you to make the kind of correction you prescribe in the last paragraph.)

    If so, then it seems like what you are saying opens up this line of thought: In fact, the availability heuristic isn't necessary to explain why followers of a particular religion accept these stories as truths. Sure, they get a feeling of cognitive ease when they consider them, but, for the vast majority, the reason for that isn't repetition, but that they are encoded as knowledge: they were taught these stories as truths from a very early age by people they trust. (Kahneman and Tversky, after all, introduced the availability heuristic to account for how we estimate the probability of a claim when we don't have access to the relevant information.)

    So it seems to me that what you are really explaining is why non-believers within a believing culture find it so difficult, despite their best efforts, to adopt a properly dismissive attitude to these stories. This could help to explain why atheists in our culture are sometimes inexplicably angry. They resent how much power the stories have over them and the guilt they really feel when they proclaim them false.

    1. Thanks Matt, I agree with Randy that it has to be more than exposure to the idea. Another factor is likely to be exposure to the judgments of others that the story is true. When everyone around you thinks that something is true, including all of your relevant "thought leaders", then you don't need to waste cognitive resources assessing the belief. If you assess and agree, then you've wasted your time. If you assess and disagree, then you have just ostracized yourself from your community. Any assessing of the beliefs of your community's thought leaders might even lead to punishment if the moral value of respect for authority is taken very seriously in said community.

  3. Thanks for the replies. Lots of good ideas here. Yes, Randy, I think lots of non-believers find themselves protesting too loudly about outlandish religious ideas in part because those ideas have more hold on their minds than they should. No one in a predominantly Christian culture gets all bent out of shape trying to reject ancient Mesopotamian myths because they aren't "live" options, to use William James' term.

    Also, there is some research, which I can't find at the moment, showing that the mere repetition of some trope, whether it's portrayed as fiction or truth, has the effect of elevating people's acceptance of it. That's why, for instance, the notions that Obama isn't an American citizen, that Obama is a Muslim, or that 9-11 was a conspiracy persist. Even bringing those up in order to debunk them has the effect of elevating their plausibility and giving them more traction or mind space. People forget the context, the debunking, and the source, but they vaguely remember hearing something about Obama's being a Muslim. There's really very little need to repeat your message as truth, or work very hard at getting people to encode it as knowledge, as you say. You just need to say it again and again, and the cognitive comfort takes hold. That's why 17% of Americans still think he's a Muslim, and over 30% of Republicans do, because it fits so comfortably within the rest of their belief structures about him. Kahneman put the fancy research on the point, but Orwell nailed it long ago.

  4. Matt, right, Kahneman provides an interesting example in which you don't even have to complete the entire message. Subjects who kept hearing the phrase "The temperature outside is" were apparently likely to assign a higher probability to the claim "The temperature outside is 120 degrees" than people who did not.

  5. Oops. I lost my first post. Here's my attempt to reconstruct it more briefly. Might availability bias help explain any of the following cases as well, if we filled in the details properly? (Some of these, I expect you'd say 'yes' to based on what you've already written here.)

    1. why natural explanation N1 seems more reasonable than natural explanation N2.
    2. why super-natural explanation S1 seems more reasonable than super-natural explanation S2.

    3. why natural explanation N1 seems more reasonable than not having any explanation at all.
    4. why super-natural explanation S1 seems more reasonable than not having any explanation at all.

    5. why natural explanation N1 seems more reasonable than super-natural explanation S1.
    6. why super-natural explanation S1 seems more reasonable than natural explanation N1.

    7. why natural explanation N1 seems more reasonable than any super-natural explanation.
    8. why super-natural explanation S1 seems more reasonable than any natural explanation.

  6. Greetings, Professor McCormick. Thank you for the post. I found it intriguing.

    I’m wondering if you would concede that availability bias can be valuable in certain circumstances. Suppose that some virus infected a large population of sharks, causing them to seek out and attack human beings in shallow waters. Further, suppose this infection coincided with Shark Week. While the heightened fear of sharks might be irrational due to availability bias, it would also be fortuitous due to the increase in danger. So, more people would stay out of the water for bad reasons, but more would also stay alive. The same might be said for the belief in the resurrection of Jesus. Even if it is based on the availability of the story, belief in the truth of the story may have eternally beneficial consequences.

    I’m not advocating irrationality. I’m happy to vigorously wave the flag of reason right beside you. However, if people are going to be irrational, I’d rather the result be in their favor. I’m sure you’d agree that, in order to get at whether the result is beneficial, the claims must be evaluated on their own merit. Is it at least fair to say, however, that there are times when availability bias might save the day?

  7. Good point Allen. Right, the evolutionary value of a belief or bias may or may not track with the truth. Dennett and McKay have an important article on the evolution of misbelief where they show lots of cases where believing something false could be advantageous.

  8. Hello Professor McCormick,

    I was wondering if subjective comfort is being used synonymously with subjective probability? If it is not, does one lead to another before being mistaken for objective probability?

    For example, does the fact that I have not seen many 20 lb eggplants set my level of subjective comfort about eggplants? Or, if a person has fooled themselves into believing they have seen many miracles, in combination with hearing about many miracles, does that set their subjective comfort about the Christian god? Or does their subjective comfort make them prone to see miracles? Another example: does my lack of evidence (even though I may not be looking for any) showing Obama is not a Muslim set my comfort level about Obama being a Muslim, before I set a mistaken objective probability (sp1 > sc1 > sp2)?

    If there is a distinction between the two terms, can one come before the other at any time, or does one have to come before the other at all times?

  9. Thanks Michael. Interesting questions. I'm not using subjective comfort with subjective probability, although I am suggesting that they can be closely tied in some cases. Yes, I think in many cases, repeated exposure to some story, scenario or event leads people, unless they are diligent and thoughtful, to have a higher estimation of the probability of that event's occurring or being real. I'm not clear on some of your questions, but I think that being embedded in a religious culture and hearing lots of accounts of miracles, for instance, then leads to its being easier to interpret new events as miracles. And those new "miracles" then contribute to strengthening one's confidence about the religious culture accounts you are hearing. My suspicion is that the empirical evidence bears this out, but it would be hard to tease out all of the issues. Hope that helps.