Monday, 01 September 2008

Understanding our minds: Fear as an example of cognitive biases at play




In my last post, I mentioned that some people's lives are ruled by fear, and briefly talked about how their (and everyone else's) choices are made without rationally weighing up the reality of the situation. Obviously, I am not advocating that humans should aspire to be purely rational beings; in doing so, we would likely lose some of what makes us human. What I am saying is that we need to be aware of how often we distort the truth, especially when confronted with fearful situations.

In stressful or fearful situations, we are more likely to throw reason out the window and react with our gut feelings. Put another way, we are more likely to fall back on the in-built biases (a.k.a. heuristics) in our psyche, as they are so innate to who we are. Examples might be taking the advice of a person in a perceived position of power over the advice of someone else, or using the crowd's actions as a guide for what we should be doing (see herd instinct).

When a crisis hits, whether real or imagined (see moral panics), we are more likely to fall back on these "rules of thumb". These rules, or heuristics, are some of the first we learn, and to some extent have probably been ingrained in us through evolutionary processes, as they serve to protect us in a dangerous world where we have limited information. For example, if in the distant past I saw a crowd of people (say, my tribe) running in the opposite direction to me, I could take this as a pretty clear indication that I did not want to find out what they were running away from. You wouldn't even need to think twice before turning around and running away with them. Situations of this nature, repeated over millennia, likely weeded out those brave souls who steadfastly continued on their path towards danger rather than joining the group and running away (today we give these folks Darwin Awards). Perhaps this gives us an indication of how our sheep-like tendencies formed?

Similarly, we have been taught since birth to accept advice, or do what we are told, when instruction comes from a "higher authority". Most of us wouldn't be here today if we hadn't listened to our parents and teachers when they told us not to eat that bright purple rat poison (okay, an extreme example), or to look both ways before crossing the road. At the simplest and earliest stages of our lives, we rely upon our parents absolutely, and this teaches us to listen to authority. I am not pointing this out in an anarchist sense (i.e. we need to rebel) or in a conspiracy-theory sense (i.e. we are all being mind-controlled). That's just the way it is: it is a basic rule of thumb that we carry with us throughout our lives, generally employed at a sub-conscious level.

Anyway, the point is that people are more likely to throw reason out the window when confronted with fearful or stressful situations, and there is a very real explanation for this from evolutionary psychology and cognitive perspectives (see cognitive psychology and the computational theory of mind).

Daniel Gardner has written a book about this subject (I haven't read it though), called The Science of Fear (there's no Wiki on the book unfortunately). Here's the marketing blurb on the book:

"From terror attacks to the war on terror, real estate bubbles to the price of oil, sexual predators to poisoned food from China, our list of fears is ever-growing. And yet, we are the safest and healthiest humans in history. Irrational fear seems to be taking over, often with tragic results. For example, in the months after 9/11, when people decided to drive instead of fly -- believing they were avoiding risk -- road deaths rose by more than 1,500. In this fascinating, lucid, and thoroughly entertaining examination of how humans process risk, journalist Dan Gardner had the exclusive cooperation of Paul Slovic, the world renowned risk-science pioneer, as he reveals how our hunter gatherer brains struggle to make sense of a world utterly unlike the one that made them. Filled with illuminating real world examples, interviews with experts, and fast-paced, lean storytelling, The Science of Fear shows why it is truer than ever that the worst thing we have to fear is fear itself."
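The blurb's driving-versus-flying point is really an expected-risk comparison that The Gut gets backwards. A toy calculation makes it concrete. Note: every number below is a made-up placeholder chosen for illustration, not a real accident statistic:

```python
# Toy comparison of per-trip fatality risk for two travel modes.
# All rates here are illustrative placeholders, NOT real statistics.

def deaths_per_trip(deaths_per_billion_km: float, trip_km: float) -> float:
    """Expected fatality probability for a single trip of trip_km."""
    return deaths_per_billion_km * trip_km / 1e9

# Hypothetical rates: driving carries far more risk per kilometre,
# even though flying *feels* more dangerous after a televised crash.
drive_risk = deaths_per_trip(deaths_per_billion_km=3.0, trip_km=1000)
fly_risk = deaths_per_trip(deaths_per_billion_km=0.05, trip_km=1000)

print(f"driving 1000 km: {drive_risk:.2e}")
print(f"flying 1000 km:  {fly_risk:.2e}")
print(f"driving is {drive_risk / fly_risk:.0f}x riskier for the same trip")
```

With these (invented) rates, the "safer-feeling" choice is sixty times riskier; scaled up over millions of post-9/11 trips, small per-trip differences become the extra road deaths the blurb describes.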

Here's an extract from one of the Amazon reviews, which captures much of what I am saying here, and what I touched on in the previous post:

"Gardner is eager to have us understand how these Systems work. He contends that we are carrying a reaction system founded on our ancestors' time on the African savannah. Our brains haven't adapted to the fast-paced, high technology world around us. We are reacting almost entirely with The Gut, and we are making serious mistakes as a result. Are we truly under threat from the things we claim to fear? He cites numerous cases, from the fear of "man-made" chemicals through the spectre of cancer to the possibility of our children being assaulted by strangers. Each of the topics is introduced with our given views - usually captured by polls, then carefully assessed by examining the real odds. In every case, the important things to consider almost certainly haven't been. The breast cancer campaigns have uniformly overlooked the role of age in determining the likelihood of its occurrence. 

The calculations leave little doubt that we are far too often looking at threats with little consideration of their true nature. Why are we reacting so readily with The Gut instead of with The Head? In no small part, Gardner argues, media, politicians and industry play a significant part. Media, anxious to sell its products, emphasizes the violent, the extreme and the bizarre. The result, of course, is that's what captures our attention. The bombardment of such stories, often unthinkingly repeated by politicians, is a reinforcement of The Gut's reaction to this kind of information. Never seeing a rational analysis of such news, we lose any sense of proportion about what is truly important. We rarely find the opportunity to consider an issue rationally before the next one is upon us."
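The reviewer's point about overlooking age is a classic base-rate effect, and Bayes' rule shows why it matters: the very same test result means very different things in groups with different underlying rates. The numbers below are invented for illustration, not medical data:

```python
# Bayes' rule sketch of why base rates (e.g. age-dependent prevalence)
# dominate how alarming a "positive" result should be.
# All figures are illustrative placeholders, NOT medical statistics.

def positive_predictive_value(prevalence: float,
                              sensitivity: float,
                              false_positive_rate: float) -> float:
    """P(condition | positive test), via Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Same hypothetical test applied to two groups with different base rates:
young = positive_predictive_value(prevalence=0.001, sensitivity=0.9,
                                  false_positive_rate=0.05)
older = positive_predictive_value(prevalence=0.02, sensitivity=0.9,
                                  false_positive_rate=0.05)

print(f"low-prevalence group:  {young:.1%} chance a positive is real")
print(f"high-prevalence group: {older:.1%} chance a positive is real")
```

With these made-up figures, a positive result in the low-prevalence group is still overwhelmingly likely to be a false alarm, while the same result in the high-prevalence group is an order of magnitude more meaningful. Ignore the base rate (The Gut's habit) and both results look equally terrifying.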

I would imagine that Naomi Klein's The Shock Doctrine would be along similar lines, whether she realises it or not - hopefully she does :)





