Tuesday, 23 September 2008

Conspiracy theories and their cognitive basis


Below is a response I typed up to an email about conspiracy theories:

I have a problem with most so-called "conspiracy theories" and their relatives, mostly because they all assign intentional causality to events, i.e. they assume that events have been orchestrated and planned by some form of illuminati or lizard people. While I don't dispute that events happen the way they do, or that they are caused by people, I think it is more likely that these events are emergent phenomena that occur due to the complex interactions of the people working within a specific system.  

It's easy to say that Rupert Murdoch (owner of a huge chunk of international news and other media through his News Corp) has evil plans for the domination of world culture, but what is more likely the case is that he, like everyone else, believes that his way of thinking is best, and so runs his business and makes investments based on his own personal ethos. Unfortunately, the knock-on effect of this is that any decision he makes affects millions, due to his position of power and influence. Some like to think that he sits behind the scenes pulling the strings, but all he is doing is acting as many others do to create the world they would want to live in, the difference being the scale of the stage he is acting upon. 

Conspiracy theorists operate by the "hindsight is 20/20" rule. They essentially pick and choose the pieces that make a great story. They are subject to confirmation bias, as are we all: their brains essentially filter out non-relevant pieces of info and only take note of those things that support their central hypotheses, e.g. "man didn't land on the moon". We all do this, but most of us don't focus on perceived global domination by an elite few. 

Depending on how you look at most situations, you can either imply some kind of intentional causation, or you can view people as fallible and human (and thus subject to human desires and a human level of understanding). The latter is probably the more realistic view, since no matter who you are, or who we are talking about in history, every person is subject to the same limitations that make us human. It doesn't matter whether you are Gandhi, Mother Teresa (another story altogether), Bill Clinton or Emperor Constantine. 

Anyway, that's a very, very rough and convoluted reason for why I tend to discount things like the Templar myths and the like. Granted, I haven't read as much on the topic as I maybe should before dismissing it outright, but I struggle to pass up an opportunity to rant... :) 

P.S. While it's true that you can't believe everything you read on Wiki, as you say, it is generally annotated with footnotes, and its self-regulating nature generally leads to a more robust source of knowledge than any single person could create. If you look at comparisons between Wiki and other encyclopaedias like Britannica, they actually have similar levels of factual errors, while Wiki has a greater depth of information that you could only gain by having someone who is truly passionate and knowledgeable about a subject write about it.  

Are you familiar with the wisdom of the crowd? I think this phenomenon comes into effect in Wiki - another emergent property of the system if you will. 
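To illustrate why the wisdom of the crowd works, here's a toy simulation (all the numbers — the "true" value, the noise level, the crowd size — are purely illustrative, not from any real study). Each person makes a noisy but unbiased guess; averaging many such guesses cancels out most of the noise:

```python
import random

random.seed(42)

TRUE_VALUE = 1000  # e.g. the true number of beans in a jar

def individual_guess():
    # Each person is noisy but unbiased: right on average,
    # but any single guess can be far off.
    return TRUE_VALUE + random.gauss(0, 300)

guesses = [individual_guess() for _ in range(1000)]
crowd_estimate = sum(guesses) / len(guesses)

avg_individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_VALUE)

print(f"average individual error: {avg_individual_error:.0f}")
print(f"crowd (mean) error:       {crowd_error:.0f}")
```

The crowd's average lands far closer to the truth than the typical individual does — which is the sense in which accuracy "emerges" from the system even though no single contributor is particularly accurate. The caveat, of course, is that the errors must be independent; if everyone copies everyone else, the magic disappears.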

P.P.S. Check out a larger list of some of the cognitive biases that we employ in our minds every second of every day. These biases act as the lenses through which we view the world and every single one of us is subject to them (generally without our realising it). Essentially, our brains have limited processing power (we can process a lot, but there is a limit), and we are subjected to more stimuli in any one second than our brain can process, hence we have these mental shortcuts to help us get through the day by making the job of processing everything simpler. Some call them biases, others call them heuristics, but these are essentially rules of thumb that we employ subconsciously in order to process more information and stimuli more quickly.  

They helped our ancestors survive by cutting out a lot of the hard work. For example, we have a predisposition (or bias) towards listening to a person in a position of authority. This is useful when you are on the plains of Africa and you spot those poisonous berries that your mother told you not to eat. Rather than test that hypothesis yourself, it's a lot easier (and safer) to assume that what your parents say is correct. 

Another example: we are predisposed towards going with the crowd (known as the bandwagon effect). We assume that if many people say something is true, then it must be. This saves us the hard work of having to find out for ourselves. 
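The bandwagon effect can be sketched as a toy "information cascade" model (the signal accuracy, crowd-following threshold, and agent count below are all made-up parameters for illustration). Each agent gets a private signal that is usually right, but defers to the crowd once earlier choices lean strongly one way:

```python
import random

random.seed(1)

def run_cascade(p_correct=0.7, n_agents=50):
    """Each agent gets a noisy private signal (right with probability
    p_correct) but can also see what everyone before them chose. If the
    earlier choices lean strongly one way, the agent ignores its own
    signal and joins the crowd -- the bandwagon effect."""
    choices = []
    for _ in range(n_agents):
        signal = random.random() < p_correct  # True = the correct option
        lead = choices.count(True) - choices.count(False)
        if lead >= 2:
            choices.append(True)    # crowd leans True, follow it
        elif lead <= -2:
            choices.append(False)   # crowd leans False, follow it
        else:
            choices.append(signal)  # otherwise trust your own signal
    return choices

# If the first couple of people happen to get wrong signals, everyone
# afterwards can lock onto the wrong answer despite good private info.
wrong_cascades = sum(
    run_cascade().count(False) > 25 for _ in range(1000)
)
print(f"runs where the crowd mostly got it wrong: {wrong_cascades}/1000")
```

In a noticeable fraction of runs the whole crowd settles on the wrong answer, even though each individual's private signal was right 70% of the time — the shortcut that usually saves us effort occasionally marches everyone off a cliff together.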

The point is that conspiracy theories generally come about as a result of our complex brains acting in the way they have evolved to act, but in this case the outcome is a deviation from the truth rather than a closer approximation of it. 

Anyway, I am really rambling now :P 

P.S. the image for this post comes from this Worth1000 entry

Wednesday, 10 September 2008

Steven Pinker on morality


Steven Pinker wrote an article on morality for the New York Times in January this year, called The Moral Instinct. It's a great summary of how morals and ethics could have evolved from a naturalistic, evolutionary perspective. I bring it up now because it has been sitting in the back of my mind ever since I read it at the beginning of the year. Morals represent one of the remaining "strongholds" of the religious (or so they believe). They argue that you can't have morals without God. It's a case of the god-of-the-gaps phenomenon in action - god has been squeezed out of most areas of life, leaving religious apologists holed up in the more insubstantial, conceptual areas such as morals.

Then along comes Steven Pinker and co, and they tear apologists' arguments in this area to shreds. I attended one of Pinker's lectures in January, right after having read the article. He basically repeated his Authors@Google lecture verbatim, which really disappointed me. However, afterwards we went up and asked him a few questions, which he dutifully answered. I then mentioned that I had just read his New York Times article and found it fascinating. It was very interesting to see how he visibly perked up and seemed a lot more lively at this mention. Add to this the fact that he often finds himself in the company of Richard Dawkins and other anti-religion freethinkers, and it makes me think that he is seriously moving into this area himself.

What actually sparked this whole little discussion about Pinker and his NYT article was that a colleague mentioned he was reading The Blank Slate, and that it gives very scientific explanations for much religious thinking. So perhaps Pinker has actually been there all along and I just need to read more, which is never a bad thing! :)

Monday, 01 September 2008

Understanding our minds: Fear as an example of cognitive biases at play




In my last post, I made mention of how some people's lives are ruled by fear. I briefly talked about how their (and everyone else's) choices are made without rationally weighing up the reality of the situation. Obviously, I am not advocating that humans should aspire to being purely rational beings, as in doing so we would likely lose some of what makes us human. However, what I am saying is that we need to be aware of how often we distort the truth, especially when confronted with fearful situations.

In stressful or fearful situations, we are more likely to throw reason out the window and react with our gut feelings. Put another way, we are more likely to fall back on the in-built biases (a.k.a. heuristics) in our psyche, because they are so innate to who we are. Examples of this might be taking the advice of a person in a perceived position of power over the advice of someone else, or using the crowd's actions as an indication of what we should be doing (see herd instinct).

When a crisis hits, whether real or imagined (see moral panics), we are more likely to fall back on these "rules of thumb". These rules, or heuristics, are some of the first we learn, and to some extent have probably been ingrained in us through evolutionary processes, as they serve to protect us in a dangerous world where we have limited information. For example, if in the distant past I saw a crowd of people (say, my tribe) running in the opposite direction to me, I could take this as a pretty clear indication that I did not want to find out what they were running away from. You wouldn't even need to think twice to make the decision to turn around and run away with them. Situations of this nature, repeated over millennia, likely helped remove those brave souls who steadfastly continued on their path towards danger rather than joining the group and running away (today we give these folks Darwin Awards). Perhaps this gives us an indication as to how our sheep-like tendencies formed?

Similarly, we have been taught since birth to accept advice, or do what we are told, when instruction comes from a "higher authority". Most of us wouldn't be here today if we hadn't listened to our parents and teachers when they told us not to eat that bright purple rat poison (okay, an extreme example), or to look both ways before crossing the road. At the simplest and earliest stages of our lives, we rely upon our parents absolutely, and this teaches us to listen to authority. I am not pointing this out in an anarchist sense (i.e. we need to rebel) or in a conspiracy-theory sense (i.e. we are all being mind controlled). That's just the way it is: it is a basic rule of thumb that we carry with us throughout our lives, generally employed at a subconscious level.

Anyway, the point of this is that people are more likely to throw reason out the window when confronted with fearful or stressful situations, and there is a very real explanation for this from evolutionary psychology and cognitive perspectives (see cognitive psychology and computational theory of mind).

Daniel Gardner has written a book about this subject (I haven't read it though), called The Science of Fear (there's no Wiki on the book unfortunately). Here's the marketing blurb on the book:

"From terror attacks to the war on terror, real estate bubbles to the price of oil, sexual predators to poisoned food from China, our list of fears is ever-growing. And yet, we are the safest and healthiest humans in history. Irrational fear seems to be taking over, often with tragic results. For example, in the months after 9/11, when people decided to drive instead of fly -- believing they were avoiding risk -- road deaths rose by more than 1,500. In this fascinating, lucid, and thoroughly entertaining examination of how humans process risk, journalist Dan Gardner had the exclusive cooperation of Paul Slovic, the world renowned risk-science pioneer, as he reveals how our hunter gatherer brains struggle to make sense of a world utterly unlike the one that made them. Filled with illuminating real world examples, interviews with experts, and fast-paced, lean storytelling, The Science of Fear shows why it is truer than ever that the worst thing we have to fear is fear itself."

Here's an extract from one of the Amazon reviews, which captures much of what I am saying or touched on in the previous post:

"Gardner is eager to have us understand how these Systems work. He contends that we are carrying a reaction system founded on our ancestors' time on the African savannah. Our brains haven't adapted to the fast-paced, high technology world around us. We are reacting almost entirely with The Gut, and we are making serious mistakes as a result. Are we truly under threat from the things we claim to fear? He cites numerous cases, from the fear of "man-made" chemicals through the spectre of cancer to the possibility of our children being assaulted by strangers. Each of the topics is introduced with our given views - usually captured by polls, then carefully assessed by examining the real odds. In every case, the important things to consider almost certainly haven't been. The breast cancer campaigns have uniformly overlooked the role of age in determining the likelihood of its occurrence. 

The calculations leave little doubt that we are far too often looking at threats with little consideration of their true nature. Why are we reacting so readily with The Gut instead of with The Head? In no small part, Gardner argues, media, politicians and industry play a significant part. Media, anxious to sell its products, emphasizes the violent, the extreme and the bizarre. The result, of course, is that's what captures our attention. The bombardment of such stories, often unthinkingly repeated by politicians, is a reinforcement of The Gut's reaction to this kind of information. Never seeing a rational analysis of such news, we lose any sense of proportion about what is truly important. We rarely find the opportunity to consider an issue rationally before the next one is upon us."

I would imagine that Naomi Klein's The Shock Doctrine would be along similar lines, whether she realises it or not - hopefully she does :)