Acting Irrationally

 

There are a number of biases that affect the way we as ‘Knowers’ process information and which prevent us from acting rationally. Some of these are:

 

·         The Narrative bias

·         The Corroborative bias

·         The Experiential bias

·         The Humanisation effect

·         The Herding effect

·         Group polarisation

·         The Negative bias

 

Camus says that “Human beings are never rational, but they are consistently rationalising”. This is sometimes called ‘backwards rationalisation’: we do things first, without thinking them through properly, or because we are afraid, angry or otherwise emotional, and then, after we have acted, we come up with rationalisations, explanations or excuses for why the things we did without thinking were actually the sensible thing to do anyway.

 

 

The Narrative bias:

Creating a story around a piece of information feeds the narrative bias. Psychologists have found that the more detail we are given (whether or not that detail is actually relevant), the more inclined we are to believe the tale. This holds even when the extra detail in fact reduces the probability of the event in question (Kahneman and Tversky).

 

Consider the following two scenarios:

 

·         A: Tom is a healthy and fit 45-year-old. He dropped dead in a supermarket.

·         B: Tom is a healthy and fit 45-year-old. He dropped dead in a supermarket, and was discovered to have suffered from a heart defect that could have killed him at any time.

 

Most people choose B as the more likely.

 

But people do not realise that if B is ‘true’ then A is also ‘true’, because B is already contained as a possibility within A. In fact, B is less probable than A, because it restricts the possible reasons for Tom’s death. With A, any number of causes of death are possible; with B, those other possibilities are specifically excluded. Yet we still tend to believe that B is more likely.
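
In probability terms this is the conjunction rule: a specific event (death from a heart defect) can never be more probable than the general event that contains it (death from any cause). The short Python simulation below is a minimal sketch of the rule; the rates in it are invented purely for illustration.

```python
import random

random.seed(42)
N = 1_000_000

deaths = 0        # event A: Tom drops dead (any cause)
heart_deaths = 0  # event B: Tom drops dead AND the cause is a heart defect

for _ in range(N):
    died = random.random() < 0.001    # assumed chance of sudden death
    cardiac = random.random() < 0.3   # assumed chance the cause is cardiac
    if died:
        deaths += 1
        if cardiac:
            heart_deaths += 1

print(f"P(A) ~ {deaths / N:.6f}")        # death from any cause
print(f"P(B) ~ {heart_deaths / N:.6f}")  # death from a heart defect specifically
# Every B-case is counted as an A-case first, so P(B) <= P(A) by construction.
```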

 

Why don’t we realise this? It has something to do with the way our brains process information. We can easily imagine someone dying of an unknown heart condition (the explicit reason given in B). We may know someone who suffered something similar, we may have read about it, or it might simply be something we can picture. The narrative frame means our minds can more easily process and store the information, so B just seems more likely, because there is extra information.

 

 

The Corroborative (pattern) bias:

This bias leads us to seek out whatever confirms what we already think we know, and to dismiss data which conflicts with that knowledge.

 

Consider deaths from cancer in any given year. Can we tell from a patient’s age, sex and type of cancer whether or not they were a smoker?

 

What we think we know often leads us to false conclusions. Because we ‘know’ that smokers die younger and that smokers contract certain types of cancer (e.g. lung cancer) more easily, we look for those patterns to be reproduced. So we assume that data such as a death between the ages of 35 and 50 from lung cancer or similar points to a smoker’s behaviour, and we ignore the deaths that appear to indicate more natural causes: death aged 80+, breast or prostate cancer, or having already suffered and beaten cancer.

 

However, every smoker has at least one story of a friend or relative who smoked constantly, lived to be 100, and never contracted cancer. So, of the following patients, who is more likely to suffer from (and die of) cancer?

 

·         Patient A is male, 37 years old, a factory worker who smokes 60 cigarettes a day and has a history of minor chest complaints

·         Patient B is female, 72 years old, a retired teacher who has never smoked or drunk, and who had a benign growth removed last year

 

The single most relevant fact (but one which is almost always ignored) is age: age carries far greater weight in assessing cancer risk than anything else here, so B is more likely to die from cancer. But this does not conform to what we think we know, and so it is often the last thing to be considered.
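
A toy calculation makes the point. All of the numbers below are invented for illustration (real epidemiology is far more complicated); the only assumption built in is the one the paragraph relies on, namely that the age-related base rate dwarfs the multiplier that smoking adds.

```python
# Invented annual cancer mortality base rates by age (illustrative only).
base_rate = {37: 0.001, 72: 0.020}

# Assumed relative risk for a heavy smoker (illustrative only).
smoking_multiplier = 3.0

patient_a = base_rate[37] * smoking_multiplier  # 37-year-old heavy smoker
patient_b = base_rate[72]                       # 72-year-old who never smoked

print(f"Patient A: {patient_a:.3f}")  # 0.003
print(f"Patient B: {patient_b:.3f}")  # 0.020
# Even with smoking tripling the risk, the older patient's base rate dominates.
```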

 

 

The Experiential bias:

When we remember something happening recently, or as a common event, we judge it to be more probable.

 

Consider the specific example of Tom, the man with the undetected heart defect. The more experience we have of such a thing happening, the more likely we think it is to happen to others too. If we know someone who also died unexpectedly from a similar undiagnosed condition, we judge the story more likely. If that knowledge is recent, we inflate the estimated probability still further.

 

 

The Humanisation effect:

This is best described by the chilling quotation, “One death is a tragedy, a million merely a statistic.” [Stalin]

 

We can only conceive of a million people as a mass. We cannot see them as a collection of individuals with lives, feelings and families, and because of this we cannot grieve for them. The brain is not built to cope with such large numbers as concepts, and after a while they all begin to blur together. Studies by Kahneman and Tversky suggest this ‘blurring effect’ occurs with crowds as small as forty people.

 

So one death is a tragedy because it is not just a death: it is an imaginable and quantifiable loss of a life (with all the history, laughter, love and sorrow that it contained). However, we cannot conceive of a million individuals in the same way, so their lives do not affect us as powerfully. This is why charities choose the story of one sick child in Africa to draw attention to the plight of many. If we are told that millions are starving and need help in some way, we often feel helpless; one person’s suffering, however, is something that can be tackled directly and overcome.

 

 

The Herding effect:

The herding effect is explained by evolutionary theory. In the past, humans became social animals because working together gave us a competitive advantage over stronger, quicker, larger prey. Now we no longer need to hunt prey, but we still have a need to ‘belong’, to fit in with the group.

 

This is best illustrated by the experiments of Asch in the 1950s and Milgram in the early 1960s, and by the infamous Stanford Prison Experiment. When others do something, even something we would normally think of as immoral, such as delivering high-voltage electric shocks to other people in the Milgram experiments, we are more likely to do it ourselves, because of this continuing need to run with the herd.

 

This is also the reason why social punishments, such as ‘naming and shaming’, appear to be so effective in controlling low-level community crime: often, the fear of being stigmatised by one’s community is greater than the fear of a more official sanction by the state, such as serving time in jail.

 

 

Group polarisation:

This works in tandem with the herding effect: once a particular herd is joined, members strive to be as much like the other members as possible; they try to be the best, most stereotypical example of the group that they can be. This does not mean that everyone tries to be the same, because there are lots of different herds; even loners and individualists join the ‘loner’ herd, and feel validated by it.

 

This is best illustrated by special interest or lobby groups such as environmentalists, religious sects or campaigners where there is a competition to be the most green, the most devout, and so on. This polarisation can lead to dangerous extremes.

 

 

The Negative bias:

This is our sense of pessimism: when faced with two potential outcomes, we judge the more negative one to be more probable, and this leads us to put greater effort into guarding against negative outcomes than into working towards positive ones. Essentially, we are more worried by the potential costs of getting something wrong than by the possible benefits of getting it right, even when we know that the statistics may favour the more positive option.
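
A worked expected-value comparison, with made-up payoffs, shows the situation just described: the riskier option is statistically better, yet the possible loss is what captures our attention.

```python
def expected_value(p_win: float, win: float, lose: float) -> float:
    """Probability-weighted average outcome of a two-outcome gamble."""
    return p_win * win + (1 - p_win) * lose

safe = expected_value(1.0, 50, 0)      # a guaranteed 50
risky = expected_value(0.8, 120, -20)  # 0.8*120 + 0.2*(-20) = 92

print(safe, risky)  # 50.0 92.0
# The statistics favour the risky option, but the negative bias makes the
# 20% chance of losing 20 loom larger than the extra 42 of expected gain.
```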

 

 

Remember that these biases interact in different ways, and several may work together at the same time in the same situation to affect the Knower in question. Look at the ‘Risk Case Study’ on this page to see how these biases operate in a real example.