
How do we really make decisions?

By Toby Macdonald

 

With every decision you take, every judgement you make, there is a battle in your mind - a battle between intuition and logic.

And the intuitive part of your mind is a lot more powerful than you may think.

Most of us like to think that we are capable of making rational decisions. We may at times rely on our gut instinct, but if necessary we can call on our powers of reason to arrive at a logical decision.

We like to think that our beliefs, judgements and opinions are based on solid reasoning. But we may have to think again.

Prof Daniel Kahneman, from Princeton University, started a revolution in our understanding of the human mind. It's a revolution that led to him winning a Nobel Prize.

His insight into the way our minds work springs from the mistakes that we make. Not random mistakes, but systematic errors that we all make, all the time, without realising.

Prof Kahneman and his late colleague Amos Tversky, who worked at the Hebrew University of Jerusalem and Stanford University, realised that we actually have two systems of thinking. There's the deliberate, logical part of your mind that is capable of analysing a problem and coming up with a rational answer.

This is the part of your mind that you are aware of. It's expert at solving problems, but it is slow, requires a great deal of energy, and is extremely lazy. Even the act of walking is enough to occupy most of your attentive mind.

If you are asked to solve a tricky problem while walking, you will most likely stop because your attentive mind cannot attend to both tasks at the same time. If you want to test your own ability to pay attention, try the invisible gorilla test devised by Chris Chabris, from Union College, New York, and Daniel Simons from the University of Illinois.

But then there is another system in your mind that is intuitive, fast and automatic. This fast way of thinking is incredibly powerful, but totally hidden. It is so powerful, it is actually responsible for most of the things that you say, do, think and believe.

And yet you have no idea this is happening. This system is your hidden auto-pilot, and it has a mind of its own. It is sometimes known as the stranger within.

Most of the time, our fast, intuitive mind is in control, efficiently taking charge of all the thousands of decisions we make each day. The problem comes when we allow our fast, intuitive system to make decisions that we really should pass over to our slow, logical system. This is where the mistakes creep in.

Our thinking is riddled with systematic mistakes known to psychologists as cognitive biases. And they affect everything we do. They make us spend impulsively and leave us overly influenced by what other people think. They affect our beliefs, our opinions and our decisions, and we have no idea it is happening.

It may seem hard to believe, but that's because your logical, slow mind is a master at inventing a cover story. Most of the beliefs or opinions you have come from an automatic response. But then your logical mind invents a reason why you think or believe something.

According to Daniel Kahneman, "if we think that we have reasons for what we believe, that is often a mistake. Our beliefs and our wishes and our hopes are not always anchored in reasons".

Since Kahneman and Tversky first investigated this radical picture of the mind, the list of identified cognitive biases has mushroomed. The "present bias" causes us to pay attention to what is happening now, but not to worry about the future. If I offer you half a box of chocolates in a year's time, or a whole box in a year and a day, you'll probably choose to wait the extra day.

But if I offer you half a box of chocolates right now, or a whole box tomorrow, you will most likely take the half box now. It is the same one-day trade-off, but waiting an extra day in a year's time seems insignificant, while waiting a day now feels impossible when faced with the immediate promise of chocolate.

According to Prof Dan Ariely, from Duke University in North Carolina, this is one of the most important biases: "That's the bias that causes things like overeating and smoking and texting and driving and having unprotected sex," he explains.
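One way to picture this tug-of-war between "now" and "later" is with a simple discounting rule. The short sketch below uses the quasi-hyperbolic (beta-delta) model with made-up parameter values - an illustrative assumption, not a formula from the article - to show how the same one-day delay can flip the choice depending on whether it starts today or in a year's time.

```python
# Illustrative sketch of "present bias" using the quasi-hyperbolic
# (beta-delta) discounting model. The model choice and the parameter
# values are assumptions for illustration, not figures from the article.

BETA = 0.45    # extra penalty on anything that is not "right now" (assumed)
DELTA = 0.999  # gentle day-by-day discount factor (assumed)

def subjective_value(boxes, delay_days):
    """How attractive a reward feels, given how long we must wait for it."""
    if delay_days == 0:
        return boxes                        # immediate rewards are not discounted
    return BETA * (DELTA ** delay_days) * boxes

# Choice made today: half a box now vs a whole box tomorrow
print(subjective_value(0.5, 0), subjective_value(1.0, 1))      # ~0.50 vs ~0.45 -> take the half box now

# Same one-day gap, but a year away: half a box in 365 days vs a whole box in 366
print(subjective_value(0.5, 365), subjective_value(1.0, 366))  # ~0.16 vs ~0.31 -> wait the extra day
```

With these made-up numbers, the immediate half box wins today, while a year out the extra day of waiting barely registers and the whole box wins - the pattern the chocolate example describes.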

Confirmation bias is the tendency to look for information that confirms what we already believe. It's why we tend to buy a newspaper that agrees with our views. There's the hindsight bias, the halo effect, the spotlight effect, loss aversion and the negativity bias.

This is the bias that means that negative events are far more easily remembered than positive ones. It means that for every argument you have in a relationship, you need to have five positive memories just to maintain an even keel.

The area of our lives where these cognitive biases cause most grief is anything to do with money. It was for his work in this area that Prof Kahneman was awarded the Nobel Prize - not for psychology (no such prize exists) but for economics. His insights led to a whole new branch of economics - behavioural economics.

Kahneman realised that we respond very differently to losses than to gains. We feel the pain of a loss much more keenly than we feel the pleasure of an equivalent gain - and he even worked out by how much. If you lose £10 today, a £10 windfall tomorrow will not make up for it; you would need to find more than £20 to cancel out the pain of that loss. This is loss aversion, and its cumulative effect can be catastrophic.
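A rough way to picture that "£10 loss needs a £20 gain" arithmetic is a value function in which losses weigh about twice as much as gains. The sketch below is a simplified illustration in that spirit - the factor of two is the commonly quoted estimate, and the straight-line shape is an assumption - rather than the full prospect-theory formula.

```python
# Simplified sketch of loss aversion: losses are felt about twice as
# strongly as equivalent gains. The factor of 2 is the commonly quoted
# estimate; the linear value function is an assumption for illustration.

LOSS_WEIGHT = 2.0  # how much more a loss "hurts" than a gain of the same size

def felt_value(pounds):
    """Subjective impact of a gain (positive) or loss (negative) in pounds."""
    if pounds >= 0:
        return pounds
    return LOSS_WEIGHT * pounds  # losses are magnified

print(felt_value(-10))                    # -20.0: losing £10 feels like a 20-point hit
print(felt_value(20))                     #  20  : it takes roughly a £20 gain...
print(felt_value(-10) + felt_value(20))   #   0.0: ...to get back to an even keel
```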

One difficulty with the traditional economic view is that it tends to assume that we all make rational decisions. The reality seems to be very different. Behavioural economists are trying to build economics around how we actually make decisions.

Dan Ariely argues that the implications of ignoring this research are catastrophic: "I'm quite certain if the regulators listened to behavioural economists early on we would have designed a very different financial system, and we wouldn't have had the incredible increase in the housing market and we wouldn't have this financial catastrophe," he says.

These biases affect us all, whether we are choosing a cup of coffee, buying a car, running an investment bank or gathering military intelligence.

So what are we to do? Dr Laurie Santos, a psychologist at Yale University, has been investigating how deep-seated these biases really are. Until we know the evolutionary origins of these two systems of thinking, we won't know if we can change them.

Dr Santos taught a troop of monkeys to use money - an experiment dubbed "monkeynomics" - to find out whether monkeys would make the same mistakes as humans. She taught them to use tokens to buy treats, and found that monkeys show loss aversion too.

Her conclusion is that these biases are so deep rooted in our evolutionary past, they may be impossible to change.

"What we learn from the monkeys is that if this bias is really that old, if we really have had this strategy for the last 35 million years, simply deciding to overcome it is just not going to work. We need other ways to make ourselves avoid some of these pitfalls," she explained.

We may not be able to change ourselves, but by being aware of our cognitive limitations, we may be able to design the environment around us in a way that allows for our likely mistakes.

Dan Ariely sums it up: "We are limited, we are not perfect, we are irrational in all kinds of ways. But we can build a world that is compatible with this that gets us to make better decisions rather than worse decisions. That's my hope."

 

Read article from source: How do we really make decisions?

 


 
