One of the great riddles of behavioural finance is why, despite vast amounts of research on psychological biases, so few people are able to overcome them. After all, four individuals have won the Nobel Prize for economics in this or related areas – Daniel Kahneman, Robert Shiller, Vernon Smith and Richard Thaler – and the volume of research on biases is huge. In contrast, only a tiny amount of psychological literature is devoted to debiasing.
Many of the biases to which humans are prone have become better known thanks to the boom in behavioural finance. They include anchoring (letting initial perceptions colour one’s judgement), confirmation bias (selectively sifting for evidence to support a particular view while ignoring contradictory evidence), overconfidence (people being too certain about the outcomes of the decisions they are making) and many more (see panel).
Each of these biases has consequences in everyday situations, but they are particularly relevant to finance. It is easy to make investment decisions on the basis of such faulty reasoning, and few can reasonably claim to be free of all such biases. Under such circumstances it might be expected that a great deal of effort would go into debiasing, but that is not the case.
Madan Pillutla, professor of organisational behaviour at the London Business School, explains the lack of discussion of debiasing in terms of its limited success so far. If researchers had proved more successful in finding ways to overcome biases they would no doubt have won plaudits for their achievements.
In broad terms, he points to two sets of reasons why biases are so hard to overcome. The first is that many people have not been trained properly. “If I don’t know how to do something, it’s going to be trial and error and I’m going to make a lot of mistakes.” Errors of that kind are relatively easy to overcome with training.
Having said that, Pillutla argues that there is a more fundamental reason why training has its limits. “The brain is structured to make use of certain ‘heuristics’ or rules of thumb,” he says. The brain has evolved in a particular way but modern life is different. Short cuts often provide the best solution to problems but sometimes they go awry.
Bias – deviation from an objective standard
● Association-based errors – caused by automatic processes that underlie the accessibility of information in memory (for example, hindsight, over-confidence, representativeness)
● Psychophysically-based errors – caused by non-linear mapping of stimuli into psychological responses (status quo, anchoring)
● Strategy-based errors – caused by use of inferior strategies or decision rules (insensitivity to sample size, assessment of covariation)
In a life and death conflict, for example, there is seldom time for a considered response. Quick decision-making is essential. But there are many problems in the modern world which could benefit from a more reasoned reaction. These range from medical decision-making for both doctors and patients – which is often prone to faulty reasoning – to portfolio management. Such areas are not easily susceptible to change.
Pillutla is at pains to emphasise he is not denigrating human reasoning. “Human minds are brilliant,” he says. “But the rational model in some sense is slightly impoverished. It doesn’t take into account the fact that there are many instances where a quick decision is probably better than any kind of deep thought and analysis.”
Some might counter that what is sometimes called ‘nudge’ or ‘choice architecture’ provides a reasonable response. Perhaps the best-known example in the financial sphere is auto-enrolment. Investors are enrolled in the default option unless they opt for an alternative.
However, Pillutla argues that nudging is not a form of debiasing. On the contrary, it is a way of using a human bias – in this case towards favouring the status quo – to achieve what the choice architect assumes is the best outcome. “A nudge uses biases to guide people towards what the nudger thinks is the right decision,” he says.
Nevertheless, several approaches to genuine debiasing have been tried, with varying degrees of success. Pursued properly, each should facilitate better judgement.
● Incentives and accountability. Giving people a stake in making the right decisions is often assumed to improve decision-making. However, Pillutla says psychological experiments show that those who are highly incentivised do not, in most cases, make better decisions than those who are not. The same is true for those who are held accountable for their actions. Simple errors are an exception. For example, people can be taught to avoid mistakes such as insensitivity to the importance of sample size.
● Training and education. This is not just a matter of raising awareness but of teaching people to understand how biases work. For example, it is possible to give people feedback on the errors they make. But once again it tends to work for simple errors rather than more complex biases. Education tends to work in specific domains – for instance, if people work in particular professions – but it tends not to have an effect outside of those areas.
● ‘Frugal heuristics’ or ‘ecological rationality’. Pillutla argues that this form of debiasing should get much more attention. Unlike the other forms, it can help overcome complex errors. The core principle is to present problems as concretely as possible. “The more concrete you are in the way you present a problem the more people are able to answer correctly,” he says. For example, people are notoriously poor at assessing problems expressed as probabilities (the probability of contracting typhoid is 0.3%). One way round this is to present the same information as frequencies instead (out of 10,000 people, 30 have typhoid) – the data are identical, but the second format is far easier to reason about. Training people to convert probabilities into frequencies leads to much better results, and it also helps them solve problems outside their particular domain.
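The probability-to-frequency conversion Pillutla describes is simple arithmetic, and can be sketched in a few lines of Python. The function name and the default reference population of 10,000 are illustrative choices, not something specified in the article:

```python
def to_natural_frequency(probability: float, population: int = 10_000) -> str:
    """Re-express a probability as a natural-frequency statement.

    For example, 0.003 (0.3%) becomes '30 out of 10,000 people'.
    """
    count = round(probability * population)
    return f"{count} out of {population:,} people"
```

So `to_natural_frequency(0.003)` returns `"30 out of 10,000 people"` – exactly the typhoid reframing in the example above, carrying the same information as the 0.3% figure.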
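Insensitivity to sample size, mentioned twice above as a teachable error, can also be made concrete by simulation. The classic illustration (from Kahneman and Tversky's hospital problem, not from the article) is that days on which more than 60% of births are boys occur far more often at a small hospital than at a large one. A minimal sketch, with hypothetical sample sizes:

```python
import random

def extreme_day_rate(sample_size: int, trials: int = 10_000,
                     threshold: float = 0.6, seed: int = 0) -> float:
    """Estimate how often a 50/50 process produces a sample in which
    more than `threshold` of outcomes fall on one side.

    Small samples produce such 'extreme' results far more often.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    extreme = 0
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size > threshold:
            extreme += 1
    return extreme / trials
```

Running `extreme_day_rate(15)` gives a rate an order of magnitude higher than `extreme_day_rate(450)`, even though both samples are drawn from the same 50/50 process – the intuition that training on this bias aims to instil.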
Given the newness of this area, there is much work to be done on how best to apply it to investment and pensions. But if it can facilitate better decision-making in the industry and beyond, it is hard to see the public not welcoming it.