A new book by Nobel economics prize winner Daniel Kahneman asks why everyone makes such bad decisions, and what we can do about it.
Why do we all make such bad decisions? Not just about investing, about everything.
Imagine two doctors who give two different diagnoses to identical patients, or two judges who give completely different sentences to people with the same background who have committed the same crime. Or two economic forecasters, confronted with the same economic data, who make wildly different projections for U.S. GDP.
Or, even worse, a doctor or a judge who might give a different diagnosis or judgment depending on what time of day it is, or even what they ate.
Why does that happen? Is there a way to make better decisions? Can we make decision-making more logical? Can we remove some of the emotion that clouds our ability to make the right decisions?
Most importantly, can we find some way to make consistently better judgments?
That's the subject of "Noise," the new book by Kahneman and his colleagues Olivier Sibony and Cass R. Sunstein. Kahneman is one of the founding fathers of behavioral economics and author of the seminal work "Thinking, Fast and Slow."
Kahneman and his colleagues define "noise" as "variability in judgments that should be identical," and he leaves no doubt about how he feels about it: "There is too much of it."
That "variability" comes because judgments are subjective and don't follow exact rules.
This "noise" is ever-present in our lives. Medicine, law, economic forecasting, food safety, auto repair, it doesn't matter. It's there, wherever people have to make judgments or decisions.
There is a difference between bias and noise. If you step on a bathroom scale, and every day the scale overstates your true weight by 2 pounds, that is bias.
If you step on a bathroom scale, and one day it overstates your weight by 2 pounds, and the next day understates by 1 pound, and the next overstates by 3 pounds, that is noise. (In case you were wondering, Kahneman states that "most inexpensive bathroom scales are somewhat biased and quite noisy.")
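The scale analogy maps directly onto two summary statistics: bias is the average error across measurements, noise is the spread of those errors. A minimal sketch (the readings are invented for illustration):

```python
import statistics

# Hypothetical daily readings from a bathroom scale; the person's
# true weight is assumed known so we can separate the two effects.
true_weight = 150.0
readings = [152.0, 149.0, 153.0, 151.0, 152.5, 150.5]

errors = [r - true_weight for r in readings]

bias = statistics.mean(errors)    # systematic offset: the average error
noise = statistics.stdev(errors)  # variability: the spread of the errors

print(f"bias:  {bias:+.2f} lb")   # a consistently high or low scale has large bias
print(f"noise: {noise:.2f} lb")   # an inconsistent scale has large noise
```

A scale can be biased but quiet (always 2 pounds high), noisy but unbiased (errors that average out to zero), or, as Kahneman says of cheap scales, both at once.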
The study of bias is well-developed. Kahneman himself has made many important contributions to the field.
Perhaps the most common bias professionals exhibit is overconfidence, which Kahneman called "the most significant of the cognitive biases" in "Thinking, Fast and Slow." Overconfidence has been blamed for everything from the sinking of the Titanic to the subprime mortgage crisis of 2008.
Many other biases have been identified, including confirmation bias (using information that fits in with existing beliefs while ignoring information that doesn't fit in with those beliefs) and loss aversion (a potential loss is perceived as more severe than an equivalent potential gain, a condition well known to stock investors).
Bias, in other words, is easy to see and describe. Noise is harder to see but no less damaging.
The good news is we can do something about it.
Most people have a very high opinion of their opinions. This makes life interesting, but it's a real problem when dealing with judgments that affect other people's lives, health and money, because judges, doctors, car mechanics and financial advisors usually don't understand how biased and noisy their judgments are.
For this reason, Kahneman pushes hard for developing a more rules-based approach to decision-making.
He is aware that rules and algorithms can have biases of their own but believes that if properly constructed they are superior to human judgment. "Most people are surprised to hear that the accuracy of their predictive judgments is not only low but inferior to that of formulas," he says. "Even simple linear models built on limited data, or simple rules that can be sketched on the back of an envelope, consistently outperform human judges."
To make organizations aware of how much potential noise exists, Kahneman suggests a noise audit.
What's that? It's a way to measure how much noise there is in a system, which can be anything from a radiology department to an insurance agency to a financial services firm.
For example, if members of a radiology team provide widely different analyses of the same X-ray of the same patient, that is a noise problem.
Kahneman describes a protocol for a noise audit that involves studying how a group of experts in a firm (he suggests a minimum of 12 participants) react to two or three realistic case studies on an individual basis. Each member would have to summarize the case and the judgment either numerically (in dollars, percentiles, or probabilities) or on some scale with at least five degrees (such as "very strong," "strong," "average," "poor," or "very poor"). They would not be allowed to communicate with each other.
The executives who assemble the case study are surveyed beforehand to assess how much they expect their experts to agree on any given case, and what level of disagreement would be acceptable.
If the results diverge significantly from expectations, then there is a noise problem.
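The core comparison in the audit is simple: how widely do the independent judgments actually spread, versus how much disagreement the executives said they would tolerate? A hypothetical sketch for one case, with all numbers invented for illustration:

```python
import statistics

# Executives' pre-survey answer: judgments should agree to within ~10%.
acceptable_spread_pct = 10.0

# Twelve experts independently value the same case, in dollars.
case_judgments = [48_000, 61_000, 40_000, 55_000, 72_000, 44_000,
                  58_000, 50_000, 66_000, 47_000, 53_000, 62_000]

mean = statistics.mean(case_judgments)
# Spread relative to the mean (coefficient of variation, as a percentage).
spread_pct = 100 * statistics.stdev(case_judgments) / mean

if spread_pct > acceptable_spread_pct:
    print(f"Noise problem: judgments vary by {spread_pct:.0f}%, "
          f"expected at most {acceptable_spread_pct:.0f}%")
```

In this invented example the judgments vary by roughly 17%, well beyond the 10% the executives expected, which is exactly the kind of gap an audit is designed to surface.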
Kahneman calls his main suggestion for reducing noise "decision hygiene," a commitment to follow certain clear procedures to reduce noise and bias.
Some principles of decision hygiene include:
- The goal of judgment is accuracy, not individual expression. Kahneman calls this "the first principle of decision hygiene." Individual differences "lead different people to form different views of the same problem. This observation leads to a conclusion that will be as unpopular as it is inescapable: judgment is not the place to express your individuality." Can we replace human judgment with rules or algorithms? Kahneman says while it may be undesirable to completely eliminate human judgment, using algorithms can improve judgments by making them less dependent on what he calls "the idiosyncrasies of one professional."
- Resist premature intuition. Professionals make very quick decisions based on past experience, which is a key source of bias and noise. "Intuition need not be banned, but it should be informed, disciplined, and delayed," Kahneman says.
- Obtain independent judgments from multiple judges, then consider aggregating those judgments. Independence guards against another well-studied effect: a group of, say, financial analysts who have an opinion on the direction of the stock market will often change their opinion after being exposed to a group expressing different opinions. Taking a large independent sample, whether you are evaluating an X-ray, an engine problem, or the future price of stocks, will improve the precision of estimates.
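The statistical logic behind aggregation can be shown with a small simulation: if each judge's estimate is the true value plus independent noise, averaging many judgments cancels much of that noise. All numbers here are illustrative.

```python
import random
import statistics

random.seed(0)
true_value = 100.0
noise_sd = 20.0

def judge():
    # One unbiased but noisy judgment: truth plus random error.
    return random.gauss(true_value, noise_sd)

trials = 2000
# Error of a single judge vs. error of the average of a 10-judge panel.
solo_errors = [abs(judge() - true_value) for _ in range(trials)]
panel_errors = [abs(statistics.mean(judge() for _ in range(10)) - true_value)
                for _ in range(trials)]

print(f"avg error, single judge:   {statistics.mean(solo_errors):.1f}")
print(f"avg error, 10-judge panel: {statistics.mean(panel_errors):.1f}")
```

The panel's average error is far smaller than any single judge's, which only works if the judgments are made independently; judges who talk first converge on each other, not on the truth.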
Why is the future so difficult to predict? It's been well-documented that the success rate of "experts" at predicting the future is terrible, from stock picking to elections to social trends.
First, Kahneman notes that people trying to predict the future exhibit the same biases and noise as everyone else, and this limits the quality of their predictions.
Second, Kahneman says that much about the future is inherently unknowable, and the farther out we go, the more difficult it gets.
It's unknowable for two reasons: we don't have complete information, and unpredictable events occur that can affect outcomes.
We don't have complete information about companies, the economy, or individuals. And events that are entirely unpredictable can and do happen, to companies, to CEOs, to individuals, affecting their output and their decisions.
This should make prognosticators of political elections, of the stock market, of the future in general very humble. "None of these events and circumstances can be predicted today — not by you, not by anyone else, and not by the best predictive model in the world," Kahneman writes.
Knowing this, a rational person might wonder why it's worth bothering at all.
The answer is that the study of bias and noise is not just an academic exercise. Kahneman makes it clear that noise goes to the heart of current-day debates about justice and fairness. "It is unfair for similarly situated people to be treated differently, and a system in which professional judgments are seen as inconsistent loses credibility," he says.
Kahneman is here referring to judicial decision-making, but the lessons apply to anyone who is trying to give people advice about the future, whether it is an X-ray diagnosis, giving odds on a political race, or a financial advisor picking stocks for clients.
Despite the imperfect nature of humans and the unknowability of the future, we are not helpless. We can improve our decision-making abilities.
The battle with noise and bias, and the battle with the future in general, thus boils down to a battle for credibility.