When gut feelings are better than careful analysis
Years ago, a leading American teaching hospital admitted a 21-month-old boy we'll call Kevin.
A young doctor took charge of his case. He hated having to draw blood from Kevin's emaciated body and noticed the boy refused to eat after being poked with needles. Intuitively, he kept invasive testing to the minimum and instead tried to provide the boy with a caring environment. Kevin began to eat and his condition improved.
But the young doctor's superiors didn't approve of his unconventional efforts. So a host of specialists, each interested in applying a particular diagnostic technology, set out to find the cause of the boy's illness. If he dies without a diagnosis, we've failed, they reasoned.
Over the next nine weeks, Kevin was subjected to batteries of tests, which revealed nothing decisive. He stopped eating again, so the specialists sought to counter the combined effects of infection, starvation and testing with intravenous nutrition lines and blood transfusions.
But Kevin died before his next scheduled test. The doctors continued testing at the autopsy, hoping to find the hidden cause. One doctor commented: "Why, at one time he had three IV drips going at once! He was spared no test to find out what was really going on. He died in spite of everything we did!"
That story is told by a distinguished German psychologist, Gerd Gigerenzer, of the Max Planck Institute, in what many academics would call his hugely "counter-intuitive" book, Gut Feelings.
But here's the trick: what university-trained people are encouraged to regard as "intuitive" isn't intuitive at all. It's what all their learning has led them to believe is the right way to think or act. In this academic sense of the word, it was the specialists who were acting intuitively: their training told them they couldn't begin to help the boy until they'd first correctly diagnosed his problem.
Thanks to this way of thinking, they tested him until their actions helped to kill him. But the way Gigerenzer uses the word, it was the young doctor who acted on his intuition, casting his professional training aside and trusting his gut feelings.
Gigerenzer's point? In this particular case, the young doctor was right to trust his instinct and his better-trained and more experienced superiors were led astray by all their learning.
What's more, he claims, cases where relying on your gut feelings rather than on careful analysis leads to better decisions are surprisingly common.
But such a conclusion - itself based on Gigerenzer's scientific (if controversial) research - is, in the academic sense of the term, hugely counter-intuitive. It's the opposite of what educated people have been taught to believe.
It's a mistake to imagine only economic rationalists are on about rationality. Ever since the Enlightenment of the 17th and 18th centuries, virtually all university teaching has stressed the need for reasoned, logical analysis. You make decisions by gathering all the relevant information you can, then weighing it up carefully and logically.
Economic rationalists assume that's the way we really do make decisions. But American psychologist Daniel Kahneman - whose life's work is beautifully summarised in his book Thinking, Fast and Slow - won the Nobel prize in economics for demonstrating that the vast majority of the decisions we make are made unconsciously, instantaneously and instinctively.
Kahneman showed that these unconscious, snap decisions are based on deeply ingrained mental short-cuts, or rules of thumb, which psychologists call "heuristics".
He further argued that a lot of these heuristics are illogical and so cause us to make many bad decisions. This is the basis for the title of the well-known book by behavioural economist Dan Ariely, Predictably Irrational.
But this is where Gigerenzer begs to differ. He argues that in many, but not all, circumstances the heuristics we use lead to good decisions - better decisions than we would make if we took the time to gather more information and think the decision through.
And this is true even though many heuristics seem to the educated mind to be illogical.
Why? Because we often must make decisions almost instantly, because deliberation can get in the way of our unconscious motor skills, because gathering information has costs (not all of which are monetary), because the future is uncertain no matter how much we know about the past, and because of our "cognitive limitations" - too much information confuses us and makes us indecisive. What's more, some information can mislead us, containing "more noise than signal".
Gigerenzer's research contradicts two core beliefs of economists and other rationalists: more information is always better and more choice is always better. Rather than building complex decision-making systems that take account of as many factors as possible, we should search for "fast and frugal" decision rules that are shown to work most of the time. Spending less time on some decisions can actually improve them.
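Gigerenzer's best-known example of such a rule is "take-the-best": when comparing two options, check one cue at a time, from most to least reliable, and decide on the first cue that distinguishes them, ignoring everything else. A minimal sketch of the idea in Python (the cue names and city data below are hypothetical, purely for illustration):

```python
# A sketch of a Gigerenzer-style "fast and frugal" rule, take-the-best:
# compare two options cue by cue, in order of reliability, and decide on
# the FIRST cue that discriminates between them - ignoring all the rest.
# Cue names and example data are hypothetical, for illustration only.

def take_the_best(option_a, option_b, cues):
    """Return "A" or "B" as picked by the first discriminating cue.

    option_a, option_b: dicts mapping cue name -> True/False
    cues: cue names ordered from most to least reliable
    """
    for cue in cues:
        a, b = option_a[cue], option_b[cue]
        if a != b:          # this cue discriminates: stop searching here
            return "A" if a else "B"
    return None             # no cue discriminates: you'd have to guess


# Hypothetical example: which of two cities is larger?
cues = ["is_capital", "has_major_airport", "has_university"]
city_a = {"is_capital": False, "has_major_airport": True, "has_university": True}
city_b = {"is_capital": False, "has_major_airport": False, "has_university": True}

print(take_the_best(city_a, city_b, cues))  # "A" - decided on one cue alone
```

The point of the sketch is frugality: the rule never weighs up all the evidence, yet in Gigerenzer's studies such one-cue decisions often match or beat elaborate weighted-sum models.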
Relying on intuition or gut feelings isn't acting on impulse or caprice. This is because our brain's use of its intelligence isn't necessarily conscious or deliberate.
"The intelligence of the unconscious is in knowing, without thinking, which rule is likely to work in which situation," Gigerenzer says. "What seem to be 'limitations' of the mind can actually be its strengths."
The logic-based approach to decision-making "assumes that minds function like calculating machines and ignores our evolved capacities, including cognitive abilities and social instincts. Yet these capacities come for free and enable fast and simple solutions for complex problems ...
"Logic and related deliberate systems have monopolised the Western philosophy of the mind for too long. Yet logic is only one of many useful tools the mind can acquire. The mind, in my view, can be seen as an adaptive toolbox with genetically, culturally and individually created and transmitted rules of thumb," he concludes.
Don't get Gigerenzer wrong. His line of argument is in no way anti-intellectual. Rather, he's used his intellect and the scientific method to challenge conventional thinking about how our intellect works.