By Gerd Gigerenzer
Summary
Heuristic decision making refers to mental shortcuts, or “rules of thumb,” that people use to make swift decisions, particularly under time pressure or when detailed information is lacking. This style of thinking is often criticized as lazy, prone to bias, and “predictably irrational.” This paper by Gerd Gigerenzer and Peter Todd shows that “fast and frugal” heuristic thinking can often yield better outcomes than time-intensive, deliberate thought. Sometimes, less information is better.
The authors provide a startling example of heuristics in medicine. When a person is rushed to the hospital with heart attack symptoms, a simple three-question checklist outperforms a statistical analysis of up to 19 risk factors in determining which patients to classify as high risk. The three yes/no decisions can be completed quickly and still classify patients accurately.
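To make the shape of such a checklist concrete, here is a minimal sketch of a fast and frugal decision tree in Python. The three questions and the cutoff values are illustrative assumptions, not the exact figures from the study the authors describe.

```python
def classify_heart_attack_risk(systolic_bp: float, age: float,
                               sinus_tachycardia: bool) -> str:
    """Ask at most three yes/no questions, in order, and stop as soon
    as one of them settles the classification."""
    # Question 1: is systolic blood pressure dangerously low?
    if systolic_bp <= 91:
        return "high risk"
    # Question 2: is the patient below the age cutoff?
    if age <= 62.5:
        return "low risk"
    # Question 3: is sinus tachycardia present?
    return "high risk" if sinus_tachycardia else "low risk"

print(classify_heart_attack_risk(systolic_bp=120, age=70, sinus_tachycardia=True))
```

Because each question looks at a single cue, the whole procedure can be carried out at the bedside without a computer.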
The paper discusses two revolutions underway in modern science. One is the demise of the dream of certainty and the rise of probability theory, a calculus of uncertainty. The second concerns how humans and animals make decisions under uncertainty, with incomplete information. The authors propose viewing the mind not as an organ that performs complex probabilistic calculations, but as one that reaches into a “toolbox” of fast and frugal heuristics.
Gigerenzer criticizes “unbounded rationality,” a model of thought unconstrained by time or computational cost. It likens the human mind to Laplace's demon, which can compute every possible bit of information, making future events as certain as the past. People often claim their thinking is similarly rigorous but behave otherwise, and even aspiring to this level of rigor is misguided, since it makes actual human cognition look irrational by comparison. As an alternative, Herbert Simon proposed “bounded rationality,” a model that embraces the limitations of the human mind and the uncertainty of its environment. The paper first presents one form of bounded rationality, called satisficing, in which a decision maker searches through options and accepts the first one that meets an aspiration level, rather than hunting for the single best option. Because even this search can demand substantial time and computation, Gigerenzer treats a second form, fast and frugal heuristics, as the purest expression of bounded rationality.
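As a rough illustration of satisficing, here is a small Python sketch; the options, their scores, and the aspiration level are invented for the example rather than taken from the paper.

```python
def satisfice(options, aspiration_level):
    """Examine options in the order they are encountered and accept the
    first one that meets the aspiration level, instead of scoring them all."""
    for name, value in options:
        if value >= aspiration_level:
            return name              # good enough: stop searching here
    return None                      # nothing met the aspiration level

# Hypothetical apartments scored on overall suitability (higher is better).
apartments = [("apartment A", 5), ("apartment B", 7), ("apartment C", 9)]
print(satisfice(apartments, aspiration_level=7))   # "apartment B"; C is never examined
```

The search stops as soon as something acceptable turns up, so the best option may never even be examined; that is the sense in which satisficing trades optimality for speed.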
Fast and frugal heuristics dramatically limit the amount of information considered in a decision. A model that treats every available detail as relevant tends to “overfit”: it matches the data it was built on very closely, but much of that detail is noise, so its performance suffers when generalizing to new data. By ignoring the less salient cues, which tend to carry more noise than signal, a fast and frugal analysis can be more robust.
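A toy illustration of this trade-off, using made-up data (a simple linear trend plus noise) rather than anything from the paper, comparing a frugal straight-line fit with a flexible polynomial fit:

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(scale=0.2, size=x_train.size)   # trend + noise
x_test = np.linspace(0, 1, 50)
y_test = 2 * x_test + rng.normal(scale=0.2, size=x_test.size)

for degree in (1, 7):   # a frugal straight line vs. a model that chases every wiggle
    coefs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train error {train_err:.3f}, test error {test_err:.3f}")

# The flexible fit hugs the training points (lower train error) but typically
# predicts the new data worse, which is the sense in which less can be more.
```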
Many researchers promote a dichotomy of two thought processes: the analytic, rule-based, mindful process versus the heuristic, associative, mindless one. Daniel Kahneman popularized this dichotomy as thinking “fast and slow.” The heuristics-and-biases research program he ran with Amos Tversky studied how heuristic judgments deviate from calculated probabilities and the rules of logic. It uncovered many thinking fallacies and portrayed humans as prone to taking ill-chosen shortcuts and reaching bad conclusions, with the expectation that slow, analytic processing would be superior in most cases. Gigerenzer's findings challenge that expectation by showing how often fast heuristic methods win under real-world conditions.
Gigerenzer's approach to this research problem differs from others'. Instead of judging heuristics by how well their decisions conform to logical principles, he measures how well they correspond to real-world outcomes. His approach begins by identifying basic building blocks of cognition, such as depth perception and face recognition, and then studies how these combine into higher-order decision processes, such as the rapid inference that a food that tastes familiar is probably safer than an unfamiliar one. Following social norms and imitating elders are other examples of heuristics. These higher-order processes are built from lower-order blocks that evolved long ago, nested and combined into new cognitive tools. The author describes this collection as an “adaptive toolbox.”
This article considers how people use facts and observations to make decisions. The authors, Gerd Gigerenzer and Peter Todd, teach that in many cases people make better decisions with less information. This type of thinking is called “heuristics.” People also refer to a heuristic as a “rule of thumb.” As an example, if you eat something and it tastes weird, you should spit it out! This heuristic might prevent you from eating something rotten.
The author of a famous book called Thinking, Fast and Slow says heuristic thinking is a shortcut that can often lead to bad decisions. He points out examples of situations where people's quick decisions are illogical. The alternative is to think slowly, consider all the available information, and carefully reason toward a decision. While this sounds like a great idea, Gigerenzer shows several examples where fast thinking results in better decisions.
One example is in hospitals. When a person is rushed in and has symptoms of a heart attack, doctors can take dozens of measurements and run this data through a computer that can analyze whether the person is at high risk or low risk. However, another option is a simple checklist that asks three yes or no questions. Amazingly, this simple method is not only faster but leads to better outcomes.
People use all kinds of heuristics every day, from recognizing faces to judging whether objects are near or far away. Some of these simple heuristics can be combined to help with more complex decisions. Each one is like a tool in a toolbox. Good thinking means using the right tool for the job, and oftentimes the simple tools are better than fancy, complicated ones.
--------- Original ---------
This book is about fast and frugal heuristics for making decisions - how they work, and when and why they succeed. These heuristics can be seen as models of the behavior of both living organisms and artificial systems. This article by Gerd Gigerenzer and Peter Todd discusses the use of heuristics in decision making. A heuristic is sometimes called a “rule of thumb.” It is a simple rule or procedure for making decisions with only small amounts of information. The authors contrast “fast and frugal” heuristics with slow and deliberate “unbounded rationality”. For example, if you bite into a piece of fruit and it tastes weird, a heuristic tells you to spit it out right away. Slow and careful reasoning might involve gathering evidence such as what type of fruit it is, when it was bought, how long it lasts in the fridge, etc. Even if you correctly conclude that it's poisonous, it might be too late!
A famous book called Thinking, Fast and Slow by Daniel Kahneman argues that heuristic thinking is biased and often leads people to take cognitive shortcuts that result in bad decisions. It points out how people using heuristics often commit logical fallacies in everyday situations. Gigerenzer argues that heuristics shouldn't be judged by how they conform to logical principles, but by how well they perform in the real world. In the case of the heuristic above, the cost of spitting out a weird-tasting but healthy piece of fruit is small compared to the cost of eating something rotten that could make you sick. In this case, it is better to be biased toward committing the first type of error: spitting out something healthy.
Gigerenzer's research shows that slow, analytical reasoning that considers all the available evidence often leads to worse decisions than fast and frugal heuristics that make quick decisions from limited information. In one example, when people are rushed to the hospital with symptoms of a heart attack, doctors can analyze their risk using either the unbounded approach or a simple heuristic. The slow analytical approach involves measuring up to 19 different risk factors and then running this data through a computerized statistical analysis. The fast heuristic method is a checklist with three yes or no questions. It turns out that the checklist is both faster and yields more accurate diagnoses.
Everyone possesses a whole toolbox of heuristics. These cognitive tools are evolved adaptations that are shared widely throughout the animal kingdom. Low-level thought processes, like facial recognition and depth perception, are often combined to make more complex heuristics. Humans are capable of learning new heuristics and combining them to make decisions in uncertain, real-world situations. It is tempting to believe that complicated, sophisticated tools are best, but this paper shows that the simplest tool often wins.