Let’s start with a couple of short riddles:
- What question can you never answer “yes” to?
- Which word does not belong in the following list: stop, cop, mop, chop, prop or crop?
[The answers appear at the end of this article.]
Riddles are designed to make us think beyond the obvious answer. Because no response is immediately accessible, they force us to consider how we are being misled and to devise a mental strategy for reaching the correct answer. The way we approach riddles is similar to the way we approach other mental challenges.
2 Modes of Thinking
In his book Thinking, Fast and Slow, the psychologist and economist Daniel Kahneman, PhD, defines two types of thought, which he calls System 1 and System 2.1 System 1 is our fast-thinking brain. We live the vast majority of our lives in System 1, which allows us to respond swiftly to threats and to most routine situations. Our distant forebears had to make important decisions that could determine whether they lived and got to pass on their genes. For these early humans, survival favored the assumption of danger and loss. Thinking slowly could kill you.
Long ago, if a person heard rustling in the bushes, he might conclude, based on past experience, that the rustling was just the wind. Although that assumption might be correct the majority of the time, if the rustling turned out to be a saber-toothed tiger even once, the cost of this false negative (a type 2 statistical error) would be catastrophic. If, instead, he assumed a tiger was in the underbrush and climbed the nearest tree every time he heard a rustling sound, only for it to turn out to be the wind, he committed a cheap false positive (a type 1 statistical error), avoided a potentially fatal loss and lived to pass on his genes.
Human beings were unlikely to survive many false negative errors and still become ancestors of present-day humans. Thus, in evolutionary terms, survival has depended on a loss counting for more than an equivalent gain. To demonstrate this central human behavior, Dr. Kahneman gives many examples in which otherwise rational human beings make decisions to avoid a perceived loss, even when the potential loss is smaller than the potential upside. He published his findings in a paper on decision theory and received the Nobel Prize in Economics in 2002.2
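To make this asymmetry concrete, here is a back-of-the-envelope sketch in Python using entirely hypothetical probabilities, costs and a loss-aversion weight (none of these numbers come from Dr. Kahneman's work); it simply illustrates why "assume danger" wins even when danger is rare, and why a gamble with a positive expected value can still feel like a loss.

```python
# Hypothetical numbers, for illustration only.

# 1) Why assuming danger pays, even when danger is rare.
p_tiger = 0.01          # chance the rustle really is a tiger
cost_climb = 1          # wasted effort of climbing a tree (false positive, type 1 error)
cost_eaten = 1_000_000  # cost of ignoring a real tiger (false negative, type 2 error)

expected_cost_ignore = p_tiger * cost_eaten  # always assume wind: 10,000
expected_cost_climb = cost_climb             # always climb: 1, no matter what rustled

print(f"Always ignore the rustle: expected cost {expected_cost_ignore:,.0f}")
print(f"Always climb the tree:    expected cost {expected_cost_climb:,.0f}")

# 2) Why a loss counts for more than a gain.
# A 50/50 gamble to win 150 or lose 100 has a positive expected value (+25),
# yet it is typically refused if a loss is felt about twice as strongly as a gain.
loss_weight = 2.0  # illustrative loss-aversion coefficient
expected_value = 0.5 * 150 + 0.5 * (-100)            # +25
felt_value = 0.5 * 150 + 0.5 * (-100) * loss_weight  # -25

print(f"Expected value of the gamble: {expected_value:+.0f}")
print(f"Felt value of the gamble:     {felt_value:+.0f}")
```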
There is more. Dr. Kahneman states that System 1 immediately engages our memory of recent events and context (our lived experience, or availability heuristic) to provide a solution to a problem. Unfortunately, System 1 remembers only the winning interpretation, not any alternative considerations, so our immediate memory is incomplete. In Dr. Kahneman's words, "conscious doubt is not in the repertoire of System 1," because the implications of doubt require both mental energy and time. Both are the domain of System 2, the system you used to try to solve the riddles.
Dr. Kahneman uses words like lazy and indolent to describe System 2. We use System 2 when solutions to problems are not immediately available. He calls System 1 gullible and biased to believe, whereas System 2 is in charge of doubting and unbelieving. He uses the acronym WYSIATI (what you see is all there is) to describe System 1’s superficial reasoning.
When I read the piece by Philip Seo, MD, in the September 2021 issue of The Rheumatologist, it became apparent to me that the conspiracy believers described were not engaging System 2.3
But Dr. Kahneman also makes it clear that System 2 is not foolproof. It is dependent on experience, education and open-mindedness, as well as what he calls general mental ability, or GMA. System 2 can be used to substantiate and buttress a System 1 conclusion. Dr. Kahneman indicates that System 2 can be susceptible to the biasing influence of so-called anchors, which make information easier to retrieve. Although anchors help streamline brain function, System 2 doesn’t consciously control what information becomes an anchor and is unaware when an anchor has influenced its judgment. And the less we know to inform a judgment, the more that judgment is susceptible to the anchoring effects of personal bias.
To illustrate the anchoring effect, Dr. Kahneman gives an example of experienced real estate agents asked to provide an ideal selling price for a house they inspect. Teams provided with an unrealistically high suggested value (the anchor) will subsequently agree on a price that is substantially higher than that of an equally savvy group provided with a much lower suggested valuation. Vaccine unbelievers may similarly think that unproven remedies can successfully treat COVID-19 if they repeatedly hear from an anchor source they trust that those remedies work. If System 1 hears that vaccines are dangerous and that a particular agent is curative, it does not examine whether the anchor could be faulty. And, as noted, System 2 is not foolproof.
The mental effort required to operate in System 2 can actually be witnessed. To demonstrate this point, Dr. Kahneman asks the readers of Thinking, Fast and Slow to do a sequential addition problem while simultaneously maintaining a specified rhythm on a drum. If you were to film your eyes while performing the exercise, you’d see your pupils dilate with the effort.
Consider a day in the clinic. Let’s say you see 20 patients from start to finish. How often do you believe you engage System 2? After years in practice, solutions to routine problems typically reside in System 1. But experienced physicians will still have the occasional challenging problem.
Now, consider a day in which four patients present with a complex medical issue not within the readily accessible context of our experience. Four out of 20 patients would constitute a very challenging day. But what if it were 10 of 20? That day would be impossibly draining and difficult to bear. We would be engaging System 2 throughout the day, while also satisfying the documentation demands of the electronic medical record (see Figure 1); it would be like adding the numbers while maintaining the drum rhythm. You would leave the office feeling fried and depleted. After a day like this, it would be hard to remember something as simple as the drive home.
How Bias & Noise Impact Judgment
Many other factors influence how we humans make decisions. In the recent book Noise: A Flaw in Human Judgment, Kahneman et al. outline many challenges to our rationality, including bias and noise.4 Noise refers to the natural propensity for different people’s judgments and decisions to vary even when presented with identical situations. Although we recognize these variations in other fields, we often don’t acknowledge that noise impacts both our personal and collective judgments. We are less consistent, personally and collectively, than we like to believe.
Picture yourself as a member of a diagnostic group reviewing a challenging medical case. There may be a variety of judgments in the room, but there can be only one correct answer, and the collective decision can miss the mark. The scatter of those judgments around the correct answer is statistical noise.
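The distinction between a group's bias and its noise can be made concrete with a toy calculation (the judgments below are invented for illustration): when the correct answer is known, the group's average error measures its bias, while the spread of individual judgments around the group average measures its noise.

```python
import statistics

# Invented judgments, for illustration only.
truth = 100                                                   # the one correct answer
judgments = [112, 95, 108, 120, 99, 115, 103, 110, 97, 118]   # ten reviewers' estimates

mean_judgment = statistics.mean(judgments)
bias = mean_judgment - truth          # systematic error shared by the group
noise = statistics.stdev(judgments)   # scatter of individual judgments around their mean

print(f"Group average:               {mean_judgment:.1f}")
print(f"Bias (average error):        {bias:+.1f}")
print(f"Noise (spread of judgments): {noise:.1f}")

# A group can be nearly unbiased on average yet very noisy, or tightly
# clustered yet biased; either way, the collective decision can miss the mark.
```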
Much has been written about the potential influence of bias in peer review. To try to eliminate bias, reviewers of submissions to medical journals are asked to declare any conflicts up front. But does this requirement eliminate judgment bias if most of us are unaware of, or possibly unable to acknowledge, our biases? The answer depends on whether the reviewers have processed their judgments through System 2 or have simply used familiar mental heuristics to reinforce the biases of System 1.
In Noise, we learn about the many different forms of bias that are routinely encountered. One is the bias introduced by the order in which opinions are voiced in a group. A strongly expressed opinion from an influential and respected individual can demonstrably affect the presumably independent judgments of others in the room (i.e., the anchoring effect; see Figure 2), especially if that person is the first to speak.