
Heuristics

The False Dilemma: System 1 vs. System 2

We often rely on both when it comes to decision-making.

Key points

  • Dual-process perspectives pit an unconscious System 1 against the more conscious System 2.
  • While intuitively appealing, the model has significant flaws.
  • There is actually no clear distinction between the two systems.
  • Most decisions actually rely on a combination of intuition and conscious, effortful decision making.
Source: Image by Gerd Altmann from Pixabay

A lot of scientific and popular decision-making literature is couched in a dual-process model[1]. The dual-process model pits the relatively automatic, heuristic-driven, and unconscious System 1 against the more effortful, controlled, and conscious System 2[2]. The implication of the System 1/System 2 dichotomy is that fast and automatic thinking is fraught with errors, and the way to correct those errors is to be more conscious and effortful about decision-making. It is quite intuitively appealing, but unfortunately, it is, itself, fraught with errors[3].

Melnikoff and Bargh (2018) detailed many of the problems with “the mythical number 2,” most notably that many of the properties attributed to System 1 and System 2 don’t actually line up with the evidence, that dual-process theories are largely unfalsifiable, and that most of the claimed support for them is “confirmation bias at work” (p. 283). System 1 isn’t some error-prone decision-making process, and System 2 isn’t devoid of errors. Biases, motivated reasoning, and fallacious reasoning affect all decision-making, whether it is unconscious or conscious, intuition-driven or highly analytical.

Couchman et al. (2016)[4] provided some evidence that highlighted the flaws in the dual-process perspective. They studied the performance of college students on multiple-choice exams and asked students to indicate whether they were confident in their responses[5]. The more confident students were in their initial response — which researchers operationalized as the intuitive response — the more likely those responses were to be correct. When students were confident in their intuitive response, they were correct over 85% of the time, but their success rate for guessing was only a little over 50%.

So it wasn’t that students’ intuitive responses tended to be flawed; they tended to be flawed only when students lacked confidence in them. When the intuitive response was one they were confident in, it was substantially more likely to be accurate. When the intuitive response wasn’t one they were confident in, their likelihood of being accurate was little better than a coin toss.

There is obviously a difference between confidence that is grounded in relevant expertise and experience and a false sense of confidence, as I mentioned when discussing the expertise bubble and as has been demonstrated in applied research in naturalistic decision making (NDM). However, the Couchman et al. (2016) results call into question the veracity of the claims made by dual-process advocates.

The False Dilemma

Much of the evidence supporting dual-process decision-making models comes from tasks designed to take a complex phenomenon and transform it into a false dilemma. These tasks allow researchers to make inferences about whether the decision was based in System 1 or System 2[6], but the sole source of evidence for these inferences is the (in)correctness of the response. Consider the following:

A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?

According to dual-process researchers, if your answer to the above was 10 cents, it was because you gave the intuitive response, and that was a result of System 1 thinking (because that answer is incorrect[7]). If your response was 5 cents (the correct answer), then you obviously derived that by engaging your conscious, effortful decision-making capabilities and overriding your initial System 1 response. Thankfully, System 2 swooped in to save the day.
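The algebra behind the correct answer is simple enough to verify in a few lines. This is just a minimal check of the arithmetic (the variable names are my own):

```python
# Bat-and-ball problem: the total is $1.10 and the bat costs $1.00 more
# than the ball. Let ball = x, so bat = x + 1.00 and 2x + 1.00 = 1.10.
ball = (1.10 - 1.00) / 2      # x = 0.05, i.e., 5 cents
bat = ball + 1.00             # 1.05, i.e., $1.05

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")
assert abs((ball + bat) - 1.10) < 1e-9   # total is $1.10
assert abs((bat - ball) - 1.00) < 1e-9   # bat costs $1.00 more
```

The intuitive 10-cent answer fails the second check: a 10-cent ball and a $1.00 bat differ by only 90 cents.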

But how do we know that if you said 10 cents, it was because you went with an intuitive response? Might it be possible that you thought consciously about it and decided that 10 cents was correct, and if so, wouldn’t that mean System 2 produced the incorrect answer? Isn’t it possible that you intuitively knew the correct answer was 5 cents, and, if so, wouldn’t that mean System 1 actually produced the correct answer and not the error?

The only response dual-process theories can offer to these questions is that we just know. In other words, if you got it correct, it must have been the result of System 2, and if you got it wrong, it must have been the result of System 1. Unfortunately, when we draw conclusions about the quality of decision-making based strictly on the outcome, that would be considered outcome bias[8].

Thus, the most popular model of decision making in contemporary theory and practice is (1) a heuristic that (2) transforms decision making into a false dilemma and (3) is largely based on inferences grounded in outcome bias (or fallacy) so that (4) the results confirm pre-existing beliefs. Unfortunately, even with all of these flaws, the dual-process perspective has become entrenched in most discussions of decision-making, especially when it comes to making better decisions.

No Clear Distinction Between Intuitive and Effortful Decision-Making

The dual-process perspective has some serious flaws, so perhaps there is a better way of conceptualizing how we go about making decisions. To that end, I mentioned previously that Gigerenzer and Gaissmaier (2011) defined a heuristic as “a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods.” Because people generally have limits regarding how much information they can process, also known as bounded rationality, decision-making necessarily involves ignoring part of the information. Gigerenzer and Gaissmaier (2011) thus concluded that “there is no strict dichotomy between heuristic and nonheuristic, as strategies can ignore more or less information.”

Instead, when people make decisions, whether conscious or unconscious, the accuracy of those decisions generally comes down to the specific strategy used to derive a decision (e.g., pattern recognition, recall, the weighting of different information) and the amount of information considered (i.e., what information is ignored and what information is considered). The complexity of the strategy and the amount of information considered will affect the amount of time and cognitive effort required to make the decision. Effective decision-making occurs when we rely on a strategy and consider an amount of information (i.e., expend time and effort) that allows us to reach an accurate enough decision for that situation[9].

Almost every decision situation produces one or more relatively quick and cognitively frugal heuristic responses, which are influenced by both stored information from past experience and novel information available in the present situation (which I discussed in a prior post). We rely on some of the information available in the present situation as the basis for initiating possible responses[10]. If one response (we’ll call it Intuitive Response 1 or IR1) is elicited more quickly or if we have more confidence in that response, we may simply go with that response. But if we lack sufficient confidence in IR1, we may expand our information search — that is, we may collect more information before actually making a decision. This expansion of our search could cause us to become more confident in IR1, or it may cause us to select a different response, possibly IR2, IR3, or some other option we didn’t think about originally.
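The confidence-driven process just described can be made concrete with a small sketch. To be clear, the threshold, response names, and confidence values below are hypothetical illustrations of the idea, not part of any formal model:

```python
# Hypothetical sketch of the process described above: go with an intuitive
# response if confidence in it clears some threshold; otherwise expand the
# information search, which may raise confidence in IR1 or surface a
# different response (IR2, IR3, ...) entirely.

def decide(responses, threshold=0.7, expand_search=None):
    """responses: list of (name, confidence) pairs in the order elicited."""
    best = max(responses, key=lambda r: r[1])
    if best[1] >= threshold:
        return best[0]                  # confident enough: go with it
    if expand_search is not None:
        # Collect more information, then decide among the expanded set.
        return decide(expand_search(responses), threshold)
    return best[0]                      # no more information: satisfice

# IR1 comes to mind first but with low confidence...
initial = [("IR1", 0.4)]

# ...so the search expands, surfacing IR2 with higher confidence.
def more_info(responses):
    return responses + [("IR2", 0.8)]

print(decide(initial, expand_search=more_info))   # IR2
```

The point of the sketch is only that one mechanism, confidence checked against a situational threshold, can produce either a fast "intuitive" choice or a slower, more deliberative one, without any need to posit two separate systems.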

To put this into context, let’s consider the following situation. One day, Tom is leaving his house and notices his lawn is looking a little bare and decides to buy some grass seed[11]. He heads to the nearest lawn and garden store. He isn’t all that familiar with different types of grass seed, but as he’s perusing the various types, he notices bags of fescue, which is a type of grass seed he’s heard of. This stimulates the recognition heuristic, which leads to the following intuitive response: Purchase a bag of fescue. He’s not all that confident that fescue is what he needs or wants, though, so he expands his information search. He notices the different varieties of seed are similarly priced, so that information isn’t helpful. He begins to read some of the labels, notices that fescue requires above-average watering, and decides he doesn’t want to water his grass that often. He then comes across some Kentucky Bluegrass, which requires only average watering. He doesn’t see anything else on the bag to suggest it would be an unsound purchase, so he employs the take-the-best heuristic[12] and opts to buy that bag of grass seed instead[13].

In Tom’s situation, only fescue came to mind, so there was only one intuitive response, but it could be that Tom had heard of fescue and zoysia (i.e., two intuitive responses), meaning that the recognition heuristic alone would have been useful only for discriminating between those types of grass seed that met the criterion and those that did not. Given that Tom ended up deciding based on how often it needed to be watered, zoysia requires less water than fescue, so he might have opted for the zoysia. However, zoysia grows slowly, so if that piece of information became salient to Tom, he might have ignored both intuitive responses in favor of a grass seed that required average watering but grew quickly (which might have been a simplified fast-and-frugal tree) — which would again have led him to Kentucky Bluegrass.
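Tom's take-the-best reasoning can be sketched in code: compare options cue by cue, in order of importance, and choose as soon as one cue discriminates. The cue values and orderings below are hypothetical stand-ins for Tom's situation, not real properties of these grasses:

```python
# Illustrative sketch of the take-the-best heuristic. Each option is scored
# on ordered cues; the first cue that discriminates decides the choice, and
# all remaining cues (and options) are ignored.

def take_the_best(options, cues):
    """options: {name: {cue: value}}; cues: list of (cue, better), where
    better(a, b) is True if value a beats value b on that cue."""
    names = list(options)
    for cue, better in cues:
        for name in names:
            others = [n for n in names if n != name]
            if all(better(options[name][cue], options[o][cue]) for o in others):
                return name, cue        # first discriminating cue decides
    return names[0], None               # no cue discriminates: pick any

# Hypothetical cue values: 1 = low watering need / slow growth, 3 = high / fast.
seeds = {
    "fescue":             {"watering": 3, "growth": 2},
    "kentucky bluegrass": {"watering": 2, "growth": 3},
}
# Tom weighs watering need first (less is better), then growth rate (faster is better).
cues = [("watering", lambda a, b: a < b), ("growth", lambda a, b: a > b)]

choice, deciding_cue = take_the_best(seeds, cues)
print(choice, deciding_cue)   # kentucky bluegrass watering
```

Note how much information the heuristic ignores: growth rate never gets consulted, because watering need already discriminated, which is exactly the "frugality" Gigerenzer and Gaissmaier describe.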

Let’s dissect Tom’s decision-making using the dual-process perspective. Since Tom started with an intuitive decision but ultimately made a different decision, it implies that System 1 thinking led Tom to generate his initial intuitive response(s), but that System 2 swooped in to save the day by causing Tom to make a different choice. But in Tom’s case, all he did was alter the search criteria a little bit (replacing types he’d heard of with watering frequency and then growth rate). Did he really engage in conscious, effortful decision-making? If so, how do we know? Even if he did consciously change criteria, he certainly didn’t engage in as much conscious decision-making as he could have, and he certainly didn’t peruse every bag of grass seed in order to determine the best choice. Instead, he stumbled onto what he decided were relevant enough criteria and opted for a type of seed that fit those criteria.

Scenarios like this show the flaws in the claims of the dual-process perspective. The dual-process perspective assumes a very rote, mechanized approach to decision-making that doesn’t actually apply to most decisions people make. Instead, decision situations generally elicit one or more intuitive responses, and if we are confident enough in one of those responses, we tend to opt for it. If we’re not, we tend to expand our information search until we do obtain some acceptable level of confidence. The entire process can occur nearly instantly, or it can take a lot of time; it can be a one-and-done decision, or it may unfold incrementally, and the decision-making may occur mostly unconsciously or mostly consciously.

The Implications

So, what does this all mean? Perhaps most importantly, it means that we should stop assuming that intuitive decision-making is necessarily error-prone. As Gary Klein’s work in NDM has repeatedly demonstrated, expertise and experience can yield quite valid intuitive responses. Although terms like “gut instincts” are often used to describe intuition, that description implies that intuitive responses aren’t the result of analysis. Yet, as evidence related to the Recognition-Primed Decision (RPD) model highlights, intuitive responses — and how those responses evolve as a function of additional information — are often the result of very sophisticated cognitive capabilities. The key to effective intuitive decision-making, though, is to learn to better calibrate one’s confidence in the intuitive response (i.e., to develop more refined meta-thinking skills) and to be willing to expand search strategies in lower-confidence situations or based on novel information.

Relatedly, it also means we should stop assuming that more conscious and effortful decision-making is necessarily better than more heuristically-driven intuitive decision-making. There is a time and a place to engage in very planful, deliberative decision-making processes, but the success of these processes still hinges on the effectiveness of the heuristic rules we employ. When we decide to be more deliberative, slow, or cognitively intensive, we still rely extensively on heuristics, whether we recognize it or not.

References

[1] This intimation of a dual-process perspective goes back at least to William James (1890), who postulated two different types of thinking: associative thinking and true reasoning. Wason and Evans (1974) later proposed the existence of heuristic processes and analytic processes. Kahneman later popularized the System 1/System 2 conceptualization, culminating in his book, Thinking, Fast and Slow (2011).

[2] The laundry list of descriptors that have been used to describe these two systems is extensive, as demonstrated in a table on Wikipedia.

[3] The irony of all this is that the dual-process approach is actually a heuristic, and heuristics are what the dual-process theorists argue are fraught with error.

[4] If you don’t have access to the full article, a summary of some of the results can be found as a part of a Conversation article from 2015, written by the lead author of the study.

[5] While this study employs a task like many traditional dual-process tasks (i.e., multiple-choice questions with a defined correct answer), the fact that participants could be expected to have some knowledge of the material in question differentiates it from more traditional laboratory-based decision-making tasks, such as the one mentioned later.

[6] They do not generally leave open the possibility it could be based on both.

[7] System 1 would also be the alleged culprit if you gave a different erroneous response.

[8] I would actually call it an outcome fallacy, as it is a logical reasoning error, not necessarily a propensity toward one.

[9] Also known as ecological rationality.

[10] How much information we rely on will vary. Some aspects of the situation are likely to catch our attention, while others may be less salient. The information that catches our attention is what influences those heuristic responses.

[11] At this point, I will tell you I know next to nothing about grass seed. I just needed a good example (don’t ask me how this example came to mind because I have no idea). For anyone who does know grass seed and would say my descriptions are wrong, I will refer you to Scotts, which is where my information came from.

[12] Note that what is considered best was a value judgment on Tom’s part.

[13] In the alternative, he might check out a few other bags and notice that none of them require less watering than Kentucky Bluegrass before deciding to stick with that choice.

More from Matt Grawitch Ph.D.