How you actually make decisions will surprise you: The nonconscious influence of cognitive biases

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.
… Daniel Kahneman1

System 1 and System 2 (a.k.a. the “elephant” and the “rider”)

How you actually make decisions will surprise you. The vast majority of intuitive decisions we make every day as individuals and as teams are driven by nonconscious mental and emotional processes. Unless we have the courage and humility to learn about, leverage, and mitigate these processes to make better decisions, we will be vulnerable to “our almost unlimited ability to ignore our ignorance.”

What is a decision?2

“A decision is a choice between two or more alternatives that involves an irrevocable allocation of resources.” One of the most important decisions we make daily is how we spend our time. Every commitment of time has an opportunity cost—the lost benefits of not having chosen a better use of our time.

From science, we know that team deliberative decision-making improves with cognitive diversity and constructive conflict. However, to improve further, we must also understand how we naturally make decisions. It turns out that the vast majority of our daily decisions are made using intuition (our “gut”). As we will learn, intuitive decision-making is fraught with traps and can lead to poor decisions with high-stakes consequences—even death!3

Understanding how cognitive biases affect decisions as individuals and as teams is essential to improving intuitive and deliberative decision-making. In 2002, psychologist Daniel Kahneman won the Nobel Prize in Economics, and in 2011 published Thinking, Fast and Slow,1 which summarizes his research on cognitive biases and heuristics that affect our judgment, choices, and behaviors. His contributions helped to grow the exciting field of behavioral economics.4

Cognitive biases affect our perceptions, judgments, emotions, decisions, and actions. These biases are sometimes called “effects,” “traps,” or “pitfalls.” Our brain uses two mental processes called System 1 and System 2. System 1 is the fast, automatic, emotional (“hot”), nonconscious process that drives intuitive (“gut”) decisions, and System 2 is the slow, reflective, rational, conscious process that enables deliberative decisions. Through an evolutionary lens, System 1 is primitive and based on the principle of “What You Sense Is All There Is” (WYSIATI). It has the enormous capacity to process data via our senses and automatically respond with a classification and/or an emotion (a physiological response) that is handed off to System 2 for action or further processing (i.e., reflection, deliberation, impulse control, decision-making).

System 1 uses mental models (schemas) that have accumulated over time (e.g., racial stereotypes) and is prone to errors, especially when the data is novel or ambiguous. System 1 is home to our innate processes (human drives, personality traits, emotions, fight-flee-freeze response5), and acquired mental and motor habits, implicit biases, and addictions.

In their bestselling book Switch: How to change things when change is hard, Chip and Dan Heath popularized System 1 and System 2 using the metaphor of an elephant and a rider (Figure 1).6

Figure 1: The rider (System 2) attempts to understand, control, train, and direct the elephant (System 1) (source: https://medium.com/@ptvan).

System 2 is the “rider” and System 1 is the “elephant”—it never forgets! If the rider had optimal childhood, adolescent, and young adult neurodevelopment, it would exhibit good executive function:

  1. attention control,
  2. emotional regulation,
  3. impulse override, and
  4. behavioral modification.

Executive function uses working memory, reflection, learning, problem-solving, planning, and strategic decision-making. Fortunately, when executive function is intact and mature, the rider can control and train the elephant, albeit with a lot of effort. An exhausted System 2 (rider) is susceptible to System 1 (elephant) impulses.

For example, we all know of current national leaders whose gender and racial biases, and whose inability to focus attention, regulate emotions, override impulses (poor decisions and actions), and modify behaviors in spite of adverse consequences to self, family, and nation, can be understood and explained as poor executive function.

We must understand System 1 and System 2 (Table 1) in order to design, deploy, and improve our intuitive and deliberative decision-making, trauma-informed systems, NewSmart and Cultural Humility, change management strategies, trust building, conflict management, and lean management.

Table 1: Comparison of System 1 versus System 2

System 1 (“elephant”) | System 2 (“rider”)
Nonconscious | Conscious
Fast, parallel processing | Slow, serial processing
Automatic | Controlled
Associative | Rules-based (reasoning)
Intuitive (“gut”) | Reflective (deliberative)
Energy efficient | Energy hog (exhausting)
Implicit knowledge | Explicit knowledge
Not linked to language | Linked to language
Uses stored memory (schemas) | Uses working memory
Emotional (“fight-flee-freeze”) | Rational

Cognitive biases in decision making

From NewSmart Humility we learned about natural human defensiveness from the drive to protect our ego7 and avoid our fears (vulnerability, uncertainty, risk, intellectual or emotional exposure, uninvited scrutiny). Fear (the “fight-flee-freeze” response) is generated by System 1. Now we cover cognitive biases and traps that involve the interaction of System 1 and System 2. Spetzler et al. have clustered the most important of these into six categories relevant to decision-making (Figure 2). To date, more than 200 cognitive biases have been identified.8

Figure 2: Classification of cognitive biases affecting perceptions, judgments, emotions, decisions, and behaviors.

1. Protection of mindset

Mindsets are “all the stuff in our heads: beliefs, mental models of reality, lessons learned, memories, preferences, prejudices, and unconscious assumptions. We use these to make sense of the world and to make judgments and decisions. Whenever we encounter something that conflicts with our mindset, the first impulse is to reject or attack it, as an antibody would attack an alien organism.”9

System 1 and System 2 team up to protect our mindsets using the following cognitive biases:

  1. avoiding dissonance,
  2. confirmation bias,
  3. overconfidence,
  4. hindsight bias,
  5. self-serving bias,
  6. status quo bias, and
  7. sunk cost bias.

Whenever we sense data that conflicts with a mindset, we experience a discomfort psychologists call cognitive dissonance. Our mind cannot sustain dissonance; therefore, we mitigate it by ignoring, discrediting, or explaining away the data. Accepting the data would require changing our mindset, which is difficult because we seek out data that confirms our mindset (confirmation bias) and avoid data that challenges it (avoiding dissonance).

Humans overestimate their capabilities (overconfidence). We are all “Monday morning quarterbacks” (hindsight bias). We give more weight to our positive qualities than our negative qualities (self-serving bias). We attribute “successes to our efforts while writing off failures to bad luck or situational factors.”9

With the status quo bias “we stubbornly cling to the current position, technology, or … strategy and for too long—and even escalate our commitment to it despite evidence that it’s not working, in the hopes that things will improve.”9 In a variant, the sunk cost bias, we decide to continue the current course because we have already invested large resources (money, staff, and time) and not because it is the best choice using objective criteria.

2. Personality and habits

“Another critical source of decision bias is our collection of habits and the personality characteristics that create them.”9 A habit is a mental and/or motor process that becomes automatic (System 1) and its origin can be from System 1 (nonconscious) or System 2 (through intentional practice). When we are aware of a habit, we can control or change it (System 2) but only with significant effort. Mental habits can influence our decision making.

Personality refers to individual differences in characteristic patterns of thinking, feeling, and behaving.10 Several frameworks exist to explain personality.11 To understand intuitive decisions we use the popular Myers-Briggs Type Indicator (MBTI) personality inventory based on Carl G. Jung’s theory of psychological types. The MBTI has four binary dimensions:

  1. Favorite world: Do you prefer to focus on the outer world or on your own inner world? This is called Extraversion (E) or Introversion (I).
  2. Information: Do you prefer to focus on the basic information you take in or do you prefer to interpret and add meaning? This is called Sensing (S) or Intuition (N).
  3. Decisions: When making decisions, do you prefer to first look at logic and consistency or first look at the people and special circumstances? This is called Thinking (T) or Feeling (F).
  4. Structure: In dealing with the outside world, do you prefer to get things decided (“convergent” thinking) or do you prefer to stay open to new information and options (“divergent” thinking)? This is called Judging (J) or Perceiving (P).

Figure 3 graphically depicts the four dimensions. For detailed descriptions of each see footnote URL.12

Figure 3: Myers-Briggs Type Indicator (MBTI) inventory.

Extroverts are energized by engaging the outside world (“thinking out loud”); Introverts are energized by engaging their thoughts. Sensing-types prefer information that is concrete and self-evident; Intuition-types prefer information that is nuanced, conceptual, and high-level. Thinking-types like to make decisions using logical reasoning; Feeling-types like to make decisions focused on people’s feelings. Perceiving-types delay decision-making to keep options open and to collect more information; Judging-types accelerate decision-making, favoring action over deliberation.

MBTI captures strong cognitive preferences that drive decisions, behaviors, and habit-formation. Personality type is like being right-handed: we can write with our left hand, but we strongly prefer to write with our right hand. From a self-administered survey, a person will be assigned four letters; for example, ENTJ. Your MBTI changes little over your adult life. Differences in personality-types can lead to poor communication, misunderstanding, and conflict.

Understanding personality-type is critical for (a) understanding that extroverts tend to speak out and get heard, while introverts need time to gather and write ideas; (b) designing communication strategies, taking into account people’s preferences for receiving information; (c) understanding our preferences for intuitive decision-making; e.g., decisions made by Thinking-types may come across as cold and heartless (think Mr. Spock!); and (d) designing decision processes that diverge (consider many creative options) and that converge (make a decision), and not get stuck in one personal preference style (i.e., perceiving vs. judging).

Personality type can lead to the following cognitive biases: (a) preference-based habits, (b) habitual frames, (c) content selectivity bias, and (d) decision styles. Similar to learning how to do things with your dominant hand, your personality will shape your thinking and doing habits (preference-based habits)—and it’s very hard to change! Sensing-types prefer narrow decision frames, and Intuition-types prefer expansive decision frames (habitual frames). Feeling-types are biased toward information about people’s emotions; Thinking-types are biased toward information that is objective and measurable (content selectivity bias). In decision-making, Extroverts want to openly deliberate, and Introverts prefer to think and write (decision styles). Well-designed decision processes engage diverse personality types, ensuring balance and closure (i.e., divergence followed by convergence).

3. Faulty reasoning

System 1 processes raw data using our senses. In contrast, reasoning is a System 2, logical, deliberative process that analyzes data, and manages, synthesizes, and translates knowledge to draw inferences (conclusions), and to inform or influence decision making. The major cognitive threats to sound reasoning are complexity and uncertainty.

Faulty reasoning due to complexity:

Complexity (complex systems) involves entities (people or processes) that are diverse, connected, interdependent, and adapting. Complex systems, especially those involving people, are dynamic, ambiguous, and unpredictable. In spite of our best intentions, we are susceptible to these cognitive biases: selective attention, inability to combine many cues reliably, the substitution heuristic, and order effects.

“The human mind is confused by multi-dimensional problems and loads of data. In response, we often oversimplify. We apply selective attention to the variables that seem most important while ignoring the rest. In situations where many value dimensions are important, we still end up focusing on just a few key attributes because of our inability to combine many cues reliably. We use a substitution heuristic to shift attention from a tough question (“How much effort should we spend on this decision?”) to an easier one (“How much time do we have before the next executive committee meeting?”), even though the answer to the easier question may have very little to do with the question that we really need to answer. When faced with many different pieces of information, another trap, based on “order effects”, leads us to remember those ideas that are either first or last. In general, when things get complicated, we oversimplify, whether we realize it or not.”9

Faulty reasoning about uncertainty:

“Uncertainty—always an element in big, difficult decisions—confounds the mind’s reasoning capacity. Even highly trained professionals make mistakes when they have to reason through uncertain situations”.9 In public health and medicine we use probability theory and Bayes’ theorem to mitigate confusion about uncertainty. Unfortunately, even for the simplest scenarios, no human brain’s System 2 is capable of intuitively calculating posterior probabilities given prior probabilities and performance characteristics (e.g., sensitivity and specificity of diagnostic tests). At a population level, intuition was “good enough” for evolutionary competition between species; however, today we must reach for System 3—methods, tools, and experts—to navigate uncertainty. First and foremost, this requires humility—NewSmart Humility. We cannot overcome our limitations if we do not acknowledge them and commit to improving.
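As a minimal sketch of why intuition fails here, consider applying Bayes’ theorem to a diagnostic test. The prevalence, sensitivity, and specificity below are illustrative assumptions, not figures from the text:

```python
# Posterior probability of disease given a positive test, via Bayes' theorem.
def posterior_positive(prior, sensitivity, specificity):
    """P(disease | positive test)."""
    true_pos = prior * sensitivity                # P(D) * P(+ | D)
    false_pos = (1 - prior) * (1 - specificity)   # P(not D) * P(+ | not D)
    return true_pos / (true_pos + false_pos)

# Illustrative example: 1% prevalence, 90% sensitivity, 95% specificity.
p = posterior_positive(prior=0.01, sensitivity=0.90, specificity=0.95)
print(round(p, 3))  # 0.154
```

Even with a quite accurate test, a positive result here implies only about a 15% chance of disease—far below the 90%+ that System 1 typically guesses, because it neglects the low prior (base-rate neglect).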

To tackle complexity and uncertainty today we turn to design thinking and data science to analyze, synthesize, simulate, and optimize inputs, outputs, and outcomes; to gain insights that exercise our intuition; and to deliver customer value. No individual has all the expertise and experience for this challenge. Culturally diverse, transdisciplinary teams are the only way to go!

4. Automatic associations

Our nonconscious mind automatically judges data to be more important or probable if they are recent, vivid, readily available, or coherent. The converse is true: data that are not recent, vivid, readily available, or coherent are judged to be less important or probable. This leads to a group of related cognitive biases: (a) ease of recall, (b) availability effects, (c) vividness bias, and (d) narrative fallacy. In the narrative fallacy, a believable good story is judged to be more important or probable, even if it is not.

In the halo effect an entity is judged to be important if it is associated with someone or something that is already considered to be important. For example, politicians like to be photographed with popular movie stars or sports figures because of their halo effects.

When we estimate uncertain quantities, our estimate can be influenced, nonconsciously, by exposure to recent, unrelated numerical data (anchoring effects). The greater the uncertainty of the estimate, the greater the anchoring bias. Because these automatic associations happen nonconsciously, we are completely unaware of these influences.

5. Relative thinking

How we frame an issue affects how our brain perceives it (framing effects). For example, if a doctor informs a patient that a proposed surgery has a 95% chance of survival, the patient will perceive the risk very differently than if the doctor had said the surgery has a 5% chance of death. Framing effects are common; however, we cannot predict the magnitude or direction of the effects, so we must run experiments and learn.

Studies show that people will travel an extra 15 minutes to pay $10 for an item rather than pay $15, saving about 33%. However, they are unwilling to travel an extra 15 minutes to pay $100 for an item rather than $105, saving 4.8%. Why? In both cases one can save $5, so there should be no difference. But there is, and it’s called the reference point effect. Consider how spending an extra $5000 feels when buying a $25,000 car versus spending an extra $5000 when buying a $1,000,000 home.

Going outside when in freezing temperature and snow feels very different if the context is your winter vacation and you love skiing, or if the context is commuting to work (context effects).

6. Social influences

Humans are social creatures and we want to be liked, valued, and respected. We change our behaviors to “fit in” (conformity). When we need answers to a problem we are susceptible to accepting suggestions without too much scrutiny (suggestibility), especially if it’s from a source we “trust.” Like rumors, suggestions can quickly spread through a group (cascades). At work, teams are averse to conflict so they “go along to get along”—also known as groupthink.

Mitigating cognitive biases

To commit the biases to memory (“mindware”) remember SP²AR²: Social influence, Protection of mindset, Personality and habits, Automatic associations, faulty Reasoning, and Relative thinking. Embrace NewSmart Humility! Be humble! Be mindful! Be reflective! Experiment! Learn! Use System 3 tools and experts.

Footnotes


  1. Kahneman D (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. Amazon: http://a.co/dBMfrFv ^
  2. Howard RA, Abbas AE (2015). Foundations of Decision Analysis. 1st ed. Pearson. ^
  3. See Michael A. Roberto (2002). High stakes decision making: The lessons of Mount Everest. HBS Working Knowledge: Business Research for Business Leaders. Available from http://hbswk.hbs.edu/item/3074.html ^
  4. The study of the effects of psychological, social, cognitive, and emotional factors on the economic decisions and behaviors. ^
  5. Also called the “fight-or-flight” response. ^
  6. Heath C, Heath D. Switch: How to change things when change is hard. New York: Broadway Books; 2010. ^
  7. Ego is best understood as “self-concept” which is a collection of beliefs about oneself, including our many identities (gender, racial, professional, etc.) “Self-concept is made up of one’s self-schemas, and interacts with self-esteem, self-knowledge, and the social self to form the self as whole. … The temporal self-appraisal theory argues that people have a tendency to maintain a positive self-evaluation by distancing themselves from their negative self and paying more attention to their positive one.” (For more information see Wikipedia) ^
  8. https://en.wikipedia.org/wiki/List_of_cognitive_biases ^
  9. Spetzler C, Winter H, Meyer J (2016). Decision Quality: Value creation from better business decisions. Wiley; 1st Ed. Amazon: http://a.co/2qF0Ozl ^
  10. Source: http://www.apa.org/topics/personality/ ^
  11. Mayer JD. The personality systems framework: Current theory and development. J of Research in Personality. 2015 June;56:4–14. ^
  12. See http://www.myersbriggs.org/my-mbti-personality-type/mbti-basics/ ^
