Cognitive Biases: The Field Guide to Thinking Errors You Can’t See
Most strategic failures trace to thinking errors the decision-makers never detected.
They follow five predictable patterns — profiled in the vulnerability map below. You can’t outthink what you can’t see. This guide makes them visible.
Your bias exposure across five cognitive systems — each axis maps where thinking errors concentrate.
Cognitive biases are systematic thinking errors wired into every human brain — not personality flaws, not lack of intelligence, not fixable by “trying harder.” They fire automatically via System 1 — before conscious reasoning begins — and intensify under pressure, fatigue, and high stakes. Over 180 biases have been catalogued. This field guide maps the 27 most destructive — and gives you the protocols to neutralise them.
Scan Blind Spots
Your brain is hiding threats from you right now. Learn to find them before they compound into catastrophe.
Calibrate Confidence
You’re more certain than the evidence warrants. Realign subjective confidence with objective probability.
Purge Sunk Costs
The money’s gone. The time’s gone. Cut the emotional anchor and redirect resources to what still has upside.
Filter Social Noise
Consensus feels safe — and that’s exactly why it’s dangerous. Isolate signal from groupthink.
Install Circuit Breakers
Stop relying on discipline. Build automatic error-detection systems that fire when cognitive load is highest.
Cognitive Biases TL;DR: 10 Quick Wins, 10 Myths Busted.
Everything below distilled into 20 cards. Debunk the myths, deploy the interventions. The full science follows.
Why Your Brain Systematically Misleads You
Cognitive biases aren't occasional errors — they're the default operating mode of human cognition. Your brain didn't evolve for truth; it evolved for survival and reproduction. These imperatives often conflict with accurate judgment.
You're making decisions with a brain that evolved for survival on the African savanna 200,000 years ago. The cognitive shortcuts that kept your ancestors alive now systematically mislead you in modern environments where threats are abstract, feedback is delayed, and complexity exceeds intuitive comprehension.
The Evolutionary Logic of Biased Thinking
Natural selection optimized your ancestors for speed over accuracy in life-or-death situations. When rustling bushes might signal a predator, those who paused to gather comprehensive evidence became lunch. Those who jumped first — even if wrong 90% of the time — survived to pass on their genes.
"Berries from this bush are poisonous" — re-testing that hypothesis every time was fatal. Stable beliefs = survival.
CEOs ignore market signals contradicting their business model. Investors hold losing positions while collecting supportive articles.
Viscerally remembering your cousin killed by a lion = appropriate precautions. Vivid memories weighted over statistics.
Irrational fear of plane crashes (vivid news) while ignoring car accidents — despite cars being 100x more dangerous.
First information enabled quick decisions with incomplete data — better than analysis paralysis when predators were near.
First salary number anchors final settlement. Opening bid determines auction outcomes — all independent of actual value.
Ancestors who abandoned tools they invested time creating were outcompeted by those who persisted through difficulty.
Companies pour billions into failing projects because they've "already invested so much." Relationships persist despite toxicity.
The Neuroscience of Systematic Error
Cognitive biases aren't personality flaws or education gaps — they're built into brain architecture. Understanding the neural mechanisms reveals why biases are so stubborn and why debiasing requires systematic intervention rather than willpower.
System 1
- Fast, automatic, unconscious
- Parallel processing, effortless
- Emotional, pattern-matching
- Operates continuously
- Generates intuitions & first impressions
System 2
- Slow, deliberate, conscious
- Serial processing, effortful
- Logical, calculation-based
- Activated on demand
- Can override System 1 — but requires resources
System 1 operates automatically; System 2 requires activation. Most decisions default to System 1 unless you deliberately engage System 2. Under time pressure, stress, or cognitive load, System 2 resources deplete and System 1 dominates — meaning biases intensify precisely when stakes are highest. This is why the Decision Audit protocol trains System 2 activation under pressure.
Confirming evidence activates reward pathways — literally creating pleasure from supporting information (Kaplan et al., 2016). Disconfirming evidence activates the anterior insula and amygdala — regions associated with pain, disgust, and threat detection. Your brain treats contradictory evidence as a threat to be defended against rather than information to consider neutrally.
Your brain's default mode network (DMN) — active during rest and mind-wandering — constructs narratives and causal explanations from limited information (Buckner et al., 2008). The DMN excels at pattern completion: filling gaps, inferring causation, creating coherent stories from fragments. When it processes incomplete information (which is always), it doesn't flag uncertainty — it confidently fills gaps with plausible but potentially false explanations.
- Amygdala (System 1): Threat detection center. Activates when encountering information that contradicts beliefs, treating disconfirming evidence as a physical threat. Drives the defensive response that makes confirmation bias feel protective rather than distorting.
- Ventral Striatum (System 1): Reward processing hub. Releases dopamine when you encounter confirming evidence, literally making agreement feel pleasurable. This is why "I told you so" feels rewarding and why we seek information that validates existing beliefs.
- Ventromedial PFC (System 1): Integrates emotion into decision-making. Assigns emotional weight to options, creating "gut feelings" that bypass deliberate analysis. Damage here eliminates intuitive judgment but also eliminates many biases.
- Lateral Prefrontal Cortex (System 2): Executive control center. Supports working memory, logical reasoning, and cognitive inhibition. Can override System 1 biases but requires energy, motivation, and available cognitive resources. First to fail under stress or fatigue.
- Anterior Cingulate Cortex (System 2): Conflict monitor. Detects when System 1 intuitions conflict with System 2 analysis. Signals the need for deliberate override. Key for catching biased reasoning before it produces decisions.
- Medial Prefrontal Cortex (DMN): Self-referential processing and narrative construction. Active during mind-wandering, creates coherent stories from fragmentary information, producing narrative fallacy, hindsight bias, and confident false understanding.
- Posterior Cingulate Cortex (DMN): Memory integration and context evaluation. Combines past experiences with current situation to generate intuitive understanding. Fills memory gaps with plausible fabrications, producing rosy retrospection and false certainty.
- Temporo-parietal Junction (DMN): Theory of mind, understanding others' mental states. Drives fundamental attribution error by constructing character-based explanations for behavior rather than situational ones.
The Cost of Cognitive Biases
Cognitive biases aren't academic curiosities — they drive strategic failures, investment losses, medical errors, and forecasting disasters. Research quantifying the damage reveals the scale of the problem.
of strategic failures traced to preventable cognitive biases — confirmation bias, overconfidence, and sunk cost fallacy dominating.
Lovallo & Sibony, 2010 — 1,048 major business decisions analyzed
annual underperformance by retail investors vs. market indexes — loss aversion, recency bias, and overconfidence driving systematic buying-high, selling-low behavior.
Barber & Odean, 2000 — behavioral finance landmark study
preventable deaths annually in US hospitals from diagnostic errors, with cognitive biases identified as the leading contributor.
Graber et al., 2005 — systematic debiasing reduces errors 30-50%
30% improvement in forecasting accuracy from bias recognition training — demonstrating biases are reducible, not immutable.
Tetlock, 2005 — 20-year study, 284 experts, 28,000 predictions
Your Brain Is Working Against You — By Design
Cognitive biases aren't bugs — they're features of a brain optimized for ancestral survival, not modern accuracy. System 1 operates automatically and generates biased intuitions before System 2 can engage. The reward pathways that make confirming evidence feel good and threatening evidence feel dangerous ensure you'll defend wrong beliefs with genuine conviction. But the +30% forecasting improvement from training proves these defaults are overridable. The taxonomy that follows maps exactly where each bias operates — and what to do about it.
The Cognitive Bias Taxonomy
Three categories of systematic thinking errors — from perception to memory to social judgment — each with evidence-based debiasing protocols.
- Confirmation Bias
- Availability Bias
- Anchoring Bias
- Recency Bias
- Attentional Bias
- Framing Effect
- Hindsight Bias
- Outcome Bias
- Rosy Retrospection
- Survivorship Bias
- Peak-End Rule
- Attribution Error
- In-Group Bias
- Halo Effect
- Bandwagon Effect
- Authority Bias
- Overconfidence
- Base Rate Neglect
- Gambler's Fallacy
- Loss Aversion
- Optimism Bias
- Sunk Cost Fallacy
- Status Quo Bias
- Choice Overload
- Endowment Effect
- IKEA Effect
Biases of Attention & Perception
These biases determine what information you notice and how you interpret it — occurring before conscious reasoning begins. They corrupt the input layer of your decision-making system.
Confirmation Bias
- Seek disconfirming evidence — actively search for reasons your hypothesis might be wrong
- List 3–5 failure modes before committing to any decision
- Assign a devil's advocate — or argue the opposite position yourself
- Define your exit criteria: "What would I need to see to change my mind?" — then look for it
Availability Bias
- Seek base rates first — ask "What percentage of similar situations historically produced this outcome?" (a worked Bayes sketch follows this list)
- Use statistical reference classes rather than memorable anecdotes
- List 5 counter-examples to your initial impression before deciding
- Separate signal from salience: vivid ≠ frequent, boring ≠ unlikely
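To make "seek base rates first" concrete, here is a minimal sketch of Bayes' rule disciplining a vivid impression. The scenario and every number are hypothetical, chosen only to show how a low base rate should dominate a memorable signal.

```python
def posterior(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """P(condition | signal) via Bayes' rule.

    base_rate        -- P(condition), taken from the reference class
    hit_rate         -- P(signal | condition)
    false_alarm_rate -- P(signal | no condition)
    """
    p_signal = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / p_signal

# Hypothetical: a vivid warning sign fires for 80% of true cases,
# but also for 10% of non-cases, and the event's base rate is 2%.
print(round(posterior(0.02, 0.80, 0.10), 2))  # 0.14 -- far lower than it feels
```

Even a signal that feels diagnostic leaves the probability near 14% once the 2% base rate is respected; that correction is exactly what availability bias suppresses.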
Anchoring Bias
- Generate your estimate before exposure to potential anchors
- In negotiations, anchor first with an aggressive opening offer
- Consider multiple reference points — not just the first one encountered
- Stress-test with extremes: "What would I estimate if the anchor were 2× higher? 2× lower?"
Recency Bias
- Deliberately review full history — not just recent data points (a contrast sketch follows this list)
- Use structured data collection that weights time periods appropriately
- In reviews, consult notes from the entire period before forming judgment
- Ask: "Would I make the same assessment if events occurred in reverse order?"
Memory Biases
These biases distort how you encode, store, and retrieve memories — creating false certainty about past events and preventing you from learning from experience.
Hindsight Bias
- Document predictions before outcomes are known — write them down with confidence levels (see the journal sketch after this list)
- Keep a decision journal with pre-outcome estimates and reasoning
- Review actual written predictions rather than trusting hindsight-distorted memory
- Ask: "What did I actually believe before I knew the answer?" — then check your records
Outcome Bias
- Evaluate decisions by information available at decision time — not by how things turned out
- Ask: "Was this the right call given what we knew then?" — separate from outcomes
- Reward good process independently of results, especially in organizations
- Review decisions in batches — one lucky outcome doesn't validate risky process
Rosy Retrospection
- Keep contemporaneous records — journals, notes, reviews that capture actual experiences
- Review old records before making comparisons — calibrate memory against reality
- List specific negatives from the "golden" period to restore balance
- Ask: "Am I comparing my full present to an edited highlight reel of the past?"
Social Biases
These biases affect judgment in social contexts, group settings, and when evaluating others — warping how you perceive people and their actions.
Attribution Error
- Generate 3 situational explanations before making any character judgment about others
- Assume circumstances you're not seeing — because you rarely see full context
- Apply the same standard you'd give yourself — "What would excuse this behavior in me?"
- Seek context before judging: ask about constraints, resources, competing priorities
In-Group Bias
- Use structured evaluation criteria before knowing group membership
- Implement blind review processes where possible (resumes, proposals, code)
- Actively seek perspectives from multiple groups — especially those that feel "other"
- Ask: "Would I evaluate this differently if it came from my team vs. another?"
Halo Effect
- Evaluate each dimension independently before forming an overall judgment
- Use structured interviews with separate scoring for each competency
- Challenge coherence narratives: "Am I rating this trait or my overall impression?"
- Blind where possible: separate the evaluator from irrelevant positive impressions
Biases Are Layered — And Each Layer Compounds the Next
Perception biases corrupt what enters your awareness. Memory biases rewrite what you store. Social biases distort how you judge people. A confirmation-biased perception, stored through hindsight bias, evaluated through the halo effect produces decisions that feel absolutely certain while being systematically wrong. Debiasing requires intervention at every layer — not just the one you happen to notice.
The Dual-Process Architecture of Bias
Your brain runs two operating systems simultaneously. One is fast, automatic, and perpetually biased. The other can correct errors — but it's lazy, slow, and easily overwhelmed.
System 1
- Fast — milliseconds to seconds
- Parallel — multiple streams simultaneously
- Associative — connects related concepts
- Effortless — doesn't deplete cognitive resources
- Always on — operates continuously without activation
System 2
- Slow — seconds to minutes
- Serial — one stream at a time
- Rule-based — follows logical principles
- Effortful — depletes cognitive resources
- Lazy by default — only activates when triggered
System 1 Acts First — Every Time
- Frames the problem in specific terms — before you choose how to think about it
- Retrieves accessible memories — not necessarily the most relevant ones
- Generates emotional reactions and intuitive judgments within milliseconds
- Primes certain associations and suppresses others — shaping what "comes to mind"
Why Smart People Aren't Less Biased
Five Conditions That Amplify Bias
The decisions where you most need accuracy are precisely where biases are strongest.
How Biases Compound Into Catastrophe
Now that you understand how dual-process architecture creates systematic errors, the next section identifies the five specific biases that cause the most damage — and the evidence-based protocols to neutralize each one.
The Big Five Costly Biases
Not all biases are equally consequential. These five account for disproportionate value destruction across strategic failures, investment losses, and organizational disasters.
Confirmation Bias
Mechanism Deep-Dive
- Selective search: Asking questions designed to yield confirming answers
- Biased interpretation: Interpreting ambiguous evidence as supportive of existing belief
- Selective memory: Recalling supporting evidence more easily than contradicting evidence
Evidence & Case Studies
- Embraced: "Streaming bandwidth costs too high" — temporary limitation framed as permanent
- Dismissed: "Consumer preferences shifting to convenience" — explained away as niche
- Embraced: "Physical stores provide customer experience" — they wanted to believe
- Dismissed: "Netflix subscriber growth accelerating" — labeled unsustainable
Debiasing Protocol
Overconfidence
The Triple Threat
- Overprecision: "I'm 90% sure sales will be $5–6M" when the true range is $2–10M (see the coverage sketch after this list)
- Overestimation: Believing you'll complete in 3 months what statistically takes 6
- Overplacement: 80% of drivers rate themselves above the median — a statistical impossibility
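One way to make overprecision visible in your own record is to check how often your "90% confident" ranges actually contained the truth. A minimal sketch over hypothetical forecasts:

```python
def interval_coverage(intervals: list[tuple[float, float]], actuals: list[float]) -> float:
    """Fraction of stated ranges that contained the realised value.
    A calibrated forecaster's 90% intervals should cover roughly 90% of outcomes."""
    hits = sum(lo <= actual <= hi for (lo, hi), actual in zip(intervals, actuals))
    return hits / len(actuals)

# Hypothetical "90% sure" sales ranges ($M) versus what actually happened
stated = [(5, 6), (4, 7), (8, 9), (3, 5), (6, 8)]
actual = [4.1, 5.5, 9.8, 4.2, 7.1]
print(interval_coverage(stated, actual))  # 0.6 -- overprecision made visible
```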
Evidence & Case Studies
- $4.6 billion evaporated in 4 months
- Nearly caused systemic financial collapse
- Overprecision underestimated model uncertainty
- Overplacement: believed their expertise made them immune to risks affecting others
Debiasing Protocol
Sunk Cost Fallacy
Compound Mechanism
- Loss aversion: Feeling losses ~2× more intensely than equivalent gains (formalised in the sketch after this list)
- Psychological commitment: Need to justify past decisions to self and others
- Social pressure: Not wanting to appear wasteful or admit mistakes
- Escalation cycle: Each additional investment increases commitment to justify previous investment
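The roughly 2× asymmetry has a standard formalisation in prospect theory's value function. A sketch using the median parameter estimates from Tversky and Kahneman's 1992 cumulative prospect theory work (loss aversion around 2.25, curvature around 0.88):

```python
def prospect_value(x: float, curvature: float = 0.88, loss_aversion: float = 2.25) -> float:
    """Prospect-theory value function: concave for gains, convex for losses,
    with losses scaled up by the loss-aversion coefficient."""
    if x >= 0:
        return x ** curvature
    return -loss_aversion * (-x) ** curvature

print(round(prospect_value(100), 1))   # 57.5   -- felt value of a $100 gain
print(round(prospect_value(-100), 1))  # -129.5 -- a $100 loss hurts about 2.25x as much
```

That multiplier is what the escalation cycle exploits: each incremental loss is weighted heavily enough that doubling down feels cheaper than accepting it.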
Evidence & Case Studies
- Failed product lines kept alive because "we've invested so much in development"
- Bad hires retained because "we invested significant recruiting resources"
- Failing strategies continued because "we've built the organization around this"
Debiasing Protocol
Anchoring Bias
Research Evidence
Debiasing Protocol
Availability Bias
Mechanism
- Retrieval ease heuristic: Brain uses how easily examples come to mind as proxy for frequency
- Recency bias: Recent events weighted more heavily than base rates
- Vividness effect: Dramatic events (shark attacks) perceived as more common than mundane ones (falling airplane parts — which kill 30× more)
- Emotional intensity: Fear-inducing events massively overweighted in probability estimates
Evidence & Case Studies
- After market rises: recent gains easily recalled → investing seems safe → buy high
- After market falls: recent losses easily recalled → investing seems risky → sell low
- This systematic pattern driven by availability bias destroys wealth over decades
Debiasing Protocol
Understanding these biases is step one. The next section maps how these five biases interact differently across professional domains — creating unique vulnerability profiles for strategy, investment, medical, and hiring decisions.
High-Stakes Decision Psychology
Domain Bias Vulnerability Map
How decision structure, feedback quality, and environmental complexity create unique vulnerability profiles across professional contexts.
Business Strategy & Leadership
Bias Vulnerability Profile
Failure Patterns
Strategic Vision Becomes Blinders
Kodak's rationalisations as digital photography emerged:
- "Resolution isn't good enough" — temporary limitation framed as permanent
- "Consumers want physical photos" — preference was for convenience, not medium
- "Our brand protects us" — anchors became liabilities, not advantages
Escalation of Commitment to Failing Strategy
- Nokia continuing Symbian OS despite the iPhone threat
- Blockbuster doubling down on retail despite Netflix's growth
- BlackBerry defending physical keyboards despite touchscreen preference
Debiasing Protocols
Cognitive Bias Vulnerability Profile
Fifteen scenarios that expose your invisible decision-making patterns. Your personalised vulnerability map identifies exactly where your judgment is most compromised — and what to do about it.
Most people believe they think clearly under pressure. This assessment reveals the specific patterns in your cognition that distort judgment — patterns you can't detect through introspection alone.
Answer based on your first instinct — there are no wrong answers. Honest responses produce the most useful profile.
What Comes Next
Start a decision journal. Before every significant decision, write one sentence about the outcome you expect and your confidence level. After one week, review your predictions against reality.
The 90-day protocol converts this entire profile into a structured intervention — systematic daily practice that produces measurable improvement in calibration and judgment quality.
The Debiasing Mastery Protocol
A 90-day systematic programme to identify, counteract, and permanently reduce cognitive biases in your decision-making — from individual recognition through organisational transformation to permanent integration.
Based on Kahneman, Tetlock, Klein, and 40+ years of decision science research
Part 5
Risks, Limitations & The Dark Side
Where debiasing fails — and the dangers of thinking you're immune
The most dangerous thing about learning to counteract cognitive biases is believing you've succeeded. Every debiasing technique has failure modes, and ignoring them creates a more insidious problem than the one you set out to solve: the illusion of objectivity. You become confident in your rationality precisely when you should be most suspicious of it.
Understanding where debiasing techniques break down prevents overconfidence in your own judgment and reveals when alternative approaches are not just preferable — they're necessary. What follows is an honest accounting of the costs, the limits, and the people for whom this approach does more harm than good.
Where Debiasing Fails
These failure modes affect anyone who practises debiasing. But for some, the risks are categorically different.
Who Should Not Use This Approach
Which of these describes you? Honest self-assessment is the first act of debiasing.
Critical Warning
The Overconfidence Risk in Debiasing
Here is the cruellest irony of this entire guide: learning debiasing can create new overconfidence. You know biases exist. You know the countermeasures. Therefore you believe you're immune. This is the bias blind spot — the documented tendency to believe you are less biased than others despite equal vulnerability. It is arguably the most dangerous failure mode of all, because it disarms the very vigilance that makes debiasing effective.
Self-Assessment — Check Any That Apply
- You feel comfortable evaluating domains outside your expertise because "I know how to avoid biases"
- You dismiss others' concerns because they're "obviously just biased" while you're "objective"
- You don't track calibration because you assume debiasing means you're already accurate
- You skip debiasing protocols for decisions where you feel confident
Protection Against Overconfidence
- Maintain calibration tracking — objective data on your accuracy prevents false confidence (a scoring sketch follows this list)
- Practise epistemic humility — debiasing reduces errors, it doesn't eliminate them
- Default assumption: "I'm probably biased" rather than "I'm objective"
- Seek external feedback — others can see your biases when you can't
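A minimal sketch of what "calibration tracking" can mean operationally: a Brier score over (stated probability, outcome) pairs from a decision journal. The entries below are hypothetical; lower scores are better, and always answering 50% scores exactly 0.25.

```python
def brier_score(forecasts: list[tuple[float, bool]]) -> float:
    """Mean squared gap between stated probability and what happened (0 = perfect)."""
    return sum((p - float(happened)) ** 2 for p, happened in forecasts) / len(forecasts)

# Hypothetical (stated probability, did it happen?) pairs
history = [(0.9, True), (0.8, False), (0.7, True), (0.6, True), (0.95, False)]
print(round(brier_score(history), 3))  # 0.361 -- worse than always saying 50%
```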
Failure modes and exclusions describe individual risks. But the deepest limitations aren't personal — they're structural.
The Limits of Individual Debiasing
Most consequential biases operate at organisational and systemic levels. Individual debiasing helps, but it cannot overcome forces that are structurally embedded in the systems where decisions are made.
If you lead a team or influence organisational process, these structural interventions address what personal debiasing cannot.
System-Level Solutions
- Anonymous feedback mechanisms that bypass hierarchical filters — psychological safety surveys, blind suggestion systems, or third-party facilitated retrospectives
- Structured decision protocols embedded in organisational process — Bridgewater's "believability-weighted" decision system reduced bias-driven errors by requiring evidence-based credibility scores
- Red team requirements for high-stakes decisions — designated devil's advocates who are evaluated on the quality of their dissent, not their agreement
- Transparent evaluation criteria published before decisions are made — pre-registered hiring rubrics, investment theses, and promotion criteria that prevent post-hoc rationalisation
- Incentive alignment with long-term accuracy over short-term confidence — rewarding calibrated uncertainty rather than decisive certainty in forecasting roles
The goal was never perfection. It was less wrong, more often — with the humility to know the difference.
The risks of debiasing are real: analysis paralysis, social friction, motivation depletion, and above all, the bias blind spot that makes you overconfident in your own objectivity. Recognise these failure modes before they recognise you.
Your Questions Answered
16 research-backed answers covering the science, practice, and application of cognitive bias awareness — from fundamentals to advanced debiasing protocols.
01. What exactly are cognitive biases?
Cognitive biases are systematic, predictable patterns of deviation from rational judgment — not random errors, but directional mental shortcuts hardwired by evolution.
Your brain processes roughly 11 million bits of sensory information per second, but conscious thought handles only about 50. To bridge that gap, your brain relies on heuristics — mental shortcuts that compress complex decisions into manageable ones (Tversky & Kahneman, 1974). These shortcuts worked brilliantly for ancestral survival but produce predictable errors in modern complex environments.
Unlike random mistakes, biases are directional: they push thinking in specific, identifiable ways. Anchoring pulls estimates toward initial values. Availability bias overweights vivid events. Confirmation bias filters information to match existing beliefs.
An emergency room physician who just treated three heart attack patients will overestimate the probability that the next chest-pain patient is also having a heart attack — that's availability bias, where recent vivid cases dominate judgment even when base rates suggest a much more common diagnosis.
Biases aren't flaws — they're features of a brain optimized for speed over accuracy. The goal isn't elimination but designing systems that catch the predictable errors.
02. Are cognitive biases always bad?
No — and assuming they are represents its own bias. Heuristics evolved because they work remarkably well in the right contexts.
Gerd Gigerenzer's research demonstrates that simple decision rules often outperform complex analytical models in environments with high uncertainty and limited information (Gigerenzer, 2007).
- Heuristics help in time-pressured, low-stakes decisions and environments with clear feedback
- Heuristics hurt in complex, high-stakes decisions requiring analytical precision
A firefighter commander uses the recognition-primed decision heuristic to evacuate a building seconds before the floor collapses — his gut feeling outperforms any analytical model. But the same commander using gut feeling to allocate a $2M budget would likely make worse decisions than a structured cost-benefit analysis.
The goal isn't to eliminate heuristic thinking — it's to know when to trust it and when to override it with deliberate analysis.
03. How many cognitive biases exist?
Over 200 have been named, but the number itself is less important than understanding the underlying mechanisms.
Many catalogued biases overlap or represent different manifestations of the same root processes. Stanovich's work on cognitive architecture identifies a smaller set of processing tendencies that generate the many named biases (Stanovich, 2009).
A product manager tried to memorize 50 biases from a poster. She couldn't apply any in real time. When she narrowed to just three — confirmation bias, sunk cost fallacy, and anchoring — and created a one-page checklist, her team's feature kill rate improved by 30%.
Don't memorize 200 biases. Focus on the 15-20 most impactful in your domain and build systematic habits that address underlying mechanisms.
04. Can you eliminate cognitive biases?
No — and attempting total elimination is itself a bias. The evidence-based approach is reduction through system design, not willpower.
Research consistently shows that awareness alone produces minimal debiasing effects (Larrick, 2004). You can't think your way out of systematic thinking errors because the same brain doing the correcting is the one making the errors.
What works is environmental and procedural intervention: designing decision environments, pre-commitment protocols, and feedback systems — see self-coaching systems.
A portfolio manager who knows about overconfidence bias still can't prevent feeling overconfident. But a pre-commitment checklist requiring her to list three reasons she might be wrong before every trade physically intervenes in the process — reducing overconfidence's impact by 40% in controlled studies.
Stop trying to "unbias" your thinking. Instead, build systems that make biased thinking less consequential.
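As one sketch of "build systems that make biased thinking less consequential": a hypothetical pre-commitment gate that refuses to proceed until disconfirming reasons are on the record. The function, threshold, and example reasons are all illustrative assumptions, not a documented tool.

```python
def precommitment_gate(decision: str, reasons_i_might_be_wrong: list[str]) -> None:
    """Block the decision until at least three distinct disconfirming reasons
    are written down; the intervention lives in the process, not in willpower."""
    distinct = {r.strip().lower() for r in reasons_i_might_be_wrong if r.strip()}
    if len(distinct) < 3:
        raise ValueError(
            f"Blocked {decision!r}: {len(distinct)} disconfirming reasons recorded, need 3."
        )
    print(f"Proceeding with {decision!r}: the disconfirming case is on record.")

precommitment_gate("Increase position in ACME", [
    "The earnings beat may be one-off inventory timing",
    "My thesis leans on a single analyst's channel checks",
    "Sector momentum could be masking firm-specific weakness",
])
```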
05. What is confirmation bias and why is it so dangerous?
Confirmation bias is the tendency to seek, interpret, and remember information that confirms what you already believe — the "king of biases" because it compounds over time.
Every piece of confirming evidence strengthens your belief, creating a self-reinforcing cycle of increasingly distorted mental models (Kunda, 1990). It operates through three mechanisms:
- Selective search: Seeking evidence that confirms rather than testing beliefs
- Biased interpretation: Viewing ambiguous evidence as supporting your position
- Selective recall: Remembering confirming evidence more easily than disconfirming
The most effective countermeasure: deliberately seeking disconfirming evidence. Philip Tetlock's "superforecasters" share this trait above all others (Tetlock & Gardner, 2015).
A startup CEO convinced her product was what the market wants ran a customer survey — but only asked existing users. She ignored competitor analysis showing 60% of churned users cited the exact features she was doubling down on. Three rounds of funding later, the company pivoted to what disconfirming data had shown all along.
Before every important decision, ask "What evidence would change my mind?" — then actively seek it.
06. How do biases affect decision-making under pressure?
Pressure doesn't create new biases — it amplifies existing ones by shifting your brain from deliberate analysis to automatic pattern-matching.
Under stress, your prefrontal cortex yields processing priority to faster, automatic brain systems (Kahneman, 2011). This means more anchoring, more availability bias, and less belief updating.
Pre-commitment protocols — decisions about how you'll decide, made before pressure arrives — are the most effective countermeasure. Surgeons use checklists. Pilots use emergency procedures. Athletes use mental rehearsal.
A surgeon anchored to her initial diagnosis may fail to update mid-procedure because anchoring strengthens under cognitive load. Pre-operative team briefings with explicit if-then protocols reduce this by establishing decision pathways before stress kicks in.
Build your decision system when calm. Design protocols during recovery periods, deploy them under pressure.
07. What's the difference between System 1 and System 2 thinking?
System 1 is your brain's autopilot — fast, intuitive, effortless. System 2 is the deliberate co-pilot — slow, analytical, energy-intensive. Most bias comes from System 1; most correction requires System 2.
Kahneman's dual-process framework (Thinking, Fast and Slow, 2011) shows System 1 continuously generates impressions and impulses. System 2 is supposed to monitor and correct these — but it's fundamentally lazy, often accepting System 1's answers without verification.
This explains why decision fatigue makes biases worse: as System 2 depletes, System 1 runs unchecked.
Israeli parole judges grant parole at a 65% rate after meal breaks but near 0% right before them. The decisions aren't based on case merit — they're driven by System 2 depletion. As mental energy drops, judges default to System 1's easiest answer: deny (status quo).
Don't fight System 1. Design triggers that activate System 2 for important decisions: checklists, pause protocols, and structured decision frameworks.
08. Can awareness of biases actually make them worse?
Yes — this is the "sophistication effect," one of the most counterintuitive findings in debiasing research.
Knowing about biases creates dangerous false confidence: "I know about anchoring, so it can't affect me." Research shows this confidence is almost entirely unjustified (Pronin et al., 2002).
When you believe you've already corrected for a bias, you reduce the vigilance that would actually protect you. Build psychological flexibility as a foundation for genuine improvement.
A senior investment analyst who completed an advanced behavioral finance course became more overconfident in his stock picks — not less. A junior colleague who used a simple pre-trade checklist outperformed him despite having no formal bias training.
Awareness is the beginning, not the end. Pair knowledge with protocols: decision journals, pre-commitment checklists, and accountability structures.
09. How do I debias my decision-making process?
Effective debiasing is about architecture, not willpower. The highest-impact strategies design your environment and process to catch biases structurally.
Larrick's comprehensive review of debiasing (2004) identifies the most effective interventions:
- Pre-commitment protocols: Decide criteria before evaluating options — reduces anchoring and motivated reasoning
- Devil's advocate: Formally assign someone to argue against the dominant position — combats groupthink
- Pre-mortem analysis: "It's one year later and this failed. Why?" — among the most powerful tools available (Klein, 2007)
- Decision journals: Record predictions, reasoning, and confidence before outcomes (Duke, Thinking in Bets)
- Structured accountability: Regular review with a trusted peer
A hospital reduced diagnostic errors by 23% not through additional training, but by implementing a structured diagnostic timeout — a 90-second pause before finalizing any diagnosis where the physician must state one alternative diagnosis and identify what evidence would support it.
Start with a decision journal and one pre-mortem per week. These two practices alone will improve your decision quality more than reading about every bias ever catalogued.
10. What role does the environment play in cognitive biases?
Environment is one of the most powerful — and underutilized — levers for reducing bias impact. How options are presented changes which options are chosen.
Choice architecture affects outcomes independently of the decision-maker's knowledge. Default settings alone shift organ donation rates from 12% to 99.9% between countries — not because of different values, but different form designs (Kahneman, 2011).
- Information ordering: First options get disproportionate weight (anchoring) — present options simultaneously
- Default settings: Status quo bias means defaults are retained — set defaults to the best option
- Physical workspace: Environmental design reduces cognitive load
- Social environment: Diverse teams with psychological safety surface more perspectives
A corporate cafeteria rearranged its food line to place salads before burgers and moved desserts behind an opaque partition. Without changing the menu, salad consumption increased 35% and dessert consumption dropped 20%. The same principle applies to decision environments.
Changing the environment is often easier and more effective than changing the person. Design decision environments to nudge better outcomes.
11. Do experts suffer from cognitive biases?
Yes — and expertise can actually amplify certain biases, particularly overconfidence and resistance to belief updating.
Experts are more susceptible to overconfidence within their domain (Moore & Healy, 2008) and more resistant to updating beliefs — partly because they have more sophisticated arguments for defending existing views.
However, in well-structured domains with rapid, clear feedback (chess, weather forecasting, firefighting), experts develop genuine intuitive expertise. The key variable is feedback quality.
Experienced radiologists miss approximately 30% of visible lung cancers on chest X-rays — not because they lack knowledge, but because expertise creates pattern-matching shortcuts that overlook anomalies. Hospitals that added AI as a second reader saw detection rates improve by 11%.
Expertise makes you more capable but not less biased. The best experts combine domain knowledge with active debiasing systems.
12. How do cognitive biases affect teams and organizations?
Groups don't average out biases — they amplify them through social dynamics that reward conformity and suppress dissent.
Irving Janis's research on groupthink (Janis, 1982) demonstrated how social pressure toward consensus produces worse decisions:
- Shared information bias: Teams discuss what everyone already knows while ignoring unique knowledge
- Social conformity: People adjust opinions toward the group norm, even when they privately disagree
- Authority bias: The highest-status person's opinion anchors discussion
- Polarization: Groups reach more extreme positions than any individual member held initially
Amazon's "six-page memo" tradition requires meeting leaders to write structured analyses that participants read silently before discussion — ensuring independent thinking occurs before social influence kicks in.
The highest-performing teams aren't bias-free — they're bias-aware and structurally designed to surface dissent.
13. What is the bias blind spot?
The bias blind spot is the tendency to see cognitive biases in others while failing to recognize them in yourself — a meta-bias that undermines all other debiasing efforts.
Emily Pronin's research (Pronin et al., 2002) found approximately 85% of people rate themselves as less biased than average — a statistical near-impossibility. The mechanism is "naïve realism": the conviction that you see the world objectively while others' views reflect their biases (Ross & Ward, 1996).
Hiring managers who completed bias training consistently rate their own decisions as less biased than colleagues' — while showing identical levels of actual bias in blind evaluations. The training increased confidence without improving performance.
The moment you're certain you're being objective is when you should be most suspicious. Build healthy self-skepticism as a core skill.
14. How do emotions interact with cognitive biases?
Emotions don't just influence biases — they activate specific ones. Each emotional state opens the door to a predictable set of cognitive distortions.
The affect heuristic shows people replace "What do I think?" with "How do I feel?" — allowing emotional state to drive ostensibly rational judgments (Finucane et al., 2000). Jennifer Lerner's research maps specific emotions to specific biases (Lerner et al., 2015):
- Fear activates loss aversion and worst-case thinking
- Excitement amplifies optimism bias and overconfidence
- Anger increases stereotyping and reduces analytical depth
- Sadness increases risk-seeking and impatience for immediate rewards
A venture capitalist excited about a charismatic founder rates the startup's market potential 40% higher than when evaluating the same data neutrally. A 24-hour cooling period between pitches and investment decisions reduced this affect-driven distortion significantly.
Never make important decisions at emotional extremes. Build a cooling period and practice stress awareness as a debiasing tool.
15. Can AI and technology help reduce cognitive biases?
Yes — but with an important caveat. AI can structurally support debiasing, but over-reliance creates automation bias.
- Devil's advocate: Systematically generating counterarguments human teams struggle to produce
- Base rate retrieval: Surfacing statistical base rates humans chronically underweight
- Pre-mortem generation: Exhaustively generating failure scenarios faster than human brainstorming
- Decision audit trails: Automated logging for accountability and later review
However, Parasuraman and Manzey's research (2010) shows people over-rely on automated systems — accepting AI recommendations uncritically.
A consulting firm built an internal AI tool that generates three counterarguments to any strategic recommendation. Within six months, client satisfaction scores rose 15% because recommendations became more nuanced and addressed objections proactively.
Use AI to challenge your thinking, not replace it. The best debiasing combines human self-awareness with technological support.
16. Where should I start if I want to improve my decision-making?
Start with the "Big Three" biases that affect virtually every decision domain, then build one simple practice that improves everything else.
- Confirmation bias: Seeking evidence that confirms existing beliefs — counter by asking "What would change my mind?"
- Overconfidence bias: Overestimating accuracy of your judgments — counter by assigning probability ranges, not point estimates
- Sunk cost fallacy: Continuing failed investments because of past costs — counter by asking "If I were starting fresh, would I make this choice?"
Then begin a decision journal — write down your reasoning and predictions before outcomes, then review monthly (Duke, Thinking in Bets). See building systematic habits for implementation.
A marketing director started a decision journal — one paragraph before each campaign launch with her prediction, reasoning, and confidence level. After three months, she discovered systematic overconfidence in social media and underconfidence in email campaigns. Shifting budget accordingly improved ROI by 28%.
Start a decision journal this week. Write three sentences before each important decision: what you decided, why, and how confident you are. Review in 30 days.
Ready to go deeper? The full Cognitive Bias Myths article provides comprehensive protocols, advanced frameworks, and implementation systems.
Building Bias-Resistant Judgment
From understanding to implementation — your complete framework for systematic decision excellence.
Cognitive biases aren't personality flaws, education gaps, or moral failings — they're hardwired features of human cognition that evolved for ancestral survival, not modern decision-making accuracy.
Your brain systematically misleads you not because something is wrong with you, but because automatic System 1 processing operates according to rules optimized for a different environment than the one you inhabit.
The Compounding Effect
If debiasing improves decision accuracy by 10% across 10,000 consequential career decisions, that's 1,000 better outcomes. Given that major strategic, investment, and hiring decisions carry six- to seven-figure consequences, the cumulative value measures in millions — plus relationships preserved, health protected, and disasters averted.
Business
Strategic pivots that save millions, hiring decisions that build championship teams
Investing
Emotional trading eliminated, risk assessment sharpened, compounding returns protected
Medicine
Diagnostic errors reduced, treatment precision increased, patient outcomes transformed
Personal
Career-defining choices made with clarity, relationships deepened through better judgment
The Practice Requirement
Transformation requires practice, not just knowledge. You cannot read about debiasing and expect improvement any more than reading about flow states produces them.
Your Next Steps
- Next 24 hours: Establish your baseline. Identify your Big Five vulnerability. Start your decision journal. Make 10 calibrated predictions with explicit confidence levels.
- Next 30 days: Build the foundation. Complete the Foundational Protocol through daily bias recognition practice, weekly calibration review, and your first monthly audit.
- Next 90 days: Expand and systematize. Master the full bias taxonomy and implement organizational debiasing within your team. Build domain-specific checklists and establish your accountability partnership.
- 1–3 years: Achieve mastery. Reach 10–15% calibration error with automatic multi-bias pattern recognition. Teach and lead organizational debiasing at scale.
- Seeing patterns others miss
- Avoiding traps others fall into
- Updating beliefs on evidence
- Calibrating confidence to accuracy
- Fewer consequential errors
"The first principle is that you must not fool yourself — and you are the easiest person to fool." — Richard Feynman
What You Need to Remember
Ten principles from dual-process psychology, calibration research, and organizational decision science — distilled into what actually changes behaviour.
Biases are systematic, not random
They follow predictable patterns built into dual-process brain architecture where automatic System 1 operates before deliberate System 2 can intervene.
Five biases drive most catastrophic failures
Confirmation bias, overconfidence, sunk cost fallacy, anchoring, and availability bias account for disproportionate value destruction across strategic failures, investment losses, and medical errors.
Intelligence doesn't protect you
Smart people rationalize biased conclusions more eloquently, and domain expertise is specific rather than transferable to bias resistance.
Biases compound multiplicatively
Confirmation + overconfidence + sunk cost creates catastrophic cascades where initial errors escalate into strategic disasters rather than being corrected.
Awareness alone is insufficient
Knowing biases exist doesn't prevent them. You need systematic debiasing protocols with external feedback loops — not just intellectual understanding.
Stress amplifies every bias
Time constraints, cognitive load, emotional arousal, and ego involvement amplify bias vulnerability precisely when accuracy matters most.
Install checks, don't eliminate biases
Systematic circuit breakers catch biases before they compound. You can't prevent automatic System 1 processing — but you can install checkpoints that trigger deliberate review.
Calibration training works — 30-50% improvement
Tracking predictions against outcomes with confidence levels improves decision accuracy measurably within 6-12 months of deliberate practice.
Organizations need process design, not training
Pre-mortems, red teams, structured decisions, and anonymous feedback mechanisms reduce group-level biases that individual training alone can't address.
Debiasing is a skill requiring practice
Improvement comes from deliberate application with feedback, not from reading about biases. Track metrics, journal decisions, maintain calibration discipline.
References
111 sources cited — journal articles, foundational texts, and landmark studies in cognitive science and behavioural economics
- [1] (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.
- [2] (2003). "Coherent arbitrariness": Stable demand curves without stable preferences. The Quarterly Journal of Economics, 118(1), 73–106.
- [3] (2000). Trading is hazardous to your wealth. The Journal of Finance, 55(2), 773–806.
- [4] (2009). Do retail trades move markets? The Review of Financial Studies, 22(1), 151–186.
- [5] (1998). Ego depletion: Is the active self a limited resource? Journal of Personality and Social Psychology, 74(5), 1252–1265.
- [6] (2008). The brain's default network. Annals of the New York Academy of Sciences, 1124, 1–38.
- [7] (1992). Motivated skepticism. Journal of Personality and Social Psychology, 63(4), 568–584.
- [8] (2006). Playing dice with criminal sentences. Personality and Social Psychology Bulletin, 32(2), 188–200.
- [9] (2006). The anchoring-and-adjustment heuristic. Psychological Science, 17(4), 311–318.
- [10] (2008). Dual-processing accounts of reasoning, judgment, and social cognition. Annual Review of Psychology, 59, 255–278.
- [11] (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13(1), 1–17.
- [12] (1975). Hindsight ≠ foresight. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288–299.
- [13] (2001). First offers as anchors. Journal of Personality and Social Psychology, 81(4), 657–669.
- [14] (2004). Dread risk, September 11, and fatal traffic accidents. Psychological Science, 15(4), 286–287.
- [15] (1988). On cognitive busyness. Journal of Personality and Social Psychology, 54(5), 733–740.
- [16] (2005). Diagnostic error in internal medicine. Archives of Internal Medicine, 165(13), 1493–1499.
- [17] (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44(2), 389–400.
- [18] (2007). How Doctors Think. Houghton Mifflin. Book
- [19] (1994). Beauty and the labor market. American Economic Review, 84(5), 1174–1194.
- [20] (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux. Book
- [21] (1998). Aspects of investor psychology. Journal of Portfolio Management, 24(4), 52–65.
- [22] (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–291.
- [23] (2016). Neural correlates of maintaining one's political beliefs in the face of counterevidence. Scientific Reports, 6, 39589.
- [24] (2007). Performing a project premortem. Harvard Business Review, 85(9), 18–19.
- [25] (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
- [26] (2004). Debiasing. In Koehler & Harvey (Eds.), Blackwell Handbook of Judgment and Decision Making, pp. 316–337. Blackwell Publishing. Chapter
- [27] (2015). Emotion and decision making. Annual Review of Psychology, 66, 799–823.
- [28] (1982). Calibration of probabilities. In Kahneman, Slovic, & Tversky (Eds.), Judgment Under Uncertainty, pp. 306–334. Cambridge University Press. Chapter
- [29] (1978). Judged frequency of lethal events. Journal of Experimental Psychology: Human Learning and Memory, 4(6), 551–578.
- [30] (1979). Biased assimilation and attitude polarization. Journal of Personality and Social Psychology, 37(11), 2098–2109.
- [31] (2010). The case for behavioral strategy. McKinsey Quarterly, 2(1), 30–43.
- [32] (2010). Effect of availability bias and reflective reasoning on diagnostic accuracy. JAMA, 304(11), 1198–1203.
- [33] (2008). The trouble with overconfidence. Psychological Review, 115(2), 502–517.
- [34] (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
- [35] (1987). Experts, amateurs, and real estate. Organizational Behavior and Human Decision Processes, 39(1), 84–97.
- [36] (2007). Pervasiveness and correlates of implicit attitudes and stereotypes. European Review of Social Psychology, 18(1), 36–88.
- [37] (1991). Everyday reasoning and the roots of intelligence. In Voss, Perkins, & Segal (Eds.), Informal Reasoning and Education, pp. 83–105. Lawrence Erlbaum Associates. Chapter
- [38] (2002). The bias blind spot. Personality and Social Psychology Bulletin, 28(3), 369–381.
- [39] (2012). Hiring as cultural matching. American Sociological Review, 77(6), 999–1022.
- [40] (1998). The validity and utility of selection methods in personnel psychology. Psychological Bulletin, 124(2), 262–274.
- [41] (1985). The disposition to sell winners too early and ride losers too long. The Journal of Finance, 40(3), 777–790.
- [42] (2008). Unconscious determinants of free decisions in the human brain. Nature Neuroscience, 11(5), 543–545.
- [43] (2009). What Intelligence Tests Miss. Yale University Press. Book
- [44] (2008). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology, 94(4), 672–695.
- [45] (1987). Behavior in escalation situations. Research in Organizational Behavior, 9, 39–78.
- [46] (2005). Expert Political Judgment. Princeton University Press. Book
- [47] (2015). Superforecasting. Crown Publishers. Book
- [48] (1920). A constant error in psychological ratings. Journal of Applied Psychology, 4(1), 25–29.
- [49] (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.
- [50] (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
- [51] (2008). Heuristics and biases as measures of critical thinking. Journal of Educational Psychology, 100(4), 930–941.
- [52] (1994). Mental contamination and mental correction. Psychological Bulletin, 116(1), 117–142.
- [53] (1980). Feeling and thinking: Preferences need no inferences. American Psychologist, 35(2), 151–175.
- [54] (1983). Is memory schematic? Psychological Bulletin, 93(2), 203–231.
- [55] (2007). Overcoming intuition: Metacognitive difficulty activates analytic reasoning. Journal of Experimental Psychology: General, 136(4), 569–576.
- [56] (2008). Thinking and Deciding (4th ed.). Cambridge University Press. Book
- [57] (2012). Judgment in Managerial Decision Making (8th ed.). Wiley. Book
- [58] (2005). The Iowa Gambling Task and the somatic marker hypothesis. Trends in Cognitive Sciences, 9(4), 159–162.
- [59] (2008). Inflexibility of experts—Reality or myth? Cognitive Psychology, 56(2), 73–102.
- [60] (1985). Effects of stereotypes on decision making. Journal of Personality and Social Psychology, 48(2), 267–282.
- [61] (1999). Overconfidence and excess entry. American Economic Review, 89(1), 306–318.
- [62] (2002). Incorporating the irrelevant: Anchors in judgments of belief and value. In Gilovich, Griffin, & Kahneman (Eds.), Heuristics and Biases, pp. 120–138. Cambridge University Press. Chapter
- [63] (2003). The importance of cognitive errors in diagnosis. Academic Medicine, 78(8), 775–780.
- [64] (1988). Rational Choice in an Uncertain World. Harcourt Brace Jovanovich. Book
- [65] (2006). Frames, biases, and rational decision-making in the human brain. Science, 313(5787), 684–687.
- [66] (1990). The overconfidence effect in social prediction. Journal of Personality and Social Psychology, 58(4), 568–581.
- [67] (1978). Confidence in judgment: Persistence of the illusion of validity. Psychological Review, 85(5), 395–416.
- [68] (2011). Checklists to reduce diagnostic errors. Academic Medicine, 86(3), 307–313.
- [69] (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19(4), 25–42.
- [70] (1991). How We Know What Isn't So. Free Press. Book
- [71] (2002). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press. Book
- [72] (2008). Modeling option and strategy choices with connectionist networks. Judgment and Decision Making, 3(3), 215–228.
- [73] (1992). The weighing of evidence and the determinants of confidence. Cognitive Psychology, 24(3), 411–435.
- [74] (2010). Rational Choice in an Uncertain World (2nd ed.). Sage Publications. Book
- [75] (1998). Cognitive repairs. Research in Organizational Behavior, 20, 1–37.
- [76] (2001). Educating Intuition. University of Chicago Press. Book
- [77] (1982). Groupthink (2nd ed.). Houghton Mifflin. Book
- [78] (2011). The evolution of overconfidence. Nature, 477(7364), 317–320.
- [79] (2011). Before you make that big decision. Harvard Business Review, 89(6), 50–60.
- [80] (1996). On the reality of cognitive illusions. Psychological Review, 103(3), 582–591.
- [81] (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211–228.
- [82] (2002). The calibration of expert judgment. In Gilovich, Griffin, & Kahneman (Eds.), Heuristics and Biases, pp. 686–715. Cambridge University Press. Chapter
- [83] (1999). Unskilled and unaware of it. Journal of Personality and Social Psychology, 77(6), 1121–1134.
- [84] (1997). Prospect theory, rational choice, and international relations. International Studies Quarterly, 41(1), 87–112.
- [85] (1977). Do those who know more also know more about how much they know? Organizational Behavior and Human Performance, 20(2), 159–183.
- [86] (1954). Clinical Versus Statistical Prediction. University of Minnesota Press. Book
- [87] (2011). True overconfidence. Organizational Behavior and Human Decision Processes, 116(2), 262–271.
- [88] (2009). How can decision making be improved? Perspectives on Psychological Science, 4(4), 379–383.
- [89] (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm. Journal of Experimental Social Psychology, 35(2), 136–164.
- [90] (1980). Human Inference. Prentice-Hall. Book
- [91] (1993). The Psychology of Judgment and Decision Making. McGraw-Hill. Book
- [92] (2004). Objectivity in the eye of the beholder. Psychological Review, 111(3), 781–799.
- [93] (1996). Naive realism in everyday life. In Brown, Reed, & Turiel (Eds.), Values and Knowledge, pp. 103–135. Lawrence Erlbaum Associates. Chapter
- [94] (1991). Ease of retrieval as information. Journal of Personality and Social Psychology, 61(2), 195–202.
- [95] (1993). Reason-based choice. Cognition, 49(1–2), 11–36.
- [96] (2011). False-positive psychology. Psychological Science, 22(11), 1359–1366.
- [97] (2007). The affect heuristic. European Journal of Operational Research, 177(3), 1333–1352.
- [98] (2002). Risk and Reason. Cambridge University Press. Book
- [99] (1980). Toward a positive theory of consumer choice. Journal of Economic Behavior & Organization, 1(1), 39–60.
- [100] (2008). Nudge. Yale University Press. Book
- [101] (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453–458.
- [102] (1983). Extensional versus intuitive reasoning. Psychological Review, 90(4), 293–315.
- [103] (1980). Unrealistic optimism about future life events. Journal of Personality and Social Psychology, 39(5), 806–820.
- [104] (2003). Affective forecasting. Advances in Experimental Social Psychology, 35, 345–411.
- [105] (1996). A new look at anchoring effects. Journal of Experimental Psychology: General, 125(4), 387–402.
- [106] (2013). From mindless to mindful practice. New England Journal of Medicine, 368(26), 2445–2448.
- [107] (1982). Debiasing. In Kahneman, Slovic, & Tversky (Eds.), Judgment Under Uncertainty, pp. 422–444. Cambridge University Press. Chapter
- [108] (2007). Gut Feelings: The Intelligence of the Unconscious. Viking. Book
- [109] (2000). When choice is demotivating. Journal of Personality and Social Psychology, 79(6), 995–1006.
- [110] (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515–526.
- [111] (2016). Weapons of Math Destruction. Crown Publishing. Book