Book Notes: Thinking, Fast and Slow
Daniel Kahneman’s Thinking, Fast and Slow is one of those books that reorganises how you see your own mind. Published in 2011, it summarises decades of research on cognitive biases, heuristics, and the dual-process theory of human judgment. Here are my notes, organised around what I found most useful.
The Core Frame: System 1 and System 2
Kahneman divides cognition into two metaphorical “systems”:
- System 1 — fast, automatic, emotional, associative. Fires without effort. Handles driving a familiar route, reading a face, or recognising that 2+2=4.
- System 2 — slow, deliberate, effortful, logical. Needed for any non-trivial calculation, careful argument-following, or fighting an impulse.
The central insight: we think we’re mostly using System 2, but we’re actually mostly using System 1 — and System 2 is often just rationalising what System 1 already decided.
Heuristics and Their Failures
System 1 navigates the world with mental shortcuts (heuristics). These work well on average but fail systematically in predictable ways.
Anchoring
If I ask you to estimate the population of Istanbul and first show you the number 10 million, your estimate will be higher than if I had shown you 2 million — even if you consciously dismiss the anchor as irrelevant.
Lesson: negotiators, appraisers, and judges all fall for anchoring. Always generate your own estimate before seeing any external figure.
Availability heuristic
We estimate frequency by how easily examples come to mind. Plane crashes feel more frequent than they are because they’re vivid and covered extensively; heart disease kills far more but doesn’t make the evening news as dramatically.
WYSIATI — What You See Is All There Is
System 1 builds the most coherent story it can from available evidence, without flagging what it doesn’t know. This is why:
- We’re overconfident (our story feels complete)
- We jump to conclusions on thin evidence
- We don’t naturally ask “what’s missing?”
“The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.”
The Planning Fallacy
One of the most practically important sections: we systematically underestimate how long projects will take and how much they will cost, because we focus on the inside view (our specific plan) rather than the outside view (base rates for similar projects).
Fix: reference class forecasting. Before estimating, ask “What is the track record of similar projects?” Then adjust from that anchor.
This is something I now apply to any estimate I’m asked to make. For ML projects especially, the outside view is brutal but correct.
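The outside-view adjustment is mechanical enough to sketch. A minimal version, assuming you scale your inside-view estimate by the median overrun of a reference class (the overrun factors below are hypothetical, just for illustration):

```python
import statistics

def outside_view_estimate(inside_estimate, past_overrun_factors):
    """Reference class forecasting sketch: anchor on the base rate
    by scaling the inside-view estimate by the median overrun
    observed across similar past projects."""
    return inside_estimate * statistics.median(past_overrun_factors)

# Hypothetical reference class: similar projects ran 1.3x-2.5x over plan.
overruns = [1.3, 1.5, 1.8, 2.0, 2.5]
print(outside_view_estimate(10, overruns))  # a 10-week plan → 18.0 weeks
```

The point isn't the arithmetic — it's that the adjustment starts from the track record, not from your plan.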
What Hasn’t Aged Well
Since publication, the replication crisis has hit many of the priming studies Kahneman relied on. The “Florida effect” (thinking about the elderly makes you walk more slowly) has failed to replicate. Ego depletion is contested.
The dual-system framework itself is a useful metaphor, not a literal description of brain architecture. Don’t mistake it for neuroscience.
The core findings — anchoring, WYSIATI, prospect theory, loss aversion — hold up well. But treat the priming chapters with more scepticism than Kahneman himself might endorse today.
Prospect Theory (the Part I re-read most)
Kahneman and Tversky’s Nobel-winning contribution: humans don’t evaluate outcomes in absolute terms, but relative to a reference point, and losses loom larger than equivalent gains (roughly 2:1).
This predicts:
- Why people take risks to avoid a sure loss but prefer a sure smaller gain to a gamble
- Why framing matters (same thing sounds better as “90% survival” than “10% mortality”)
- Why investors hold losing stocks too long
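The “losses loom larger, roughly 2:1” claim can be made concrete with the value function from Tversky and Kahneman’s 1992 cumulative prospect theory paper, using their median parameter estimates (α = 0.88, λ = 2.25):

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains,
    convex and steeper for losses (loss aversion)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain, loss = value(100), value(-100)
print(-loss / gain)  # 2.25 — a $100 loss "hurts" about 2.25x a $100 gain
```

Note that outcomes are gains and losses relative to a reference point, not absolute wealth — which is exactly why framing and reference points matter so much in the predictions above.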
Key Takeaways
After reading and re-reading sections:
- Slow down on important decisions — force System 2 to engage before committing.
- Pre-mortem on any major project — assume it failed; now explain why. Surfaces blind spots.
- Always ask for the outside view — before planning, find base rates.
- Identify your reference point in any negotiation — it’s controlling you whether or not you notice it.
- Be suspicious of “obvious” conclusions — if a story feels complete, ask what it’s missing.
Next in this series: notes on Kahneman’s collaborators and critics — Gigerenzer’s “Rationality for Mortals” and Thaler’s “Misbehaving”.