Kahneman on decision variability, algorithms, and the organizational roots of bias
The Problem of Noise
When fifty underwriters at an insurance company were given six identical cases, the median difference between their quotes was 56%; executives had expected differences on the order of 10–15%. This gap, this scatter, is what Kahneman calls noise: unwanted variability in judgments that should be identical.
Noise is distinct from bias. Bias is a systematic lean in one direction. Noise is random dispersion — different people, different days, different moods, all producing different answers to the same question. Both are errors, but noise is often invisible because organizations don't measure it.
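The distinction maps onto the standard error decomposition: mean squared error = bias² + variance, where the variance term is the noise. A minimal sketch, using invented underwriter quotes and an assumed true value (neither is from the source):

```python
from statistics import mean, pvariance

true_value = 100.0                     # assumed fair premium for one case
judgments = [118, 125, 105, 130, 122]  # hypothetical quotes from five underwriters

bias = mean(judgments) - true_value    # systematic lean in one direction
noise = pvariance(judgments)           # scatter around the judges' own mean
mse = mean((j - true_value) ** 2 for j in judgments)

# The identity holds exactly: total error splits into bias^2 + noise.
assert abs(mse - (bias ** 2 + noise)) < 1e-9
print(f"bias={bias:.1f}, noise={noise:.1f}, mse={mse:.1f}")
```

Note that even a perfectly unbiased panel (bias = 0) can still be badly wrong on any individual case if the noise term is large, which is why Kahneman treats noise as a ceiling on accuracy.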
"If you have a lot of noise, it sets a ceiling about how accurate you can be. So, noise is a mistake."
Algorithms as Noise Filters
The case for algorithms isn't that they're smarter. It's that they're consistent. Present the same problem twice, get the same output. Humans can't promise that.
- Algorithms filter out noise
- Human biases are the bigger risk — because you can trace and analyze algorithms, but human bias hides in impressions and moods
- The ideal: combine humans and machines, with the machine having the last word
- Humans supply context, impressions, edge-case awareness
- Algorithms integrate information reliably
The danger of override: when people are allowed to override algorithms, they do so too often — and on the basis of impressions that are biased, inaccurate, and noisy.
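The consistency claim is easy to make concrete. An illustrative simulation (all names, weights, and the noise scale are invented, not from the source): the same case is scored twice by a fixed rule and twice by a simulated human whose judgment carries occasion noise.

```python
import random

case = {"age": 45, "claims": 2}  # hypothetical applicant features

def rule(case):
    # Deterministic linear score; weights are made up for illustration.
    return 50 + 0.5 * case["age"] + 10 * case["claims"]

def human(case, rng):
    # Same underlying judgment, plus occasion noise (mood, fatigue, framing).
    return rule(case) + rng.gauss(0, 15)

rng = random.Random(0)
print(rule(case), rule(case))              # identical both times
print(human(case, rng), human(case, rng))  # two different answers to the same case
```

The point is not that the rule is smarter; `rule` and `human` share the same model of the case. The rule simply never disagrees with itself.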
The Upstream Problem
"If you get disastrous outcomes from an algorithm, the problem is upstream. It's in the org, the data — it comes from humans."
Kahneman's most striking point: in sexist hiring, a predictively accurate algorithm will penalize women — because the organization penalizes women. The algorithm merely reflects the system it was trained on.
The fix isn't to retrain the algorithm alone. Fix the organization first. Then retrain.
This reframes the entire AI-bias debate: the problem isn't the mirror, it's what the mirror reflects.
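A toy illustration of the mirror point (all numbers invented): an organization that historically hired equally qualified men at a higher rate. A model fit to predict "was hired" is most accurate precisely when it reproduces that penalty.

```python
from statistics import mean

# (gender, qualified, hired) — hypothetical historical records:
# 100 equally qualified men (80 hired), 100 equally qualified women (50 hired).
history = [("M", 1, 1)] * 80 + [("M", 1, 0)] * 20 \
        + [("F", 1, 1)] * 50 + [("F", 1, 0)] * 50

def hire_rate(gender):
    return mean(hired for g, _q, hired in history if g == gender)

# A "predictively accurate" model of this org just learns these base rates:
model = {"M": hire_rate("M"), "F": hire_rate("F")}
print(model)  # the gap is in the data, not the math
```

Retraining on the same records cannot close the gap, because the gap is the signal the model is rewarded for finding; only changing the organization changes the data.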
Threads
- Expertise & Feedback — Taleb: "I'm not sure they know anything because there's no feedback." Without feedback, there is no learning. Without learning, noise compounds.
- Brooks' Law — Complex systems can't be perfectly partitioned. Human coordination introduces noise at every handoff.
- Content Shock — When information overload meets noisy human judgment, the result is worse decisions, not just slower ones.
- Commitment Over Balance — The sinecure is a noise-generating environment: low feedback, low stakes, low growth.