The most dangerous cognitive state is certainty. When you're certain, you stop seeking new information, dismiss contradictory evidence, and become vulnerable to confirmation bias at maximum strength.
Epistemic humility is the practice of calibrating your confidence to your evidence. A weather forecast might be 90% confident — it explicitly acknowledges a 10% chance of being wrong. Most human beliefs lack this calibration: people feel 95% confident about claims where the evidence justifies 60% confidence, or 99% confident about positions where honest assessment would be 75%.
Philip Tetlock's research on "superforecasters" found that the most accurate predictors share a key trait: they assign probabilities to their beliefs and update them regularly. They think in bets, not certainties. "I'm 70% confident that..." is a more useful and more honest statement than "I know that...".
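Thinking in bets can be made concrete with Bayes' rule: a belief held at 70% confidence should move by a specific amount when new evidence arrives. The sketch below is illustrative (the function name and example probabilities are invented for this example, not from Tetlock's work):

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) given a prior P(H) and the likelihood of the
    evidence E under the hypothesis and under its negation."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# "I'm 70% confident" plus evidence twice as likely if I'm right (0.8 vs 0.4):
posterior = bayes_update(0.70, 0.8, 0.4)
print(round(posterior, 3))  # roughly 0.824 — more confident, still not certain
```

Note that even fairly strong evidence moves 70% to about 82%, not to 100% — which is the point: honest updating produces shifted probabilities, not sudden certainty.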
Steel-manning — the opposite of straw-manning — means constructing the STRONGEST version of an opposing argument before responding. If you can't articulate why a reasonable person might hold the opposing view, you don't understand the issue well enough to have a strong opinion. This doesn't require agreeing — it requires understanding.
Concrete practices for epistemic calibration:
The pre-mortem: before committing to a decision, imagine it failed. What went wrong? This activates critical thinking that optimism bias suppresses.
The ideological Turing test: can you describe the opposing position so well that an adherent would recognize it as their own? If not, your understanding is insufficient for strong opinions.
Update tracking: keep a record of beliefs you've changed. If you can't remember the last time you changed your mind about something significant, you're probably not updating on evidence.
Confidence calibration: for important beliefs, ask: "What evidence would change my mind?" If no possible evidence could change your mind, you're holding a faith position, not an evidence-based one.
The scout mindset (Julia Galef): approaching information as a scout (trying to map reality accurately) rather than a soldier (trying to defend your existing position). Scouts are rewarded for accuracy. Soldiers are rewarded for winning. Most people default to soldier mindset in discussions about topics tied to their identity.
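Update tracking and confidence calibration can be scored rather than just journaled. One standard measure (a sketch, not something the text above prescribes) is the Brier score: record each prediction as a stated probability plus the eventual yes/no outcome, and average the squared error. Lower is better; 0.25 is what always guessing 50% would score.

```python
def brier_score(forecasts: list[tuple[float, int]]) -> float:
    """Mean squared error of probabilistic predictions.
    Each entry is (stated probability, outcome) with outcome 1 or 0."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# A hypothetical prediction log: probability stated in advance, then what happened.
log = [(0.9, 1), (0.7, 0), (0.6, 1)]
print(round(brier_score(log), 2))  # 0.22 — slightly better than coin-flipping
```

Keeping such a log makes overconfidence visible: if your "95% confident" predictions come true only 70% of the time, the score says so, regardless of how certain each belief felt.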
Tip: The single most diagnostic question for epistemic humility is "What specific evidence would change my mind about this?" If you can answer concretely, you're in scout mindset. If you can't — or if you find yourself saying "nothing would change my mind" — you're holding a faith position, not an evidence-based one.
Certainty is the enemy of good thinking. Calibrate confidence to evidence. Steel-man opposing positions before arguing against them. Track your belief updates — if you can't remember the last time you changed your mind, you're not engaging with evidence. Ask: "What specific evidence would change my mind?" If nothing would, you're holding a faith position.