Illusion of Validity
Type: Judgment & Confidence
Also Known As: Confidence-accuracy gap, overprecision
Definition
Believing that our predictions, judgments, and forecasts are more accurate than they actually are. We feel confident because the evidence seems compelling, even when that confidence is unwarranted. The gap between what we believe and reality is where costly mistakes happen.
"I'm 90% confident in this forecast." (Actual accuracy: 60%)
Form
- Evidence is gathered and analyzed
- A judgment or prediction is formed
- High confidence is assigned based on the evidence
- The confidence exceeds actual predictive accuracy
- The gap between confidence and reality goes unnoticed
Examples
Example 1: Expert Forecasting
Analysts predict stock prices with high confidence based on company fundamentals and market trends. When tracked, their predictions are barely better than chance, but the confidence remains high.
Problem: The feeling of understanding exceeds actual predictive power.
Example 2: Medical Diagnosis
A doctor is highly confident in a diagnosis based on symptoms that seem to fit perfectly. The actual diagnosis is something completely different that was missed.
Problem: Pattern-matching creates false certainty.
Example 3: Hiring Decisions
Interviewers feel confident they can assess candidate quality from a 30-minute interview. Research shows unstructured interviews have low predictive validity, but the illusion persists.
Problem: Social interaction feels diagnostic but isn't.
Example 4: Coin Flip Calibration
When people predict coin flips and rate their confidence, well-calibrated people should say 50% confidence. Most say 60-70%, demonstrating the illusion: they feel they can predict randomness.
Problem: The feeling of knowledge exceeds actual knowledge.
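The coin-flip case can be checked numerically with the Brier score (the mean squared error between a stated probability and the 0/1 outcome). On random flips, a forecaster who always says 50% scores better than one who says 70%, even though the second forecaster feels more certain. A minimal simulation, not from the source (the function and variable names are illustrative):

```python
import random

def brier_score(confidences, outcomes):
    """Mean squared error between stated P(heads) and the 0/1 outcome; lower is better."""
    return sum((c - o) ** 2 for c, o in zip(confidences, outcomes)) / len(outcomes)

random.seed(0)
flips = [random.randint(0, 1) for _ in range(10_000)]  # 1 = heads, 0 = tails

# A calibrated forecaster says 50% heads on every flip; an overconfident one says 70%.
calibrated = brier_score([0.5] * len(flips), flips)
overconfident = brier_score([0.7] * len(flips), flips)

print(f"calibrated: {calibrated:.3f}")      # exactly 0.250 for a 50% forecast
print(f"overconfident: {overconfident:.3f}")  # about 0.290 on a fair coin
```

The calibrated forecaster scores 0.25 no matter what the coin does; the overconfident one pays a penalty on every tails that outweighs the small gain on heads.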
Why It Happens
- Coherent stories feel true
- Confirmation bias supports initial assessments
- Similarity-based prediction feels reliable
- The evidence we see feels like all relevant evidence
- We don't get feedback on wrong predictions often enough
How to Counter
- Calibration training: Practice with feedback on confidence levels
- Consider the opposite: What would prove this prediction wrong?
- Reference class forecasting: How did similar predictions turn out?
- Wider intervals: Deliberately expand confidence ranges
- Track predictions: Keep a record and score yourself
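The last two countermeasures can be combined into a simple prediction log: record each forecast with its stated confidence, score it once the outcome is known, and compare average confidence against the actual hit rate. A minimal sketch, not from the source (the `PredictionLog` class and its method names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class PredictionLog:
    """Track probabilistic predictions to measure the confidence-accuracy gap."""
    confidences: list = field(default_factory=list)  # stated P(event), in [0, 1]
    outcomes: list = field(default_factory=list)     # 1 if the event happened, else 0

    def record(self, confidence: float, outcome: int) -> None:
        self.confidences.append(confidence)
        self.outcomes.append(outcome)

    def mean_confidence(self) -> float:
        return sum(self.confidences) / len(self.confidences)

    def hit_rate(self) -> float:
        return sum(self.outcomes) / len(self.outcomes)

    def gap(self) -> float:
        """Positive gap means overconfidence: stated confidence exceeds accuracy."""
        return self.mean_confidence() - self.hit_rate()

# Usage: the forecaster from the Definition, who says "90%" but is right 60% of the time.
log = PredictionLog()
for outcome in [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]:
    log.record(0.9, outcome)

print(f"confidence {log.mean_confidence():.0%}, "
      f"accuracy {log.hit_rate():.0%}, gap {log.gap():+.0%}")
```

Scoring only works if every prediction is logged before the outcome is known; selectively remembering the hits is exactly how the illusion sustains itself.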
The Confidence-Accuracy Gap
Research consistently shows:
- People are overconfident in their judgments
- The gap is larger for difficult tasks
- Experts are often as overconfident as novices
- The gap persists even with feedback
Well-calibrated people are rare and valuable.
Related Concepts
- Overconfidence Effect – Broader overestimation of abilities
- Confirmation Bias – Supports the illusion with selective evidence
- Hindsight Bias – After the fact, we misremember our forecasts as having been accurate
- Optimism Bias – Positive predictions feel more valid
References
- Kahneman, D., & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80(4), 237-251.
- Einhorn, H. J., & Hogarth, R. M. (1978). Confidence in judgment: Persistence of the illusion of validity. Psychological Review, 85(5), 395-416.
- Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980.
Part of the Convergence Protocol – Clear thinking for complex times.