Something that I’m chewing on that I haven’t quite pinned down (or properly developed a view on): Bayesian thinking may well be the most effective way to think when faced with uncertainty, but Bayes’ Theorem may be the wrong way to teach it.
This tweet brought to you by the observation that some of the most intuitive Bayesian thinkers I know don’t explicitly update using percentages.
Instead they seem to do something different: many of them generate multiple explanatory stories and hold them loosely.
The end result might be a Bayesian updating process, but the internal machinery is very different.
Human brains don’t seem particularly well suited to calculating priors and percentages but they seem particularly well suited to generating explanatory narratives.
Mathematically inclined people seem to enjoy talking about Bayes’ Theorem, and they seem to be able to explicitly calculate priors/percentages.
But I wonder if that’s the only way to get there. It seems to go against the grain of the mind (eww maths; yay stories).
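For concreteness, here’s a minimal sketch of the explicit priors-and-percentages style of updating the thread is talking about. All the numbers are made up for illustration:

```python
# A single explicit Bayesian update with made-up numbers.
prior = 0.30            # P(H): initial credence that the hypothesis is true
p_e_given_h = 0.80      # P(E|H): chance of seeing this evidence if H is true
p_e_given_not_h = 0.20  # P(E|~H): chance of seeing it if H is false

# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e
print(round(posterior, 3))  # credence after seeing the evidence
```

Mechanically simple, but as the thread notes, actually assigning those percentages is the part that feels unnatural to story-driven thinkers.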
For an entertaining example of this, check out ’s story of investing in a pachinko company.
Totally Bayesian in thinking style; totally not Bayes’ Theorem in execution.
Seems (!) like one application is still working.
Aye! The GJP (Good Judgment Project) is why I’m not 100% sure of this thesis.
But I should note there’s a selection bias inherent to the GJP: the people who become superforecasters are taught to produce and communicate forecasts in explicit percentages, which selects for the math-inclined.
Oh totes, it’s just a theoretical rebuttal to the “forecasting is pointless” trope.
I really respect the superforecasters! I tried their approach and got similar results, but it felt so unpleasant to do. Sort of like spaced repetition - I know it's good but it doesn't harmonize with my style.


