I’ll be honest: I’m still a little shaken over how dismissive I was on the lab leak hypothesis last year.
I’m not sure what the lesson is here, other than ‘always double-check the expert consensus’, but then, to what degree?
Maybe a variation is to flip our burden-of-proof bias: it's easy to apply a lower burden of proof to claims that agree with our in-group or our intuitions, so there's potentially a larger danger of falling prey to blind spots?
I guess my problem with that is that my intuition was “listen to the expert consensus”. Which is usually a good intuition/prior to have!
Hmm I see. Don't have an easy answer.
What makes this hard to debug too is that usually we listen to the expert consensus when we lack the expertise to directly evaluate the facts of the matter (which is required to appropriately critique their consensus)!
It is sort of turtles all the way down in that respect. The role of competent leadership in any serious endeavor often reduces to evaluating & comparing contrasting expert opinions. You can't really *ever* truly outsource judgment if you have a stake in the outcome...
Also related follow-up: reddit.com/r/slatestarcod
TL;DR -- even when Experts all vocally & publicly agree, the intermediary who sits between the Experts & the Audience, & who gets to ask & frame the questions, has the biggest impact on the public perception of "Expert opinion"
The older I get, the more I think that critical thinking involves filing most things under 'wait and see.'
That's easier said than done, though, if knowing the true causality a year ago would have changed your actions. I hope no harm was done by maybe being wrong about this.
Yeah, I think this echoes the earlier point about stakes. For most people, being wrong about the lab-leak hypothesis isn't very impactful. But being wrong about, say, the role of aerosol transmission is: it has clear implications for everyday risk reduction.