it's ludicrous how few people know about this paper, so, friendly reminder that the fermi paradox was completely resolved in 2018 and it turned out to be because multiplying point estimates of highly uncertain parameters is very bad actually
arxiv.org/abs/1806.02404
there was even an SSC post about this paper and still nobody knows about it smh
just an embarrassing chapter in the intellectual history of humanity tbh. decades of ink spilled over what amounts to a failure to understand that the product of a bunch of independent random variables is ~lognormal (ish) and a highly uncertain lognormal has a very heavy tail
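The heavy-tail point is easy to see by simulation. Here is a minimal sketch with made-up illustrative log10 means and spreads for the seven Drake factors (these are NOT the paper's fitted distributions; the only deliberate choice is giving one factor, the f_l slot, a huge uncertainty, per the paper's argument):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (made-up) log10 means and SDs for the seven Drake factors
# R*, f_p, n_e, f_l, f_i, f_c, L -- NOT the paper's fitted distributions.
log_means = np.array([0.0, -0.5, 0.5, -1.0, -0.5, -0.5, 3.0])
log_sds   = np.array([0.5,  0.5, 0.5, 10.0,  2.0,  2.0, 2.0])  # f_l dominates

samples = rng.normal(log_means, log_sds, size=(1_000_000, 7))
N = 10.0 ** samples.sum(axis=1)            # product of lognormals is lognormal

point_estimate = 10.0 ** log_means.sum()   # "plug in the medians" answer

print(f"point estimate : {point_estimate:.3g}")  # ~10 civilizations
print(f"sample median  : {np.median(N):.3g}")    # close to the point estimate
print(f"sample mean    : {N.mean():.3g}")        # enormous, driven by the tail
print(f"P(N < 1)       : {(N < 1).mean():.3f}")  # large despite the big mean
```

The point estimate and the median both say "plenty of civilizations", yet with these spreads nearly half of the probability mass sits below N = 1, i.e. an empty sky is unsurprising. That is the paper's dissolution of the paradox in one toy computation.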
Replying to
my favorite point that isn't just "lol git gud at probability" is that by far the largest uncertainty in the drake equation is the rate at which earth-like planets produce life; they argue for uncertainty spanning ~200 orders of magnitude, which is where the tail comes from
Replying to
Moreover, if those independent random variables are themselves roughly lognormal, then conditioning on the observed product shifts each term's posterior point estimate by a variance-weighted share of the surprise in the observation.
In other words, there is no particular reason to believe in a single "great filter".
When we observe that the result of the Drake equation is lower than expected, there's no reason to pin it on just one term; we should update our posterior on all of the terms downwards, each by a log factor proportional to the variance of its log prior.
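This variance-weighted update is a standard Gaussian conjugacy fact: if the log10 factors are independent Gaussians and we condition on their sum, each term absorbs a share of the discrepancy proportional to its prior variance. A toy sketch with hypothetical numbers (not fitted to anything):

```python
import numpy as np

# Hypothetical Gaussian priors on log10 of each Drake factor
# (illustrative numbers only; the wide one stands in for f_l).
mu  = np.array([0.0, -0.5, 0.5, -1.0, -0.5, -0.5, 3.0])
var = np.array([0.5,  0.5, 0.5, 10.0,  2.0,  2.0, 2.0]) ** 2

# Suppose we "observe" log10 N = s, well below the prior sum of 1.0.
s = -2.0

# Conditioning the joint Gaussian on the sum: each term's posterior mean
# shifts by its variance-weighted share of the total surprise.
shift = (s - mu.sum()) * var / var.sum()
posterior_mu = mu + shift

print(np.round(shift, 3))  # the widest prior absorbs almost all the blame
```

Because one term's prior variance dwarfs the rest, it soaks up nearly the entire downward update, which is the sense in which a low observation does not force a single dramatic "great filter" in any particular place.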
Replying to
To me it is still a significant step for modern civilization that this question was asked at all, even if we went about working on it the wrong way for so long.
Replying to
I didn't quite get what the very heavy tail comment here signifies exactly... care to explain pls?
Replying to
this is an absurdly dismissive way to talk about the way science works. slate star codex isn’t a reliable source for anything but feeling superior