Do you find any existential risk concerns compelling? I tend to be pretty nervous about "prevent very bad but very unlikely things from happening" arguments, but I also feel various 20th-century close calls with nuclear weapons were... way too close for comfort.
[Tweet deleted by author]
fwiw I do also think "partial extinction risks" should be a huge-ass priority; *even if* one adopted strictly longtermist utilitarian grounds (which isn't my real ethics, I endorse something more commonsense, but still), because a partial extinction would drastically weaken systemic stability and resilience
[Tweet deleted by author]
[Tweet deleted by author]
nah, I think if you had asked me to compare partial with total extinction, I'd say it's about the same ratio of badness as losing a limb vs. dying. I do not personally feel that losing a limb is "basically as bad as" dying.
And also, even if a hypothetical person ONLY cares about not dying, and cares zero for some reason about losing a limb — which isn't me, but is a plausible value system — they should still really really try not to lose a limb, because it increases their chance of dying a lot
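(A rough back-of-envelope sketch of that argument, with entirely made-up numbers just to show its structure: if losing a limb raises your later chance of dying, someone who only values not dying should still work hard to avoid it, and the same logic carries over to partial vs. total extinction.)

```python
# Toy numbers, not real estimates: how much someone who ONLY cares about
# not dying should still care about losing a limb.
p_die_baseline = 0.01        # assumed chance of premature death, limb intact
p_die_after_loss = 0.05      # assumed (higher) chance after losing a limb

added_risk = p_die_after_loss - p_die_baseline
print(f"Losing the limb multiplies premature-death risk by "
      f"{p_die_after_loss / p_die_baseline:.0f}x (+{added_risk:.0%} absolute).")

# Same structure for the extinction analogy: if a partial extinction raises
# the later chance of total extinction, even a strict
# only-total-extinction-matters view still prioritizes preventing it.
p_total_baseline = 0.01        # assumed chance of eventual total extinction
p_total_after_partial = 0.10   # assumed chance after a partial collapse
print(f"Partial extinction multiplies total-extinction risk by "
      f"{p_total_after_partial / p_total_baseline:.0f}x.")
```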
I first heard this argument from , and I think it's pretty compelling. adamjermyn.com/posts/near_exi
I know he tried to estimate this quantitatively, and mostly concluded that it wasn't as big as "pure" existential risk, but personally I can see ways in which it's comparable (or bigger). adamjermyn.com/posts/near_exi
Can you expand on that? I’m happy to do estimates of related effects!
I don't have anything sophisticated in mind. I just look at your estimates for the various parameters and see ways that they could each be off by a factor of a few, at which point I think you'd conclude that near-existential risk is comparable to "pure" existential risk.
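(To illustrate what I mean, purely hypothetically: if the headline number is a product of a handful of parameters and each could plausibly be off by a factor of ~3, the result can move by more than an order of magnitude, which is enough to flip a "smaller than pure x-risk" conclusion. The numbers below are made up for illustration, not taken from the linked post.)

```python
# Hypothetical illustration: a point estimate built from several multiplied
# parameters, each of which might individually be off by a factor of a few.
point_estimate = 0.02   # assumed "near-existential risk" central estimate
pure_x_risk = 0.10      # assumed pure existential risk it's compared against

factor_per_parameter = 3      # plausible error in each parameter
n_uncertain_parameters = 3    # how many such parameters feed the estimate

# If the errors all happen to push the same direction, the estimate scales by:
combined_swing = factor_per_parameter ** n_uncertain_parameters  # = 27x
print(f"Upper-end estimate: {point_estimate * combined_swing:.2f} "
      f"vs. pure x-risk {pure_x_risk:.2f}")
# 0.02 * 27 = 0.54 > 0.10, so the comparison can flip even though the
# central estimate said near-extinction risk was the smaller concern.
```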


