tl;dr: maximization of any reward function stands a good chance of wiping us out as a side effect, and if the AI can recursively bootstrap its own capabilities, we can't stop it.
Nah. It says "existential".
Nuclear war is ugly, but not even close to an existential threat. This isn't 1971 in either scale or likelihood of escalation.
The only one that poses an existential threat is AI, so it wins by default.
The worst is nukes, but humanity would survive them.
Climate change is more of an annoyance than a threat when you zoom out to this scale.
And drug-resistant diseases will kill many, but again not all.
The fact that the voting is so uniform, and that *AI* gets about as many votes as *climate change*, is horrifying. The reason we're not pressuring politicians and industry to do anything is that people don't really give a shit, because too few people realize how bad things will get.