1% alone is already a big risk
-
"Significant destruction of human civilization" is one of those cases where 1%, or even 0.01%, should not be rounded to 0%.
-
Less likely than that we accomplish it without agi in the same period.
-
For my money, we do it first ourselves.
-
There are two questions here: what are the chances of achieving agi in the next thirty years, and if we do, what are its chances of destroying civilization?
-
Yes. Why not combine them into one?
-
per hundred of what?
-
30 years is just such a short time period for the “destruction of human civilization” bar. If we start now, we'll outlast the 30 years (in some form), and it's somewhat questionable whether we'll start within 30 years.
-
In the next 30 years? I'd give it low single digits. We seem pretty far from agi but can't rule out some big leap forward particularly w/ quantum computing on the horizon. I'm curious how people would rank order likelihood of different technological apocalypses. E.g. Grey goo
-
The missing option is the one I would have chosen

-
You're quite precise indeed