If humanity must end, I wish it were caused by superhuman AIs. I believe such AIs would use the most efficient (and thus least painful) way to get rid of human beings (e.g. the "disintegrator" in Asimov's The Tercentenary Incident?).
-
I get your point. I think that's one of the reasons the Singularity is unpredictable: there are many paths to superhuman intelligence, and they are advancing simultaneously and interacting with each other.
-
Seriously, it's only wishful thinking on my part; there is no evidence it would be so. I do think the outcomes are more or less unpredictable, though.