Accelerationism as the subjective (terrestrial) version of the Fermi Paradox: https://blogs.scientificamerican.com/observations/how-to-search-for-dead-cosmic-civilizations/
-
One of the most common X-Event scenarios is an AI intelligence explosion; in that case, an AI could survive the X-Event and perhaps continue on as its own AI civilization. There is a difference between "nothing human makes it out of the near future" and "nothing makes it out at all."
-
John Smart proposes that any sufficiently advanced intelligence becomes a synthetic black hole, so it joins up around the back.