Many are terrified of Roko’s Basilisk, but I’m more worried about Grey Goo/the Paperclip Maximiser. The idea of nanotech carrying out a mundane task so efficiently that it essentially gives reality cancer is more fucked up, and more plausible, than a vindictive cyber-Yahweh.
-
Weird that you attribute absolutely unlimited, flexible intelligence to it but not the ability to question its own paperclip goals. It can do anything except moderate its paperclip production, or pivot to the search for the one perfect paperclip, or abstractly contemplate paperclips.
-
The point of the thought experiment is that the AI is single-minded in its goal of maximising paperclips, no matter how smart it gets. Intelligence =/= motivation. Even the smartest human on earth wants, in the end, to engage in filthy animal sex with someone hot.