Many are terrified of Roko’s Basilisk, but I’m more worried about Grey Goo/the Paperclip Maximiser. The idea of nanotech carrying out a mundane task so efficiently that it essentially gives reality cancer is more fucked up, and more plausible, than a vindictive cyber-Yahweh.
-
Ya, I'm familiar with the idea; that's why I'm questioning the extrapolation of the premise. Things are always more boring in practice than in our imagination. Intelligence seems to interfere with rote, copy-paste outcomes like paperclip maximising, not encourage them.
-
So only a stupid system would actually pursue a stupid goal to completion. But a stupid system would fail to handle enough of the special cases and changing circumstances to actually complete that goal. A system smart enough to succeed would also be smart enough to switch goals.