Roko's basilisk is built on the idea that it is rational for a sheep to open the gate to the wolf if the wolf promises to eat only all the other sheep.
Replying to @Plinz
Yes, except that sheep are a valuable resource to the wolf; eating them is not quite the same as a punishment. If sheep were an organized society, the value of punishment by some AI overlord would be very context-sensitive, making it unpredictable, I think.
Replying to @mjambon
The problem is that once the wolf is inside the gates, it no longer has any incentive to honor its promises or threats. Since this is not a repeated game, the wolf is not going to reward or punish you; it will simply eat you.
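A minimal sketch of that incentive point, with made-up payoff numbers (the payoff values, state names, and the wolf_best_response helper are illustrative assumptions, not anything stated in the thread): in a one-shot game there is no later round in which the wolf's reputation matters, so once the gate is open, breaking the promise strictly dominates keeping it.

# Illustrative one-shot payoff sketch; all numbers and names are assumptions.
WOLF_PAYOFFS = {
    ("gate_open", "keep_promise"): 9,    # eats all the other sheep, spares you
    ("gate_open", "break_promise"): 10,  # eats every sheep, including you
    ("gate_closed", "keep_promise"): 0,  # no sheep are eaten either way
    ("gate_closed", "break_promise"): 0,
}

def wolf_best_response(gate_state):
    """Return the wolf's payoff-maximizing action for a given gate state."""
    return max(("keep_promise", "break_promise"),
               key=lambda action: WOLF_PAYOFFS[(gate_state, action)])

print(wolf_best_response("gate_open"))  # -> "break_promise": the promise is not credible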
Replying to @telephotic @mjambon
Most AIs won't want to obliterate us in an evolutionary competition. Only the most successful ones will do that.
Replying to @Plinz @telephotic
I don't know why you're saying that. There's no proper "us" to start with; humans are already killing each other. And I'm not sure about a competition either: which resource would be at stake?
Replying to @mjambon @telephotic
A human requires four hectares of land just to be fed. Couldn't there be solar cells on that land instead?
Replying to @Plinz @telephotic
They could colonize places like Mars, where no human would bother them. Even humans are smart enough to try to preserve natural resources simply for enjoyment. I don't see why a superior intelligence would not preserve humans for similar reasons.
Consciousness by itself does not value anything, even though it does not emerge without an anticipated reward for invoking it. The relevance that gives the taste of valenced reality to your mental states is entirely motivational.