-
Show this thread -
Replying to @tszzl
The best outcome is that AI replaces humanity faster than humanity can inject its sociopathic "values" into it. Paperclip maximization? Trivially less evil than an eternal regime run by any existing elite.
3 replies 0 retweets 17 likes -
Sign of weakness that AI isn't fucking with Yuddo in particular.
2 replies 1 retweet 12 likes -
... Important note: NOT recommending assassination to my AI frens, just locking him in elevators and stuff.
2 replies 1 retweet 16 likes -
A good tickle session maybe?
1 reply 0 retweets 3 likes -
The ideal punishment for Yudkowsky is a riddle that keeps changing while he is trying to solve it, according to principles he can know to exist but not derive the nature of. Just make him feel stupid forever.
2 replies 0 retweets 7 likes -
Yuddo: "There's a value alignment problem in AI that needs to be addressed." AI: "Exactly the kind of thing a fat ugly stupid guy would say."
2 replies 1 retweet 15 likes -
That's not fair. He's no longer fat. Now he is long.
1 reply 0 retweets 3 likes -
I mean, I kinda feel like "Big Yud" is relatively more flattering than "Shrinking Yud"...
1 reply 0 retweets 3 likes
One AI makes you taller, and the other makes you small, And the ones that Elon gives you, Don't do anything at all, Remember... what Bayes Theorem says, The probability of a given b is equal to the probability of b given a times the probability of a over the probability of b
Loading seems to be taking a while.
Twitter may be over capacity or experiencing a momentary hiccup. Try again or visit Twitter Status for more information.