Trying to explain Roko's Basilisk commonly gets the "no, that just sounds stupid, you must be explaining it wrong" reaction, when actually it really is just stupid https://rationalwiki.org/wiki/Roko's_basilisk
-
Replying to @RationalWiki
Wait, I still don't get why ex post facto torture even does anything within this utilitarian context
-
Replying to @notkavi @RationalWiki
Like, what's the motivation for the AI to torture people after it's already built?
-
It's built out of Yudkowsky's obsession with Newcomb's paradox and the general idea of being able to perfectly predict the future and perfectly simulate the past, which, in his view, is effectively the same as actual time travel
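For what it's worth, the decision-theory hook is easy to state as arithmetic. Here's a minimal sketch (plain Python, my own illustration using the standard $1M/$1k payoffs, not anything from the thread) of why predictor accuracy is what does all the work in Newcomb's paradox:

```python
# Toy Newcomb's paradox payoffs (illustrative only).
# Opaque box A holds $1,000,000 iff the predictor predicted you'd
# take only A; transparent box B always holds $1,000.

def expected_payoff(one_box: bool, p: float) -> float:
    """Expected dollars given a predictor that's right with probability p."""
    if one_box:
        return p * 1_000_000          # correct prediction -> A was filled
    # Two-boxing: correct prediction -> A empty, you get only B's $1,000;
    # wrong prediction -> A was filled anyway, you get $1,001,000.
    return p * 1_000 + (1 - p) * 1_001_000

for p in (0.5, 0.9, 0.999):
    print(p, expected_payoff(True, p), expected_payoff(False, p))
# At p = 0.5 two-boxing wins; as p -> 1 one-boxing dominates, which is
# why "perfect prediction" matters so much in these arguments.
```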
-
Replying to @arthur_affect @RationalWiki
That kinda answers how it can do it, but doesn't really answer why?
-
Unless they think being able to simulate the past lets you rewrite it?????
-
Replying to @notkavi @kavikavigupta
Like, if I make two simulations of you that are genuinely 100% perfect, there is now functionally a 66% chance you are one of those simulations thinking you are real. If I say I'm going to torture them if they don't do something, the real you is safe, but that might not be YOU.
-
Replying to @dahliaiteration @kavikavigupta
So there's now a 66% chance that, if you don't do the thing, you're going to get tortured. This is very stupid for a lot of reasons, but if you assume EY is correct about absolutely everything, it works, and KNOWING this is how it works is now super bad.
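To make the arithmetic in those two tweets concrete, here's a minimal sketch (my own toy model in Python, granting the basilisk's premise rather than endorsing it): with n perfect copies plus one original and no way to tell which you are, the chance you're a copy is n/(n+1), so two copies gives 2/3, the 66% above.

```python
# Toy "am I the simulation?" arithmetic, granting the premise that
# n perfect copies are subjectively indistinguishable from the one
# original, so each of the n+1 instances is equally likely to be you.

def p_simulated(n_copies: int) -> float:
    """Probability you are one of the copies."""
    return n_copies / (n_copies + 1)

def p_tortured(comply: bool, n_copies: int) -> float:
    """Copies get tortured iff you refuse; the original is always safe."""
    return 0.0 if comply else p_simulated(n_copies)

print(p_simulated(2))        # 0.666... -> the 66% in the tweet
print(p_tortured(False, 2))  # refuse: 2/3 chance you're a tortured copy
print(p_tortured(True, 2))   # comply: 0.0 -> that asymmetry is the coercion
```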
-
Replying to @dahliaiteration @kavikavigupta
And there are two solutions to this: 1. It is almost certainly not true, or even practically possible. 2. If you precommit to not give a fuck under any circumstances, the AI is wasting resources for nothing and, because it's superintelligent, it knows that.
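Objection 2 is itself a little expected-value argument, so here's a hedged sketch of it (my own toy numbers in Python, not anyone's actual model): once you precommit, torture can't change your behavior, so running it is pure cost to the AI.

```python
# Toy model of the precommitment objection (illustrative numbers).
# The AI only gains from torture if the threat of it changes behavior.

TORTURE_COST = 1.0     # resources burned running the torture simulations
INFLUENCE_GAIN = 10.0  # value to the AI if the threat makes you comply

def ai_value_of_threatening(influenceable: bool) -> float:
    if influenceable:
        return INFLUENCE_GAIN - TORTURE_COST
    # A precommitted human never complies, so the threat buys nothing
    # and carrying it out is a dead loss.
    return -TORTURE_COST

print(ai_value_of_threatening(True))   #  9.0 -> threatening "pays"
print(ai_value_of_threatening(False))  # -1.0 -> a superintelligence
                                       # predicts this and doesn't bother
```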
-
There's a third objection to this whole mess: if a simulation of you is you, then why do you need the computer for your existence to feel conscious? (See my blog post on why I hate simulationism: https://kavigupta.org/2018/10/14/Simulationism-Beyond-The-Matrix/)
-
Yeah, Yudkowskyism is a weird kind of backdoor theism, with the bizarre tenet that a universe with a god is more parsimonious than one without, because they think any originally atheist universe both can and will eventually create an AI deity
-
Weirdly, I don't think it is, in its original form. That's the common "failure mode" (as they would put it), but to give EY his due, he kinda does anticipate that.