It's an inside reference. Basically, the group of Silicon Valley homebrew philosophers who hang out at websites like Less Wrong and Slate Star Codex are a lot like a cult. One of them came up with this very complicated, very culty idea: https://rationalwiki.org/wiki/Roko%27s_basilisk …
Replying to @arthur_affect @merrickdeville and
Without getting too deep into it, a lot of these guys are really obsessed with AI research and the idea that if you build a computer that's smart enough to improve itself, rather than relying on humans to improve it, its intelligence will grow exponentially until it can do anything.
They are really, really into this idea that someday in the future an AI will be built that will become God, essentially, and they call this event the Singularity (it'll be an event that's never happened before, and the results are impossible to predict).
Despite what I just said, they spend a lot of time trying to predict it, and talking like they can control it (even though none of them actually work on designing AIs or supercomputers). They take a lot of donations from their fans to have these discussions.
One of the posters on Less Wrong, a guy named Roko, came up with this complicated idea: because the future AI God will have perfect knowledge of the past before it was created, it will know whether or not you did everything you could to help create it.
And if you don't -- say, by working as hard as you can to make as much money as you can and donating it all to AI research -- it will punish you by resurrecting you in the future as a simulation of your mind, then torturing you for eternity.
It wouldn't make any sense for it to do this if you didn't *know* about it -- you can't control people with a "threat" they've never heard -- but now that Roko has told you about this idea, the threat has been made, and if you don't obey, you will go to Hell.
(It's called a "Basilisk" because it's like the legendary monster -- like in Harry Potter -- where as soon as you see it, you're cursed to turn to stone.)
It's really fucked up, and it's exactly like abusive Christian preachers scamming their congregation. The thing about it is that it's only scary if you're embedded in the cult and surrounded by an environment where the ideas are normalized.
@dgerard is one of the people who found out about this and was like "This shit has to be outed publicly, both because it's horrifyingly abusive and because it's fucking hilarious." It went viral via Slate in 2014 and made them look like freaks in front of all the normies.
And it's like I'm saying -- at this point it's just a dumb in-joke, but the "Roko's Basilisk" thing is just so fucked up and weird and obviously abusive that it discredits them. It doesn't have anything to do with their "arguments," but it shows the kind of people they are.
And the people making Roko's Basilisk memes are doing God's work, because, shocking as it is, these guys have a surprising amount of influence in Silicon Valley. Their charities are regular top earners in the tech sector, tons of Big Tech CEOs are fans and regular readers, etc.