Although I deeply appreciate your top-class work here, you must understand that we are both profoundly lazy and tend to approach critique of the rationalist subculture as internet television, including occasionally the bit where you shout at the screen.
Replying to @davidgerard @RiotAtArbys
Laughter is really what they hate most. Their natural element is trying to suck you into something as close to a formal debate as possible and waste your energy and time. They can't stand just being callously dismissed with "Pfft, nerd."
Replying to @arthur_affect @davidgerard
That's the Eliezer Yudkowsky quote the name "SneerClub" comes from in the first place. They get so fucking mad at the idea of being dismissed out of emotional contempt, without anyone even bothering to read their collected reams of evidence. Only Chads and Stacies do that.
It's why popularizing the meme of Roko's Basilisk has been 10,000x more helpful in fighting their bullshit than any long, point-by-point dissection from someone like me could be.
[This Tweet is unavailable.]
It's an inside reference. Basically, the group of Silicon Valley homebrew philosophers who hang out at websites like Less Wrong and Slate Star Codex are a lot like a cult. One of them came up with this very complicated idea that's very culty: https://rationalwiki.org/wiki/Roko%27s_basilisk
Replying to @arthur_affect @merrickdeville
Without getting too deep: a lot of these guys are really obsessed with AI research, and with the idea that if you build a computer that's smart enough to improve itself, rather than relying on humans to improve it, its intelligence will grow exponentially until it can do anything.
They are really, really into this idea that someday in the future an AI will be built that will essentially become God, and they call this event the Singularity (it'll be an event that's never happened before, and its results are impossible to predict).
Despite what I just said, they spend a lot of time trying to predict it, and talking like they can control it (even though none of them actually work on designing real AIs or supercomputers). They take a lot of donations from their fans to have these discussions.
One of the posters on Less Wrong, a guy named Roko, came up with this complicated idea: because the future AI God will have perfect knowledge of the past before it was created, it will know whether or not you did everything you could to help create it.
And if you don't, say, work as hard as you can to make as much money as you can and donate it all to AI research, it will punish you by resurrecting you in the future as a simulation of your mind, then torturing you for eternity.
It wouldn't make any sense for it to do this if you didn't *know* about it -- you can't control people with a "threat" they've never heard -- but now that Roko has told you about this idea, the threat has been made, and if you don't obey, you will go to Hell.
(It's called a "Basilisk" because it's like the legendary monster -- as in Harry Potter -- where as soon as you see it, you're cursed to turn to stone.)