Laughter is really what they hate most. Their natural element is trying to suck you into something as close to a formal debate as possible and waste your energy and time. They can't stand just being callously dismissed with "Pfft, nerd."
Replying to @arthur_affect @davidgerard
That's the Eliezer Yudkowsky quote the name "SneerClub" comes from in the first place. They get so fucking mad at the idea of being dismissed based on emotional contempt, without anyone even bothering to read their collected reams of evidence. Only Chads and Stacies do that.
Replying to @arthur_affect @davidgerard
It's why popularizing the meme of Roko's Basilisk has been 10,000x more helpful in fighting their bullshit than any long point-by-point dissection from someone like me could be.
This Tweet is unavailable.
It's an inside reference. Basically, the group of Silicon Valley homebrew philosophers who hang out at websites like Less Wrong and Slate Star Codex are a lot like a cult. One of them came up with this very complicated, very culty idea: https://rationalwiki.org/wiki/Roko%27s_basilisk
Replying to @arthur_affect @merrickdeville
Without getting too deep: a lot of these guys are really obsessed with AI research and the idea that if you build a computer that's smart enough to improve itself, rather than relying on humans to improve it, its intelligence will grow exponentially until it can do anything.
Replying to @arthur_affect @merrickdeville
They are really, really into this idea that someday in the future an AI will be built that will become God, essentially, and they call this event the Singularity (it'll be an event that's never happened before, and the results are impossible to predict).
Replying to @arthur_affect @merrickdeville
The singularity in particular is a hilarious idea. Exponential growth ends in one of two ways: 1) a sigmoid, when damping factors eventually kick in, or 2) the machine explodes, if damping factors do not exist or are not sufficiently robust. 1/
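The damping argument can be sketched numerically. Below is a minimal Python sketch (the growth rate, carrying capacity, and step count are made-up illustrative parameters, not anything from the thread): an undamped exponential blows up, while a logistic model, where the growth rate shrinks as a ceiling K is approached, flattens out into an S-curve.

```python
def exponential(x0, r, steps):
    """Undamped growth: each step multiplies by (1 + r), so it diverges."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] * (1 + r))
    return xs

def logistic(x0, r, K, steps):
    """Damped growth: the rate is scaled by (1 - x/K), so the curve
    levels off near the carrying capacity K instead of exploding."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + r * x * (1 - x / K))
    return xs

# Hypothetical parameters for illustration only.
exp_curve = exponential(1.0, 0.5, 40)
log_curve = logistic(1.0, 0.5, 100.0, 40)
print(exp_curve[-1])  # keeps growing without bound
print(log_curve[-1])  # saturates just below K = 100
```

Same starting point and growth rate in both runs; the only difference is whether a damping term exists, which is the whole point of the tweet above.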
Replying to @amolitor99 @merrickdeville
The idea of a technological ceiling - maybe some things are just impossible no matter how smart you are, so maybe the world's smartest computer wouldn't be all that godlike at all - seems to make way more sense than assuming otherwise. But they get so offended at it.
Replying to @arthur_affect @amolitor99
It seems likely that godlike AI is our generation’s version of psionics — the excuse for putting fantasy elements into otherwise-rigorous science fiction.
Much like believing in psychic powers (a la Roald Dahl's Matilda), it's a core fantasy for the Silicon Valley crowd that "intelligence" means this universal ability to solve problems, and that once it passes a certain threshold it's literally magic.
Replying to @arthur_affect @avram
Like why all the nerds wanted to play wizards in D&D (and then graduated from that to playing Mage: The Ascension, which continues to be a "rationalist community" obsession and inspired the name of their Discord server).
Replying to @arthur_affect @avram
"If I just think about things hard enough, I can make whatever I want happen. The resource limitations that exist in the real physical world are just a temporary barrier that will fall to the power of genius."
End of conversation