We would like to hand the torch to @GeorgeHemingto1 and @davidgerard who will continue the series of unprecedented exposés. What WILL we find? (Hint: it’s racism)
Aka I need to sleep but one of u create a thread and link it here
Replying to @ArsonAtDennys @RiotAtArbys
Although I deeply appreciate your top-class work here, you must understand that we are both profoundly lazy and tend to approach critique of the rationalist subculture as internet television, including, occasionally, the bit where you shout at the screen.
Replying to @davidgerard @RiotAtArbys
Laughter is really what they hate most. Their natural element is trying to suck you into something as close to a formal debate as possible and waste your energy and time. They can't stand just being callously dismissed with "Pfft, nerd."
Replying to @arthur_affect @davidgerard
That's the Eliezer Yudkowsky quote the name "SneerClub" comes from in the first place. They get so fucking mad at the idea of being dismissed based on emotional contempt without even bothering to read their collected reams of evidence. Only Chads and Stacies do that.
Replying to @arthur_affect @davidgerard
It's why popularizing the meme of Roko's Basilisk has been 10,000x more helpful in fighting their bullshit than any long, point-by-point dissection from someone like me could be.
This Tweet is unavailable.
It's an inside reference. Basically, the group of Silicon Valley homebrew philosophers who hang out at websites like Less Wrong and Slate Star Codex is a lot like a cult. One of them came up with this very complicated idea that's very culty: https://rationalwiki.org/wiki/Roko%27s_basilisk
Replying to @arthur_affect @merrickdeville
Without getting too deep, a lot of these guys are really obsessed with AI research and the idea that if you build a computer that's smart enough to improve itself, rather than relying on humans to improve it, its intelligence will grow exponentially until it can do anything.
Replying to @arthur_affect @merrickdeville
They are really, really into this idea that someday in the future an AI will be built that will become God, essentially, and they call this event the Singularity (it'll be an event that's never happened before, and the results are impossible to predict).
Replying to @arthur_affect @merrickdeville
The singularity in particular is a hilarious idea. Exponential growth ends in one of two ways: 1) a sigmoid, when damping factors eventually kick in, or 2) the machine explodes, if damping factors do not exist or are not sufficiently robust. 1/
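The two regimes in that tweet can be sketched with a toy discrete-time growth model. This is purely illustrative; the function name, growth rate, and carrying capacity here are assumptions for the demo, not anything from the thread. With a damping term (a carrying capacity), growth flattens into an S-curve; without one, it blows up without bound.

```python
def grow(steps, rate=0.5, capacity=None, x0=1.0):
    """Simulate x <- x + rate * x * damping for a number of steps.

    With capacity=None the damping factor is always 1, giving pure
    exponential growth. With a finite capacity the damping factor
    (1 - x/capacity) shrinks as x approaches the capacity, giving the
    logistic (sigmoid) curve. Returns the full trajectory.
    """
    x = x0
    trajectory = [x]
    for _ in range(steps):
        damping = 1.0 if capacity is None else (1.0 - x / capacity)
        x += rate * x * damping
        trajectory.append(x)
    return trajectory

undamped = grow(40)                # grows roughly as 1.5**n: explosion
damped = grow(40, capacity=100.0)  # saturates just below the capacity
```

The point of the toy model is only that the long-run shape is decided entirely by whether a damping term exists, not by how fast the early exponential phase looks: both trajectories are indistinguishable at first.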
The idea of a technological ceiling - maybe some things are just impossible no matter how smart you are, so maybe the world's smartest computer wouldn't be all that godlike at all - seems to make way more sense than assuming otherwise. But they get so offended at it.
Replying to @arthur_affect @amolitor99
It seems likely that godlike AI is our generation’s version of psionics — the excuse for putting fantasy elements into otherwise-rigorous science fiction.
Replying to @avram @amolitor99
Much like believing in psychic powers (a la Roald Dahl's Matilda), it's a core fantasy for the Silicon Valley crowd that "intelligence" means this universal ability to solve problems, and that once it passes a certain threshold it's literally magic.