it's worth acknowledging about the rationalists that they pay lip service to empiricism, but the Big Problem around which they nucleated, AI safety, was a problem where eliezer yudkowsky carefully explained that empiricism was impossible, so theorizing was our only hope
Quote Tweet (replying to @QiaochuYuan, @m_ashcroft, and @sashachapin):
brb starting my own version of LessWrong and calling it WhatIfWeActuallyLookedAtTheTerritory
which is not to say that there aren't empiricists among the rationalists. all of my empiricism about psychology comes from hanging out with CFAR. but the core premise was always "thinking really hard will save the world" and the core group was people drawn to that premise
which includes me btw, we are ragging on past-me here also. "hey you know all that math you've devoted your life to? what if math could literally save the world" would've been nice if it had worked out tbh
structurally eliezer's explanation that The Problem That Will Doom Civilization lives in a remote theoretical future only accessible through mathematics and computer science bears nonzero resemblance to the common cult explanation that The Problem lives in the spirit realm
in the common cult construction you explain that 1) The Reason Everything Is Bad is evil spirits or something, 2) Our Glorious Leader has a unique ability to fight off these spirits, 3) perhaps you too, someday, can contribute to The Spirit War, if you Join Us
in rationality the spirit realm is Tegmark IV, the platonic realm where all mathematically possible structures live, including the most intelligent possible AIs and so forth. the spirit realm influences the material realm through e.g. the programming of computers
i still don't know how true that is! but actually believing this has a gigantic impact on your psyche. everything warps around it. the possible destruction of civilization exerts tremendous psychological gravity. climate change can produce similar psychological effects
at some point in 2018 i was watching this tremendous psychological gravity totally fuck up a conversation the CFAR staff was trying to have and i became convinced that none of us were psychologically prepared to actually confront the possible destruction of civilization
in the common cult construction an important part of how it works is you say that Our Glorious Leader has a unique ability to commune with the spirits so your ability to verify for yourself what is happening in the spirit realm is restricted. you have to take their word for it
the restriction in rationality is not quite as tight, but loosely it works out to "you have to know a certain minimum amount of math / CS to reason about what AIs can and can't do, and if you can't do that you have to take the word of the people who can"
this pitch has a specific effect on people who consider themselves to know enough math / CS to reason about what AIs can and can't do, and a very different effect on people who are... i don't have a nice way to say this... insecure about their intelligence
there's a pattern i've seen mostly women fall into in the rationalist scene, where it seems like they're looking for... how do i say this... epistemic daddies. father figures to tell them what is true and also validate them for being smart
that's a bit of a digression from my main point but it ties into this broader issue in rationalist epistemics around who is considered "smart" enough that you have to defer to their opinions, and the extent to which "intelligence" is framed as gating access to important truths
if you seriously buy the AI safety pitch *and* you feel insecure about your intelligence / your ability to do math you're in an uncomfortable position. the entire shape of your future is being dictated by forces you don't feel capable of understanding. wat do?
there's always been this huge divide in the rationalists that i never really understood, but this is helping me clarify it: it's between the rationalists who believe they can do math and the rationalists who don't. only the first group has unfiltered access to Truth
and they end up functioning as a de facto priest class for the second group, who end up as kind of... hangers-on? groupies? it confused me for a long time (i didn't understand what the second group was getting out of this arrangement) but i think i kinda get it more now
anyway, in case it helps to hear me say it: anyone is capable of doing math. some of you have, for lack of a better word, "math trauma," and that trauma can be resolved, and you too can directly access the platonic realm, you don't have to take anyone else's word for it
wow i wandered through a bunch of topics there sorry apparently i had a lot to get off my chest. the difficulty of dealing with these dynamics is compounded by the fact that you're not allowed to talk about intelligence anymore. extremely annoying meta-dynamics tbh
and tbc by "smart" and "intelligent" in this thread i'm referring to a pretty specific thing, more of a social construction than an innate ability. some people give off "smart" and "intelligent" vibes and other people automatically defer to their opinions and so forth (it me)
mostly i don't notice it but when i do it often feels bad. people should defer to me less, i still mostly don't know anything. eliezer was onto something when he started intentionally cringeposting in the sequences about naruto and stuff. that was good shit
update: part of this discussion is locked but i have been prodded into clarifying that every time i say “rationalist” here i am referring to the bay area community centered around MIRI and CFAR specifically. there are other IRL communities and also the broader online community