in rationality the spirit realm is Tegmark IV, the platonic realm where all mathematically possible structures live, including the most intelligent possible AIs and so forth. the spirit realm influences the material realm through e.g. the programming of computers
the central AI safety pitch is that there are horrible eldritch monsters living in the spirit realm (mathematically possible AIs) and irresponsible magicians (programmers) trying their damnedest to summon them and this is the central problem of civilization
i still don't know how true that is! but actually believing this has a gigantic impact on your psyche. everything warps around it. the possible destruction of civilization exerts tremendous psychological gravity. climate change can produce similar psychological effects
at some point in 2018 i was watching this tremendous psychological gravity totally fuck up a conversation the CFAR staff was trying to have and i became convinced that none of us were psychologically prepared to actually confront the possible destruction of civilization
oh i forgot how much of this i had written already, previous thread
Quote Tweet
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
in the common cult construction an important part of how it works is you say that Our Glorious Leader has a unique ability to commune with the spirits so your ability to verify for yourself what is happening in the spirit realm is restricted. you have to take their word for it
the restriction in rationality is not quite as tight but loosely it works out to "you have to know a certain minimum amount of math / CS to reason about what AIs can and can't do and if you can't do that you have to take other people's words for it who do"
this pitch has a specific effect on people who consider themselves to know enough math / CS to reason about what AIs can and can't do, and a very different effect on people who are... i don't have a nice way to say this... insecure about their intelligence
there's a pattern i've seen mostly women fall into in the rationalist scene, where it seems like they're looking for... how do i say this... epistemic daddies. father figures to tell them what is true and also validate them for being smart
that's a bit of a digression from my main point but it ties into this broader issue in rationalist epistemics around who is considered "smart" enough that you have to defer to their opinions, and the extent to which "intelligence" is framed as gating access to important truths
if you seriously buy the AI safety pitch *and* you feel insecure about your intelligence / your ability to do math you're in an uncomfortable position. the entire shape of your future is being dictated by forces you don't feel capable of understanding. wat do?
there's always been this huge divide in the rationalists i never really understood but this is helping me clarify it; it's between the rationalists who believe they can do math and the rationalists who don't. only the first group of rationalists has unfiltered access to Truth
and they end up functioning as a de facto priest class for the second group, who end up as kind of... hangers-on? groupies? it confused me for a long time (i didn't understand what the second group was getting out of this arrangement) but i think i kinda get it more now
anyway, in case it helps to hear me say it: anyone is capable of doing math. some of you have, for lack of a better word, "math trauma," and that trauma can be resolved, and you too can directly access the platonic realm, you don't have to take anyone else's word for it
wow i wandered through a bunch of topics there sorry apparently i had a lot to get off my chest. the difficulty of dealing with these dynamics is compounded by the fact that you're not allowed to talk about intelligence anymore. extremely annoying meta-dynamics tbh
and tbc by "smart" and "intelligent" in this thread i'm referring to a pretty specific thing, more of a social construction than an innate ability. some people give off "smart" and "intelligent" vibes and other people automatically defer to their opinions and so forth (it me)
mostly i don't notice it but when i do it often feels bad. people should defer to me less, i still mostly don't know anything. eliezer was onto something when he started intentionally cringeposting in the sequences about naruto and stuff. that was good shit
update: part of this discussion is locked but i have been prodded into clarifying that every time i say “rationalist” here i am referring to the bay area community centered around MIRI and CFAR specifically. there are other IRL communities and also the broader online community
Replying to
Isn’t this true for any time anyone is insecure about their intelligence and there is any perceived large threat? AI, bioweapons, nuclear war, famines, pandemics, general economic activity, 2nd amendment rights, etc.
Replying to
“the entire shape of your future is being dictated by forces you don't feel capable of understanding”
This is simply the correct view to take. Some people manage to live with that uncertainty, some have to delude themselves into believing they actually have it all figured out.