in rationality the spirit realm is Tegmark IV, the platonic realm where all mathematically possible structures live, including the most intelligent possible AIs and so forth. the spirit realm influences the material realm through e.g. the programming of computers
the central AI safety pitch is that there are horrible eldritch monsters living in the spirit realm (mathematically possible AIs) and irresponsible magicians (programmers) trying their damnedest to summon them, and that this is the central problem of civilization
i still don't know how true that is! but actually believing this has a gigantic impact on your psyche. everything warps around it. the possible destruction of civilization exerts tremendous psychological gravity. climate change can produce similar psychological effects
at some point in 2018 i was watching this tremendous psychological gravity totally fuck up a conversation the CFAR staff was trying to have and i became convinced that none of us were psychologically prepared to actually confront the possible destruction of civilization
oh i forgot how much of this i had written already, previous thread
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
in the common cult construction an important part of how it works is you say that Our Glorious Leader has a unique ability to commune with the spirits so your ability to verify for yourself what is happening in the spirit realm is restricted. you have to take their word for it
the restriction in rationality is not quite as tight, but loosely it works out to "you have to know a certain minimum amount of math / CS to reason about what AIs can and can't do, and if you can't do that you have to take the word of the people who can"
this pitch has one specific effect on people who consider themselves to know enough math / CS to reason about what AIs can and can't do, and a very different effect on people who are... i don't have a nice way to say this... insecure about their intelligence
there's a pattern i've seen mostly women fall into in the rationalist scene, where it seems like they're looking for... how do i say this... epistemic daddies. father figures to tell them what is true and also validate them for being smart
gah yes this is the insecure nerd girl trap and this is such a good way of putting it
i saw variations of it a couple times and was mostly confused but hearing you talk about your version of it was immensely clarifying so thank you 😅