i had to go back and relearn the most basic human capacities - the capacity to notice and say "i feel bad right now, this feels bad, i'm going to leave" - in order to fight *this*. in order to somehow tunnel out of this story into a better one that had room for more of me
a fun fact about the rationality and effective altruism communities is that they attract a lot of ex-evangelicals. they have this whole thing about losing their faith but still retaining all of the guilt and sin machinery looking for something more... rational... to latch onto
(that's a very irresponsible psychoanalysis of a whole bunch of people i just did there but i've asked some of them about stuff like this, enough that i think i'm not completely making this up)
i really feel like i get it though. i too now also find that once i've had a taste of what it's like to feel cosmically significant i don't want to give it up. i don't know how to live completely outside a story. i've never had to. i just want a better one and i'm still looking
i was really really hoping i would never have to think about AI ever again, y'know, after all this. seeing AI discourse turn up here was like running into an ex i was hoping never to see again
leaving the rationalists was on some level one of the hardest things i've ever done. it was like breaking up with someone in a world where you'd never heard anyone even describe a romantic relationship to you before. i had so little context to understand what had happened to me
i'm finally getting around to reading your leverage post, i was too scared to read it when it first came out. thank you for writing this 🙏 i don't think any of my experiences were near this intense but there's a family resemblance
(this is deeply embarrassing to admit but one reason i always found the simulation hypothesis strikingly plausible is that it would explain how i just happened to find myself hanging around who i considered to be plausibly the most important people in human history)
i had very little direct interaction with leverage but i knew that they were around. geoff anders taught at my first CFAR workshop. at one point i signed up for belief reporting sessions and signed an NDA saying i wasn't allowed to teach belief reporting
at some point i'm gonna actually talk about what it was like to work at CFAR. it was nowhere near as bad as this but we did circle semi-regularly and that periodic injection of psychological vulnerability did really weird things in retrospect
oh my god i need to go to sleep but if anyone happened to actually read this whole thing thank you for your time and i hope you've enjoyed learning a little more about why i'm completely fucking insane
Replying to
Me, should be sleeping, instead wide awake thinking about other people’s brains in a way that benefits me not at all: “gosh this QC guy sure is nuts bless his heart”
Replying to
I did read it, I have some things to add but I'm too tired to not sound rambling atm (I started replying but it was too rambling so I deleted lol)
Replying to
Have you ever written up in full detail why you left? Some details here, but not some key parts, ex: do you no longer think AI alignment is as dangerous as EY claims? It could be true that alignment is as important as ever, while embodying that fact may be psychologically harmful
the ex-rat stories have a dark nightmarish tone which I’m sure is true to their emotional experiences, but the root problem seems to come down to “I was too obsessed and that hurt my mental health”
Replying to
<3 thanks for sharing all this. I just wanna say if writing out this story did ever seem feasible, healthy, and valuable for you, I think it would be so good for others to read
Replying to
As another ex-rat who was deeply fucked with on multiple levels by the rationalist memeplex, I appreciate your writing this, it's grounding to know others have had such similar experiences to me.