(it's so hard to tell this story. i've gone this whole time without telling you that during this entire period i was dealing with a breakup that, if i stood still too long, would completely overwhelm me with feelings i gradually learned how to constantly suppress)
(this is deeply embarrassing to admit but one reason i always found the simulation hypothesis strikingly plausible is that it would explain how i just happened to find myself hanging around the people i considered plausibly the most important in human history)
(because you'd think if our far future descendants were running ancestor simulations then they'd be especially likely to be running simulations of pivotal historical moments they wanted to learn more about, right? and what could be more pivotal than the birth of AI safety?)
god i feel like writing this all out is explaining something that's always felt weird to me about the whole concept of stories and science fiction stories in particular. *i have been living inside a science fiction story written by eliezer yudkowsky*
it didn't happen all at once like this, exactly. the whole memeplex sank in over time through exposure. the more i drifted away from grad school, the more my entire social life consisted of hanging out with other rationalists. their collective water was my water
it would take me years to learn how to articulate what was in the water. i would feel like i was going insane trying to explain how somehow hanging out with these dorks in the bay area made some part of me want to extinguish my entire soul in service of the greater good
i have been accused of focusing too much on feelings in general and my feelings in particular, or something like that. and what i want to convey here is that my feelings are the only thing that got me out of this extremely specific trap i had found myself in
i had to go back and relearn the most basic human capacities - the capacity to notice and say "i feel bad right now, this feels bad, i'm going to leave" - in order to fight *this*. in order to somehow tunnel out of this story into a better one that had room for more of me
a fun fact about the rationality and effective altruism communities is that they attract a lot of ex-evangelicals. they have this whole thing about losing their faith but still retaining all of the guilt and sin machinery looking for something more... rational... to latch onto
(that's a very irresponsible psychoanalysis of a whole bunch of people i just did there but i've asked some of them about stuff like this, enough that i think i'm not completely making this up)
i really feel like i get it though. i too find that once i've had a taste of what it's like to feel cosmically significant, i don't want to give it up. i don't know how to live completely outside a story. i've never had to. i just want a better one and i'm still looking
Replying to
i was really really hoping i would never have to think about AI ever again, y'know, after all this. seeing AI discourse turn up here was like running into an ex i was hoping never to see again
leaving the rationalists was on some level one of the hardest things i've ever done. it was like breaking up with someone in a world where you'd never heard anyone even describe a romantic relationship to you before. i had so little context to understand what had happened to me
i'm finally getting around to reading your leverage post, i was too scared to read it when it first came out. thank you for writing this 🙏 i don't think any of my experiences were near this intense but there's a family resemblance
i had very little direct interaction with leverage but i knew that they were around. geoff anders taught at my first CFAR workshop. at one point i signed up for belief reporting sessions and signed an NDA saying i wasn't allowed to teach belief reporting
at some point i'm gonna actually talk about what it was like to work at CFAR. it was nowhere near as bad as this but we did circle semi-regularly and that periodic injection of psychological vulnerability did really weird things in retrospect
oh my god i need to go to sleep but if anyone happened to actually read this whole thing thank you for your time and i hope you've enjoyed learning a little more about why i'm completely fucking insane
Replying to angelpilled
IMO it's possible to find your own purpose, or break out, but breaking out feels like some kind of loss and defeat, like what you had won't return (it often will, but the feeling sucks)
Replying to
i feel like this story is super on the nose for what lesswrong is about/what it does. should be renamed "the sword of trauma".
yudkowsky.net/other/fiction/
the fiction was always the clearest about what was actually going on