
it was like my soul had accumulated gunk and had gotten obscured and the acid was cleansing it all away, dissolving it. i was a little afraid of how much i was dissolving. i had a hard time remembering what things i needed to hold onto, like knowing it was bad to pee myself
i took notes on the whole experience in workflowy as best as i could and i still have those notes. i was already a rationalist by then and in my notes i speculate on things like "huh maybe psychedelics would be good for rationality training, as a test to see what's really stuck"
on this trip - i have no idea why i'm telling you this story - on this trip i became extremely convinced of a very specific belief i made up on the spot about what a soul was and what it meant to have one, and speculated (to their faces!) whether some of my friends had one
this was a reaction i was having to a conversation we'd had earlier where they said they wouldn't care if they died in their sleep because they wouldn't be conscious to experience it. i found that vaguely horrifying and this was the only way i had to articulate that
which is to say, at the time the only way i knew how to articulate that i felt bad about something was to invent a philosophical position from which i could argue that that thing was inherently, intrinsically bad
the largest part of these notes wasn't even the thing about souls, it was this whole thing i wrote where i was trying to articulate what the acid was even doing to me: "do not let go of: will to live, will not to hurt others"!!! i wrote that with my own hands!
i barely remember being this person. i remember feeling delighted when i wrote all this at the time but looking back it's eerie to me how detached i sound from my own experience. i am so totally insistent on writing as if i am a scientist examining myself under a microscope
"[I]f I think this objective stance is not only possible but desirable-- then what I am teaching him to do is to self-observation, I am training him to examine his own actions and thoughts as if he were a neutral person inside his own mind. But that other person would be me."
sometimes a person writes a sentence that just lodges itself permanently inside of me and that's one of them. that's one of the harpoons sticking out of this little baby right here. "but that other person would be me"
some people understand immediately when i try to explain what it was like to be fully in the grip of the yudkowskian AI risk stuff and for some people it doesn't seem to land at all, which is probably good for them and i wish i had been so lucky
when i came across lesswrong as a senior in college i was in some sense an empty husk waiting to be filled by something. i had not thought, ever, about who i was, what i wanted, what was important to me, what my values were, what was worth doing. just basic basic stuff
the Sequences were the single most interesting thing i'd ever read. eliezer yudkowsky was saying things that made more sense and captivated me more than i'd ever experienced. this is, iirc, where i was first exposed to the concept of a *cognitive bias*
i remember being horrified by the idea that my brain could be *systematically wrong* about something. i needed my brain a lot! i depended on it for everything! so whatever "cognitive biases" were, they were obviously the most important possible thing to understand
"but wait, isn't yud the AI guy? what's all this stuff about cognitive biases?" the reason this whole fucking thing exists is that yud tried to talk to people about AI and they disagreed with him and he concluded they were insane and needed to learn how to think better
so he wrote a ton of blog posts and organized them and put them on a website and started a whole little subculture whose goal was - as coy as everyone wanted to be about this - *thinking better because we were all insane and our insanity was going to kill us*
it would take me a long time to identify this as a kind of "original sin" meme. one of the most compelling things a cult can have is a story about why everyone else is insane / evil and why they are the only source of sanity / goodness
a cult needs you to stop trusting yourself. this isn't a statement about what any particular person wants. the cult itself, as its own aggregate entity, separate from its human hosts, in order to keep existing, needs you to stop trusting yourself
yud's writing was screaming to the rooftops in a very specific way: whatever you're doing by default, it's bad and wrong and you need to stop doing it and do something better hurry hurry you idiots we don't have time we don't have TIME we need to THINK
i had no defenses against something like this. i'd never encountered such a coherent memeplex laid out in so much excruciating detail, and - in retrospect - tailored so perfectly to invade my soul in particular. (he knew *math*! he explained *quantum mechanics* in the Sequences!)
an egg was laid inside of me and when it hatched the first song from its lips was a song of utter destruction, of the entire universe consumed in flames, because some careless humans hadn't thought hard enough before they summoned gods from the platonic deeps to do their bidding
yud has said quite explicitly in writing multiple times that as far as he's concerned AI safety and AI risk are *the only* important stories of our lifetimes, and everything else is noise in comparison. so what does that make me, in comparison? less than noise?
an "NPC" in the human story - unless, unless i could be persuaded to join the war in heaven myself? to personally contribute to the heroic effort to make artificial intelligence safe for everyone, forever, with *the entire lightcone* at stake and up for grabs?
i didn't think of myself as knowing or wanting to know anything about computer science or artificial intelligence, but eliezer didn't really talk in CS terms. he talked *math*. he wanted *proofs*. he wanted *provable safety*. and that was a language i deeply respected
yud wrote harry potter and the methods of rationality on purpose as a recruitment tool. he is explicit about this, and it worked. many very smart people very good at math were attracted into his orbit by what was in retrospect a masterful act of hyperspecific seduction
i, a poor fool unlucky in love, whose only enduring solace in life had been occasionally being good at math competitions, was being told that i could be a hero by being good at exactly the right kind of math. like i said, could not have been tailored better to hit me
i don't know how all this sounds to you but this was the air i breathed starting from before i graduated college. i have lived almost my entire adult life inside of this story, and later in the wreckage it formed as it slowly collapsed
the whole concept of an "infohazard" comes from lesswrong as far as i know. eliezer was very clear on the existence of *dangerous information*. already by the time i showed up on the scene there were taboos. *we did not speak of roko's basilisk*
(in retrospect another part of the cult attractor, the need to regulate the flow of information, who was allowed to know what when, who was smart enough to decide who was allowed to know what when, and so on and so on. i am still trying to undo some of this bullshit)
traditionally a cult tries to isolate you on purpose from everyone you ever knew before them but when it came to the rationalists i simply found that i no longer had anything to say to people from my old life. none of them had the *context* about what *mattered* to me anymore
i didn't even move to the bay on purpose to hang out with the rationalists; i went to UC berkeley for math grad school because i was excited about their research. the fact that eliezer yudkowsky would be *living in the same city as me* was just an absurd coincidence
the rationalists put on a meet-and-greet event at UC berkeley. i met anna salamon, then the director (iirc) of the Center for Applied Rationality, and talked to her. she invited me to their rationality workshop in january 2013. and from that point i was hooked
(it's so hard to tell this story. i've gone this whole time without telling you that during this entire period i was dealing with a breakup that, if i stood still too long, would completely overwhelm me with feelings i gradually learned how to constantly suppress)
(this is deeply embarrassing to admit but one reason i always found the simulation hypothesis strikingly plausible is that it would explain how i just happened to find myself hanging around who i considered to be plausibly the most important people in human history)
(because you'd think if our far future descendants were running ancestor simulations then they'd be especially likely to be running simulations of pivotal historical moments they wanted to learn more about, right? and what could be more pivotal than the birth of AI safety?)
god i feel like writing this all out is explaining something that's always felt weird to me about the whole concept of stories and science fiction stories in particular. *i have been living inside a science fiction story written by eliezer yudkowsky*
it didn't happen all at once like this, exactly. the whole memeplex sunk in over time through exposure. the more i drifted away from grad school the more my entire social life consisted of hanging out with other rationalists exclusively. their collective water was my water
it would take me years to learn how to articulate what was in the water. i would feel like i was going insane trying to explain how somehow hanging out with these dorks in the bay area made some part of me want to extinguish my entire soul in service of the greater good
i have been accused of focusing too much on feelings in general and my feelings in particular, or something like that. and what i want to convey here is that my feelings are the only thing that got me out of this extremely specific trap i had found myself in
i had to go back and relearn the most basic human capacities - the capacity to notice and say "i feel bad right now, this feels bad, i'm going to leave" - in order to fight *this*. in order to somehow tunnel out of this story into a better one that had room for more of me