
in retrospect, maybe a memetic advantage i don't talk or think much about having is that i was AGI-pilled for long enough that i know what having a totalizing narrative about the most important thing feels like from the inside, and i know what it feels like to step out of it
so, less obliquely: i was involved with lesswrong / CFAR / the rationalists from ~2012 to ~2018, briefly worked for both MIRI and CFAR, got to talk to a lot of the higher-ups in the ecosystem, learned a lot from the experience, and have a lot of dirt on unhealthy dynamics
a lot of stuff you wouldn't do if you were calmer becomes justifiable. it becomes easy to justify exerting force in various ways on other people, and there were (and are) people in the ecosystem much better at doing that than others in it could (and can) handle
the rationalist and EA ecosystems select massively for recruiting people (~nerds with poor social skills) who are confused in specific highly correlated ways about e.g. feelings, the body, etc., and these people are, to put it bluntly, very vulnerable to abuse
in retrospect when i was first exposed to these ideas (~2011) i was a tiny infant and i was not prepared to handle them in any real way. i was thinking about saving humanity when - please forgive the dramatic phrasing - i didn't even know what being human meant
one reason i've only talked about this indirectly until now is that in 2019 i wrote a 20-page google doc ranting about some version of this point of view and sent it to a bunch of rats, and some of them were like "oh my god THANK YOU" and some of them got reeeeally angry
i didn't really have the social or emotional resources at the time to sustain any kind of fight, so i gave up and stopped talking about it. but i do still remember all the people who were like "oh my god THANK YOU", and while it's outdated in many ways i stand by a lot of that doc
the other reason i've only talked about this indirectly is that some of the dirt i have is confidential. but like. people talk to me about their feelings and some of those people talk to me about their feelings about other people in the ecosystem and so. now i know things
none of this, by the way, has any particular relevance to the importance of dealing with AGI as a problem. i've been actually spooked about this since alphago vs. lee sedol (before that i was kinda LARPing it), and i'm still spooked but not devoting a lot of attention to it specifically
i was not involved with this particular group, but various dynamics related to what i obliquely described here are gradually becoming more public
Quoted tweet:
Leverage was/is a "high-demand group" in my social circles in the bay area; if you're familiar with it, here's a pretty restrained description of the more undisputable, common-knowledge features of this group: lesswrong.com/posts/Kz9zMgWB