in retrospect, maybe a memetic advantage i don't talk or think much about is that i was AGI-pilled for long enough that i know, from the inside, what having a totalizing narrative about the most important thing feels like, and i know what it feels like to step out of it
so, less obliquely: i was involved with lesswrong / CFAR / the rationalists from ~2012 to ~2018, briefly worked for both MIRI and CFAR, got to talk to a lot of the higher-ups in the ecosystem, learned a lot from the experience, and have a lot of dirt on unhealthy dynamics
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
a lot of stuff you wouldn't do if you were calmer becomes justifiable. it becomes easy to justify exerting force on other people in various ways, and there were (and are) people in the ecosystem much better at doing that than others could (and can) handle
the rationalist and EA ecosystems massively select for recruiting people (~nerds with poor social skills) who are confused in specific, highly correlated ways about e.g. feelings, the body, etc., and these people are, to put it bluntly, very vulnerable to abuse
I’ve been concerned about this for many years, but have been distant enough and busy enough to feel it’s not really my problem, yet also somehow something I should try to help with. I’ve nearly written about it a few times. Maybe writing isn’t what’s needed, though