
so, less obliquely: i was involved with lesswrong / CFAR / the rationalists from ~2012 to ~2018, briefly worked for both MIRI and CFAR, got to talk to a lot of the higher-ups in the ecosystem, learned a lot from the experience, and have a lot of dirt on unhealthy dynamics
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
a lot of stuff you wouldn't do if you were calmer becomes justifiable. it becomes easy to justify exerting force on other people in various ways, and there were (and are) people in the ecosystem much better at doing that than the rest of the ecosystem could (and can) handle
the rationalist and EA ecosystems select massively for recruiting people (~nerds with poor social skills) who are confused in specific highly correlated ways about e.g. feelings, the body, etc., and these people are, to put it bluntly, very vulnerable to abuse
in retrospect when i was first exposed to these ideas (~2011) i was a tiny infant and i was not prepared to handle them in any real way. i was thinking about saving humanity when - please forgive the dramatic phrasing - i didn't even know what being human meant
one reason i've only talked about this indirectly until now is that in 2019 i wrote a 20-page google doc ranting about some version of this point of view and sent it to a bunch of rats, and some of them were like "oh my god THANK YOU" and some of them got reeeeally angry