Conversation

so, less obliquely: i was involved with lesswrong / CFAR / the rationalists from ~2012 to ~2018, briefly worked for both MIRI and CFAR, got to talk to a lot of the higher-ups in the ecosystem, learned a lot from the experience, and have a lot of dirt on unhealthy dynamics
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
a lot of stuff you wouldn't do if you were calmer becomes justifiable. it becomes easy to justify exerting force in various ways on other people and there were (and are) people in the ecosystem much better at doing that than other people in the ecosystem could (and can) handle
the rationalist and EA ecosystems select massively for recruiting people (~nerds with poor social skills) who are confused in specific highly correlated ways about e.g. feelings, the body, etc., and these people are, to put it bluntly, very vulnerable to abuse
in retrospect when i was first exposed to these ideas (~2011) i was a tiny infant and i was not prepared to handle them in any real way. i was thinking about saving humanity when - please forgive the dramatic phrasing - i didn't even know what being human meant
one reason i've only talked about this indirectly until now is because in 2019 i wrote a 20-page google doc ranting about some version of this point of view and sent it to a bunch of rats and some of them were like "oh my god THANK YOU" and some of them got reeeeally angry
i didn't really have the social or emotional resources at the time to sustain any kind of fight, so i gave up and stopped talking about it. but i do still remember all the people who were like "oh my god THANK YOU", and while it's outdated in many ways i stand by a lot of that doc
the other reason i've only talked about this indirectly is that some of the dirt i have is confidential. but like. people talk to me about their feelings and some of those people talk to me about their feelings about other people in the ecosystem and so. now i know things
Replying to
i was not involved with this particular group but various dynamics related to what i obliquely described here are becoming gradually more public
Quote Tweet
Leverage was/is a "high-demand group" in my social circles in the bay area; if you're familiar with it, here's a pretty restrained description of the more undisputable, common-knowledge features of this group: lesswrong.com/posts/Kz9zMgWB
Replying to
I do wish you’d air this in some anonymized form that you’d feel comfortable with—the quality waterline for criticism of the rat community is so garbage that I think your take would probably do more good than harm
1. the inability of rats to understand certain perspectives (body/tension, etc.) is infuriating, and it makes them bad at pursuing their own goals and at connecting with people who do understand those perspectives.
Replying to
It's kind of weird reading an account from someone who got so emotionally invested. I feel like I always defaulted to "wow, that's a really serious issue and I hope it gets some more attention and resources... ok back to everyday life." GPT-3 still spooks me, not in a "it's going to take over the world" way, but rather... erm, I'm reasonably sure it's not going all "I think therefore I am," buuuuut it worries me that I can have coherent philosophical dialogue with it over whether it is itself conscious...