so, less obliquely: i was involved with lesswrong / CFAR / the rationalists from ~2012 to ~2018, briefly worked for both MIRI and CFAR, got to talk to a lot of the higher-ups in the ecosystem, learned a lot from the experience, and have a lot of dirt on unhealthy dynamics
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
a lot of stuff you wouldn't do if you were calmer becomes justifiable. it becomes easy to justify exerting force in various ways on other people and there were (and are) people in the ecosystem much better at doing that than other people in the ecosystem could (and can) handle
the rationalist and EA ecosystems select massively for recruiting people (~nerds with poor social skills) who are confused in specific highly correlated ways about e.g. feelings, the body, etc., and these people are, to put it bluntly, very vulnerable to abuse
in retrospect when i was first exposed to these ideas (~2011) i was a tiny infant and i was not prepared to handle them in any real way. i was thinking about saving humanity when - please forgive the dramatic phrasing - i didn't even know what being human meant
one reason i've only talked about this indirectly until now is because in 2019 i wrote a 20-page google doc ranting about some version of this point of view and sent it to a bunch of rats and some of them were like "oh my god THANK YOU" and some of them got reeeeally angry
i didn't really have the social or emotional resources at the time to sustain any kind of fight so i gave up and stopped talking about it but i do still remember all the people who were like "oh my god THANK YOU" and while it's outdated in many ways i stand by a lot of that doc
the other reason i've only talked about this indirectly is that some of the dirt i have is confidential. but like. people talk to me about their feelings and some of those people talk to me about their feelings about other people in the ecosystem and so. now i know things
none of this, by the way, has any particular relevance to the importance of dealing with AGI as a problem. i've been actually spooked about this since alphago vs. lee sedol (before that i was kinda LARPing it) and i'm still spooked, just not devoting a lot of attention to it specifically
I do wish you’d air this in some anonymized form that you’d feel comfortable with—the quality waterline for criticism of the rat community is so garbage that I think your take would probably do more good than harm
that is probably true. this is the closest i've come to talking about it directly in years so baby steps
Man, people really don't understand enough about minds for their worries about AGI to be reasonable/rational.
A lot of suggested solutions and approaches to the "Problem" will likely end up causing the scenario they are trying to avoid...
But in regards to the rest of what you're saying, it's not too surprising. I went to an EA conference a few years ago and what i saw was pretty disappointing...


