It’s really hard to explain what’s going on with Effective Altruism without getting into (what many would consider) gossip. A lot of explanations for what can seem like “big decisions” often come down to issues like:
- “The person in charge of funding that particular area just had other things going on and couldn’t pay much attention.”
- “This person doing this research really wanted to be in location X and wanted the job security of an academic position, so they changed their research direction accordingly.”
- “This org hasn’t gotten funding because key person Y has been particularly obnoxious to several key people, and in doing so has caused a minor internal scandal or two.”
- “Person Z has a really weird pet idea that no one else can understand, but also, no one is brave enough to publicly call them out on it.”
- “X Domain is in pretty bad shape because the orgs responsible for it were, for a long time, just run poorly.”
- “X Domain isn’t happening because no one felt like dedicating their career to it.”
I’m not calling out EA as particularly bad; I think this is what one should expect from any sizable group made up of actual humans.
I run into a bunch of people who assume that the EA core is some kind of agentic super-brain that makes every move intentionally. So if something weird is going on, it must be for some eccentric reason of perfect wisdom.