This. The EA orientation towards the felt experience of suffering and, by extension, compassion is so fucked. The president of Stanford EA told me he'd noticed other EAs feeling cold + calculating where he felt warm + compassionate, and wondered if he should be colder.
https://twitter.com/Malcolm_Ocean/status/1116967983539269632
I told him it was deeply important to continue feeling compassion and that IMO the other EAs were doing it wrong. Could be phrased as the "human alignment problem": without access to compassion how do you know what you're doing has anything to do with human values?
1:02 AM - 13 Apr 2019