i haven’t used the c-word yet, but ultimately it’s a mild variation on normal cult dynamics. it’s really easy in the bay area rat scene to make your entire social world rats, and from there it’s easy to feel, unconsciously, like social acceptance depends on continuing to buy in
one experience that sticks out in particular was going to a really lovely authentic relating event in austin, feeling happier than i had in months, just dancing and singing for joy, and then noticing i was dreading going back to the bay
you’re describing this as a social/emotional phenomenon. to be clear: did you at some point believe that AI safety was a serious concern and no longer believe that? or did your beliefs not shift as part of this process?
yeah, I went thru a sorta similar arc, & I still think it's a huge concern. I'm glad MIRI is doing their thing, I'm glad other researchers are. I'm investing my time elsewhere, in meta stuff that will help eventually, but I'm definitely making a bet nothing explodes in the meantime
you also don’t need to help, or to justify whatever you’re doing as a roundabout long-term way to help (if you don’t want to). Took me a long time to get this
The Power of Now guy is like, “i work on this because we have nuclear arsenals that need to be handled well, and that requires spiritual enlightenment.” I’m like, yo, if you want to work on nuke safety, do that, or do whatever else you want. No need to justify
Quote Tweet
You don’t have to be effective and you don’t have to be an altruist. You don’t need a logical or ethical framework. You don’t need to maximize impact or minimize regret. You can do whatever you want, forever, for no reason
I'm still trying to recover from the damage being EA-pilled did to me. Lots of unprocessed anger there; not sure where to start