
one of the most unpleasant things i learned about myself from my years among the rationalists was that i basically had two completely separate sets of "beliefs," one of which i could actually act on and one of which was almost purely for signaling games
this was common enough that we had terminology for it - "aliefs" were the ones you "really believed" and "beliefs" were uhhh the others. someone once said "there are beliefs... and then there are the things people say" which i also like as an alternative
when i started getting into talking about rationality and AI risk and all that fun stuff i was, tbh, not really treating it as if it could have actual real-life consequences. for me it was almost purely an intellectual game which i enjoyed getting attention for being good at
the set of "beliefs" i was using for rat signaling games felt like a natural extension of "schoolbrain." like i had been trained very well in the art of bullshitting plausible-sounding opinions about things i knew less than nothing about, for years
that was annoying enough when i was just using it to impress people, but when i started turning schoolbrain on *myself*? that's when "akrasia" hit - my thoughts were full of "convincing" arguments about why i should do X or shouldn't do Y, meanwhile my body was like "nah tho"
gradually i came to suspect that something was very wrong about how i was orienting towards "beliefs." i got into all the feelings stuff (which rats introduced me to!) in part to explore my "actual" "beliefs," which it turns out were and are often insane
i don't mean this in a derogatory way, i mean they were held by young child parts disconnected from my adult reality. stuff like "if i make a woman angry at me i'll literally die," that sort of thing. "obviously false" but nevertheless running my life from the shadows
once i had learned enough about feelings stuff to get a sense of where my "actual" "beliefs" were - the ones connected to my body, to motivation, to desire, to action - whatever the rats were doing instead began to seem... like not what *i* needed, anyway
(worth mentioning here that, to their credit, many fine upstanding rationalists seemed and seem quite capable of coming to novel conclusions on the basis of systematic rational thought and then acting on them. the pandemic has really let some of them shine)
i went through a long period of insisting at every rationality workshop that we focus on feelings stuff almost exclusively. i probably annoyed a bunch of people but in retrospect i pretty much nailed what *i* needed, at least; hopefully it helped a few other people too
but hey, things are looking up. recently i realized i'd been lying to myself and an ex about something really important about how we got together and why we broke up. it felt like a huge relief to admit it to myself (and her!). it only took 5 years 😅