one of the most unpleasant things i learned about myself from my years among the rationalists was that i basically had two completely separate sets of "beliefs," one of which i could actually act on and one of which was almost purely for signaling games
this was common enough that we had terminology for it - "aliefs" were the ones you "really believed" and "beliefs" were, uhhh, the others. someone once said "there are beliefs... and then there are the things people say," which i also like as an alternative
when i started getting into talking about rationality and AI risk and all that fun stuff i was, tbh, not really treating it as if it could have actual real-life consequences. for me it was almost purely an intellectual game which i enjoyed getting attention for being good at
the set of "beliefs" i was using for rat signaling games felt like a natural extension of "schoolbrain." like i had been trained very well in the art of bullshitting plausible-sounding opinions about things i knew less than nothing about, for years
that was annoying enough when i was just using it to impress people, but when i started turning schoolbrain on *myself*? that's when "akrasia" hit - my thoughts were full of "convincing" arguments about why i should do X or shouldn't do Y, while meanwhile my body was like "nah tho"
gradually i came to suspect that something was very wrong about how i was orienting towards "beliefs." i got into all the feelings stuff (which rats introduced me to!) in part to explore my "actual" "beliefs," which it turns out were and are often insane