Conversation

I think the core difference with LW-type rationality is: “The world is not insane. If you think the world is insane, then your model of the world is wrong and you won’t be effective.” But of course, I don’t allow myself to write about AI safety, so my job is a lot easier.
The alternative frame I've picked is: "The world is a complex adaptive system. Like all CASs, it has simple rules at the bottom. You can figure out those rules by observation, and verify them through action. If you do this, you will win."
I've tried to keep to the standard of intellectual rigour of the best rationality blogs. I HAVE been hugely influenced by LessWrong and its writers. But I think their approach is fundamentally wrong: instrumental rationality doesn't demand epistemic correctness.
(Also related: communities that are obsessed with epistemic correctness eventually devolve into places where people sit around debating the finer points of some idea, never doing anything, and therefore never accomplishing anything.)
Great thread here:
Quote Tweet
Q: "Where are All the Successful Rationalists?" Personal success is determined by conscientiousness more than anything else. (Assuming you have decent IQ, like 115 or so). If you are determinied to keep trying, to stick at stuff etc, you usually win. applieddivinitystudies.com/2020/09/05/rat
Beautiful. I remember reading this in the past; this was a good reminder. (Am thinking of going deep into the literature on luck vs. skill sometime in the next few weeks.)