Re this: applieddivinitystudies.com/2020/09/05/rat
One of my secret blog goals with Commonplace is to answer the question “what would a rationality blog that’s ACTUALLY focused on instrumental rationality look like?”
I think the core difference with LW-type rationality is: “The world is not insane. If you think the world is insane, then your model of the world is wrong and you won’t be effective.”
But of course, I don’t allow myself to write about AI-safety, so my job is a lot easier.
The alternative frame I've picked is: "The world is a complex adaptive system. Like all CASs, there are simple rules at the bottom. You can figure out those rules by observation, and verify them through action. If you do this, you will win."
I've tried to keep to the standard of intellectual rigour of the best rationality blogs. I HAVE been hugely influenced by LessWrong and its writers.
But I think their approach is fundamentally wrong: instrumental rationality doesn't demand epistemic correctness.
(Also related: communities obsessed with epistemic correctness eventually devolve into places where people sit around debating the finer points of some idea, never doing anything, and therefore never accomplishing anything.)
Counterarguments to this:
- all of science
- the Effective Altruism movement (?)
I re-read this article once in a while, especially when I feel I lack direction or am stuck in analysis paralysis.
mindingourway.com/dive-in/
"What do the odds have to do with your ability to commit? Why is their epistemic state preventing them from entering the emotional state that would most help them succeed?"