Re this: applieddivinitystudies.com/2020/09/05/rat
One of my secret blog goals with Commonplace is to answer the question “what would a rationality blog that’s ACTUALLY focused on instrumental rationality look like?”
I think the core difference with LW-type rationality is: “The world is not insane. If you think the world is insane, then your model of the world is wrong and you won’t be effective.”
But of course, I don’t allow myself to write about AI-safety, so my job is a lot easier.
The alternative frame I've picked is: "The world is a complex adaptive system. Like all CASs, there are simple rules at the bottom. You can figure out those rules by observation, and verify them through action. If you do this, you will win."
I've tried to keep to the standard of intellectual rigour of the best rationality blogs. I HAVE been hugely influenced by LessWrong and its writers.
But I think their approach is fundamentally wrong: instrumental rationality doesn't demand epistemic correctness.
(Also related: communities obsessed with epistemic correctness eventually devolve into places where people sit around debating the finer points of some idea, never doing anything, and therefore never accomplishing anything.)
Counterarguments to this:
- all of science
- the Effective Altruism movement (?)
