I’m getting seriously mad at optimizer theology. That’s what got us into this mess. Specific ideas like lean/fat don’t kill. Mathematical techniques like optimization don’t kill. What kills is idiots fetishizing what they know over what they don’t. Optimizers incapable of doubt.
Doubt is not uncertainty or risk.
Doubt is not a probability estimate < 1.
Doubt is not ambiguity.
Doubt is the capacity for living with a consciousness of true ignorance without anxiously covering it up.
Optimizer theology, as opposed to the math of it, is about removing doubt.
This rhymes a lot with AI-risk concerns, esp. corrigibility. The "without anxiously covering it up" part seems to be the hard one in that case; transparency can only help so much, and the fundamental problem is the desirability of a cover-up.
Not a position I’m interested in elaborating. Would take too long. I think the rationalist/AI-risk community is deeply not-even-wrong about almost everything, but it’s not a view I debate or defend. takes it on and I’m loosely aligned with him.
OTOH, it’s like thinking about how to defend against hypothetical hostile aliens with FTL drives. Unless you have *some* idea about how an FTL drive might work, you really can’t get started. And we don’t have any plausible stories about how AGI might work either.