Has anyone named this effect? I have seen it discussed implicitly but never named explicitly: human capability degrading faster than automation capability improves, creating a transient death-trap period where risks are very high. I call it the automation death-trap effect.
People assume human abilities stay constant while automation is improving. They don’t. The degradation starts the moment you begin trusting the automation for *anything*, because unused or rarely used skills decay as their reinforcement schedule slows.
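To make the shape of the effect concrete, here is a minimal sketch in Python. Every number in it (failure rates, half-lives, the exponential-decay form itself) is an illustrative assumption, not a measurement: automation reliability improves on its own curve while the human backstop's recovery skill decays once it is rarely exercised, and the expected rate of unmitigated failures peaks in a transient window rather than at adoption.

```python
import numpy as np

# Illustrative sketch of the death-trap window. Every number below is an
# assumption chosen to show the shape of the effect, not a measured value.
T = 120                                   # months since automation adoption
t = np.arange(T)

# Automation failure rate improves slowly (assumed to halve every ~4 years).
p_auto_fail = 0.02 * 0.5 ** (t / 48)

# Human recovery skill decays quickly once it is rarely exercised
# (assumed to halve every ~12 months without practice).
p_human_recovers = 0.95 * 0.5 ** (t / 12)

# Expected rate of unmitigated incidents: the automation fails AND the
# degraded human backstop fails to recover the situation.
unmitigated = p_auto_fail * (1 - p_human_recovers)

peak = int(np.argmax(unmitigated))
print(f"Rate at adoption (month 0):    {unmitigated[0]:.4f}")
print(f"Peak of the death-trap window: month {peak}, rate {unmitigated[peak]:.4f}")
```

With these assumed curves the risk peaks a couple of years in, after the skill has mostly decayed but before the automation is reliable enough to no longer need a backstop.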
It’s been a long-standing debate in control theory and AI, but one based on a false assumption: that such a thing as unmaintained emergency-override skills can exist, i.e. that humans can deploy behavior X *only* in the emergencies where the automation fails.
There is no such thing.
Replying to
Driverless cars are the most familiar example. But any high-risk activity that requires humans to backstop automation is subject to this death-trap effect. I think a lot of infrastructure is either in, or getting close to, the death-trap zone.
This is a far more serious and real risk than AGI/singularity BS. If too much infrastructure sits in the death-trap state at the same time, because of a correlated automation wave, systemic collapse can follow via contagion and second-order effects as failures cascade.
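A rough way to see why a correlated automation wave is worse than staggered adoption: a toy Monte Carlo comparing how many sectors sit in their death-trap window simultaneously. The sector count, window length, and horizon are all hypothetical assumptions chosen only to illustrate the overlap effect.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SECTORS = 20        # hypothetical number of infrastructure sectors
WINDOW = 36           # assumed months each sector spends in its death-trap window
HORIZON = 240         # months simulated
TRIALS = 10_000

def max_simultaneous(adoption_months: np.ndarray) -> int:
    """Largest number of sectors whose vulnerability windows overlap at any month."""
    timeline = np.zeros(HORIZON + WINDOW, dtype=int)
    for start in adoption_months:
        timeline[start:start + WINDOW] += 1
    return int(timeline.max())

# Staggered adoption: each sector automates at an independent random time.
staggered = np.mean([
    max_simultaneous(rng.integers(0, HORIZON, N_SECTORS)) for _ in range(TRIALS)
])

# Correlated wave: all sectors adopt within a few years of a common trigger.
correlated = np.mean([
    max_simultaneous(rng.integers(0, HORIZON - 48, 1) + rng.integers(0, 48, N_SECTORS))
    for _ in range(TRIALS)
])

print(f"Avg peak overlap, staggered adoption: {staggered:.1f} of {N_SECTORS} sectors")
print(f"Avg peak overlap, correlated wave:    {correlated:.1f} of {N_SECTORS} sectors")
```

Under these assumed numbers, the correlated wave leaves far more sectors in the vulnerable window at the same time, which is the contagion surface the thread points at.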
Replying to
The military trains constantly on loss-of-technology scenarios for their own infrastructure. Are you sure that civilians don’t?
Replying to
I'm fairly sure they don't. Individuals lack the capacity and discipline, and civilian bureaucracies and corporations all seem to be headed for the death-trap state.

