Has anyone named this effect? I have seen it discussed implicitly but not explicitly: human capability degrades faster than automation capability improves, creating a transient death-trap period during which risk is very high. I call it the automation death-trap effect.
People assume human abilities stay constant while automation is improving. They don't. The degradation starts the moment you begin trusting the automation for *anything*, because unused or rarely used skills decay as their reinforcement schedule slows.
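The transient window can be sketched with a toy model. Everything here is an illustrative assumption of mine, not from the thread: human skill decays exponentially once reliance begins, automation reliability climbs toward a ceiling, and "system safety" is taken as whichever of the two can still handle a failure.

```python
# Toy model of the death-trap window. All rates and the safety
# definition are illustrative assumptions, not established figures.

def human_skill(t, decay=0.15):
    """Skill retained after t periods of relying on automation."""
    return (1 - decay) ** t

def automation_reliability(t, rate=0.05, ceiling=0.99):
    """Reliability approaching a ceiling as automation matures."""
    return ceiling * (1 - (1 - rate) ** t)

def system_safety(t):
    """Assume a failure is handled if either party can handle it."""
    return max(human_skill(t), automation_reliability(t))

safety = [system_safety(t) for t in range(60)]
worst = min(safety)
worst_t = safety.index(worst)
print(f"safety dips to {worst:.2f} around t={worst_t}")
# → safety dips to 0.32 around t=7
```

The dip in the middle, where the human has already decayed but the automation is not yet good enough, is the death-trap period the thread describes.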
It’s been a long-standing debate in control theory and AI, but one based on a false assumption: that such a thing as unmaintained emergency-override skill can exist, i.e., that humans can deploy behavior X *only* in the emergencies where the automation fails.
There is no such thing.
It is somewhere between malpractice and technical stupidity to base a design on this assumption. When a skill's rate of real-world use falls below the rate of practice needed to keep it usable, it must be deliberately maintained at that minimum practice level.
The military trains constantly on loss-of-technology scenarios for their own infrastructure. Are you sure that civilians don’t?