Has anyone named this effect? I have seen it implicitly discussed but not explicitly. Human capability degrades faster than automation capability improves, creating a transient death-trap period where risks are very high. I call it the automation death trap effect.
People assume human abilities stay constant while automation is improving. They don't. The general degradation starts the moment you begin trusting the automation for *anything*, because unused or rarely used skills decay as their reinforcement schedule slows.
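As a toy illustration of the dynamic (my assumptions, not anything specified in the thread): model automation reliability as a logistic ramp, let human practice fall off as automation is trusted, let skill lag behind practice, and track overall safety where humans must backstop whatever the automation can't handle. The specific functional forms and constants here are arbitrary; the point is just that combined safety dips mid-transition.

```python
import math

def automation(t, k=0.15, t0=30):
    """Automation reliability: logistic ramp from ~0 toward 1 (assumed shape)."""
    return 1 / (1 + math.exp(-k * (t - t0)))

def simulate(T=100, decay=0.08):
    """Track the worst point of combined (automation + human backstop) safety."""
    s = 1.0                 # human skill, starts fully practiced
    worst, worst_t = 1.0, 0
    for t in range(T):
        a = automation(t)
        practice = 1 - a        # assumption: practice only happens when automation isn't trusted
        s += decay * (practice - s)  # skill relaxes toward its sustained practice level
        safety = a + (1 - a) * s     # humans must cover the fraction automation misses
        if safety < worst:
            worst, worst_t = safety, t
    return worst, worst_t

worst, worst_t = simulate()
print(f"minimum combined safety {worst:.2f} at t={worst_t}")
```

In this sketch the minimum lands partway through the transition: automation is not yet good enough to stand alone, but human skill has already decayed below what backstopping requires. That trough is the death-trap window.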
It's been a long-standing debate in control theory and AI, but one resting on a false assumption: that such a thing as unmaintained emergency-override skills can exist, i.e. that humans can deploy behavior X *only* in the emergencies where the automation fails. There is no such thing.
It is somewhere between malpractice and technical stupidity to base a design on this assumption. When the rate at which a skill is actually used falls below the rate of practice needed to keep it usable, the skill must be deliberately maintained at a minimum practice level.
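The maintenance requirement is simple arithmetic. A minimal sketch, assuming a skill needs some minimum number of practice events per year to stay usable and real emergencies supply only a fraction of them (both numbers are hypothetical placeholders):

```python
def drills_needed(required_per_year: float, real_uses_per_year: float) -> float:
    """Practice events per year the design must schedule artificially
    to cover the gap between required practice and real-world use."""
    return max(0.0, required_per_year - real_uses_per_year)

# e.g. a skill needing roughly monthly practice, but emergencies arising
# only once every two years on average:
print(drills_needed(12, 0.5))  # -> 11.5
```

The uncomfortable implication: the better the automation gets, the smaller `real_uses_per_year` becomes, so nearly all the maintenance burden must come from scheduled drills, which most designs never budget for.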
Driverless cars are the most familiar example. But any high-risk activity that requires human backstopping of automation qualifies for this death-trap effect. I think a lot of infrastructure is either in, or getting close to, the death-trap zone.
This is a much more serious and real risk than AGI/singularity BS: too much infrastructure entering the death-trap state at the same time, thanks to a correlated wave of automation, could produce systemic collapse via contagion and second-order effects as failures cascade.