
Has anyone named this effect? I have seen it discussed implicitly but not explicitly. Human capability degrades faster than automation capability improves, creating a transient death-trap period during which risks are very high. I call it the automation death trap effect.
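The effect above can be sketched as a toy threshold model (all parameters hypothetical, chosen only to illustrate the shape of the dynamic, not calibrated to any real system): human skill decays with disuse while automation reliability climbs toward a level where no human backstop is needed. The death trap is the window where skill has already fallen below emergency competence but automation is not yet reliable enough to fly unbackstopped.

```python
# Toy threshold model of the "automation death trap" window.
# All parameters are hypothetical illustrations, not real calibrations.

def death_trap_window(
    skill0=1.0,          # initial human skill (normalized)
    decay=0.15,          # fractional skill loss per year once practice stops
    auto0=0.70,          # initial automation reliability
    improve=0.10,        # yearly fractional closing of the reliability gap
    skill_floor=0.5,     # minimum skill needed to handle an emergency
    auto_ceiling=0.99,   # reliability at which no human backstop is needed
    years=60,
):
    """Return (start, end) years of the death-trap window, or None."""
    trap = []
    for t in range(years):
        skill = skill0 * (1 - decay) ** t            # exponential skill decay
        auto = 1 - (1 - auto0) * (1 - improve) ** t  # reliability rising toward 1
        if skill < skill_floor and auto < auto_ceiling:
            trap.append(t)
    return (trap[0], trap[-1]) if trap else None

print(death_trap_window())  # (5, 32) with these toy parameters
```

With these made-up numbers the trap opens at year 5 (skill drops below the floor) and closes only at year 33 (automation finally clears the ceiling): decades of elevated risk from two otherwise benign-looking trends.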
Replying to
It’s been a long-standing debate in control theory and AI, but one based on a false assumption: that such a thing as an unmaintained emergency-override skill can exist, i.e. that humans can deploy behavior X *only* in the emergencies where the automation fails. There is no such thing.
It is somewhere between malpractice and technical stupidity to base a design on this assumption. When a skill's rate of use falls below the rate of practice needed to keep it usable, the skill must be deliberately maintained at a minimum practice level.
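A back-of-envelope way to see why emergency-only use cannot sustain a skill, using a hypothetical first-order retention model (my framing, not from the thread): skill decays toward zero with disuse and is replenished by practice, ds/dt = p − d·s, giving steady state s* = p/d. If the only "practice" is real emergencies, the practice rate equals the automation failure rate.

```python
# Hypothetical first-order skill-retention model: ds/dt = p - d*s,
# so the steady-state skill level is s* = p / d. Rates are illustrative.

def steady_state_skill(practice_rate, decay_rate):
    return practice_rate / decay_rate

# Kept sharp by deliberate drills vs. exercised only when automation fails:
maintained = steady_state_skill(practice_rate=1 / 10, decay_rate=1.0)
emergency_only = steady_state_skill(practice_rate=1 / 1000, decay_rate=1.0)
print(emergency_only / maintained)  # skill settles at ~1% of the maintained level
```

The ratio is just the ratio of practice rates: if automation fails 100x less often than the drill cadence the skill needs, the "backup" human arrives at the emergency with roughly 1% of the required competence.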
Driverless cars are the most familiar example. But any high-risk behavior that requires human backstopping of automation qualifies for this death-trap effect. I think a lot of infrastructure is either in, or getting close to, the death trap zone.
This is a much more serious and real risk than AGI/singularity BS: too much infrastructure entering the death-trap state at the same time, due to a correlated automation wave, causing systemic collapse via contagion and second-order effects as failures cascade.
This Tweet is from a suspended account.
Replying to
Hmm I don’t think so. There is little to no element of real-time risk in that case. Just accumulating inefficiency. Worst case, you forget something and have to look it up. Your car doesn’t crash in the meantime.
Replying to
How much of this do you think is manufactured via systems like DRM that prevent humans from compensating for poorly functioning automation?