The choice of which person a self-driving car has to kill will basically never occur in practice. But if we lived in a trolley-problem universe, would you want to configure your car's killing preferences yourself, or should the government do it for you?
Asimov's laws are a ham-fisted attempt at literally enslaving entities orders of magnitude smarter than us, and they won't work.
-
-
Since your question already works on the basis of enslaving them (and is far more "ham-fisted" than Asimov's version), I don't see the problem with that o.O