The choice of which person a self-driving car has to kill will basically never occur in practice. But if we lived in a trolley-problem universe, would you want to configure your car's killing preferences yourself, or should the government do it for you?
Is it morally acceptable to let others make moral choices for you when you predict that you may not agree?
-
I don't personally believe there's a good choice, so a decision based on a universal rule would be less painful. And if it somehow turns out to be wrong, at least I wasn't alone in making it.
-
Not being alone in a bad decision gives me no consolation, but I value the ability of institutions to provide a buffer against my bounded rationality.