I do agree that this has nothing to do with developing and improving self-driving cars' AI, but when it eventually becomes a viable tech, they'll need to make such judgements and take such precautions on the roads they operate on. What happens then if we're not ready?
-
I believe in saving all lives where possible via IoT, with solar-powered sensors on the roads communicating with cars and thereby influencing their decisions and averting unnecessary casualties, whether that's done heuristically or by retraining the models on that input.
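A minimal sketch of what that sensor-to-car link could look like, assuming a hypothetical message format and planner (HazardMessage and plan_speed are made up for illustration, not any real V2X standard or vendor API):

# Hypothetical sketch of the tweet's idea: a solar-powered roadside sensor
# broadcasts hazard messages, and the car's planner treats them as one more
# input when choosing a speed. All names here are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class HazardMessage:
    road_segment: str      # which stretch of road the sensor watches
    hazard: str            # e.g. "pedestrian", "ice", "stalled_vehicle"
    confidence: float      # 0.0 - 1.0, the sensor's own estimate

def plan_speed(current_limit_mph: float, messages: list[HazardMessage]) -> float:
    """Heuristic version: slow down in proportion to the most confident hazard."""
    if not messages:
        return current_limit_mph
    worst = max(m.confidence for m in messages)
    # Never exceed the limit; scale down toward a crawl as confidence rises.
    return max(5.0, current_limit_mph * (1.0 - 0.8 * worst))

# Example: a sensor reports a likely pedestrian ahead on a 25 mph street.
print(plan_speed(25.0, [HazardMessage("elm_st_block_3", "pedestrian", 0.9)]))  # 7.0 mph

The retraining variant the tweet mentions would instead feed the same messages in as extra features during model training rather than applying a hand-written rule like this one.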
End of conversation
New conversation -
Isn't it odd that the trolley problem isn't covered in the driving test?
-
There are real-life situations where a car might need to “decide” between two bad options, e.g. hitting a pedestrian or swerving and possibly killing its passenger.
-
A similar problem happens in "I, Robot" (the movie, at least). The value alignment problem is real, and it also holds for AI running on wetware (AKA companies). The trolley problem is a false dilemma, a bad oversimplification that distracts from the real problems. https://twitter.com/trylks/status/1064249137653260290?s=19
-
This Tweet is unavailable.
-
Agreed. The self-driving car should be driving according to the speed limit, and whether it's a grandma or a child in the road is irrelevant, because on a residential street with crosswalks the car shouldn't exceed 25 mph and should have sufficient brakes to stop.
#BuiltInSafety
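A back-of-the-envelope stopping-distance check (my own assumed numbers, not from the thread) of why 25 mph plus working brakes leaves a lot of margin, assuming roughly 0.7 g of deceleration on dry pavement and a 0.5 s system reaction time:

# Stopping distance = reaction distance + braking distance (v^2 / 2a).
# The 0.7 g and 0.5 s figures below are assumptions for illustration only.
MPH_TO_MS = 0.44704
G = 9.81

def stopping_distance_m(speed_mph: float, decel_g: float = 0.7, reaction_s: float = 0.5) -> float:
    v = speed_mph * MPH_TO_MS                # speed in m/s
    reaction = v * reaction_s                # distance covered before braking starts
    braking = v ** 2 / (2 * decel_g * G)     # kinematics: v^2 / (2a)
    return reaction + braking

print(f"{stopping_distance_m(25):.1f} m")    # roughly 15 m (~48 ft) at 25 mph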