The Uber self-driving car death is not a big deal because it involved a self-driving car; those are actually safer. It's a big deal because it's not actually clear whether the companies can be held liable for the death.
-
-
Yes, I've read "dozens" of articles on this, and even the two you mentioned do not conflict with what I said: if there are control surfaces available for a human to use, then it is the human's responsibility as the "driver". Very simple, very common sense.
-
It is a completely different thing when the car's autonomous system fails in a way the "driver" (operator?) can't predict; only then does the software developer's or part manufacturer's responsibility come into question.
-
I disagree. Imagine it were a bus. The driver kills someone, but you were on the bus and "did nothing". The bus driver is to blame, clearly. If the bus driver is a computer, or it's a driverless tram, are you, the passenger, suddenly to blame? The wheel was right there, after all.