The crash happened because the human failed to intervene, and because the computer, which clearly detected the need for braking seconds before, wasn't allowed to intervene (by humans). Let that sink in for a second. https://twitter.com/wsj/status/999653002863693824
Is there conclusive evidence that 1.3s would have been too late? I don't know either way. Unskilled and untrained drivers are not an excuse but a major issue. In the worst case the vehicle should drive as a human-only car; the technology should not make things worse.
-
-
The bottom line is that a system that declared an emergency maneuver necessary (before the human reaction time) was not allowed to do its job (probably because it is oversensitive at this stage?), and that the human failed to act in time.
-
I can't see how you can expect this to work. The human is not expected to be actively driving the car, but is supposed to notice that the system is not reacting and solve the issue. That's not possible. Those people are just scapegoats.
End of conversation
New conversation -
-
-
A car moving at 43 mph can't stop in 1.3 seconds, only slow down. But the pedestrian was detected 6 seconds before, and that was enough time. I think technology should make things better, and I am not sure that it can at the moment.
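A rough back-of-envelope check of those numbers. The deceleration figure here (a constant 7 m/s², roughly what hard braking on dry pavement achieves) is my assumption, not a number from the thread:

```python
# Sketch: how long and how far a car at 43 mph needs to stop,
# and how much speed it sheds if it only brakes for 1.3 s.
# Assumed: constant 7 m/s^2 deceleration (typical hard braking, dry road).

MPH_TO_MS = 0.44704   # miles per hour -> metres per second
DECEL = 7.0           # m/s^2, assumed emergency deceleration

v0 = 43 * MPH_TO_MS   # initial speed, ~19.2 m/s

time_to_stop = v0 / DECEL            # seconds to come to a full stop
dist_to_stop = v0**2 / (2 * DECEL)   # metres covered while stopping

# Residual speed if braking starts only 1.3 s before impact
v_after = max(v0 - DECEL * 1.3, 0.0)

print(f"time to full stop: {time_to_stop:.1f} s")
print(f"stopping distance: {dist_to_stop:.1f} m")
print(f"speed after 1.3 s of braking: {v_after / MPH_TO_MS:.0f} mph")
```

Under that assumption a full stop takes about 2.7 s, so 1.3 s of braking cannot stop the car, only slow it to roughly half speed, while a 6-second warning leaves ample margin to stop completely.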
-
Sometimes slowing down is just enough to save someone. I don't care about autonomous driving, I care about aids that help in emergency situations. Technology certainly helps in that regard (ABS, airbags) and rarely makes things worse.
-
When things go wrong and assisted driving technology is involved, we always forget what would have happened if only the human driver had been there. Typically the answer is exactly the same, or worse. Certain tech invites less attention, and that is an issue...
-
...but this was a driver specifically hired to test the systems, and therefore should have expected errors and unusual situations at any given time.
-
She was just out of jail and did not have any special qualification or skill. My impression is that these people are the perfect scapegoats for those corporations. And can the fact that the braking system was disabled be assessed independently?
End of conversation
New conversation -
-
-
Uber and Tesla published their own findings about crashes, based on their own logs. Can we trust this information?
-
-
-
There are massive liability issues around self driving cars and no way to independently investigate how those systems work. Who pays if the self-driving car you own kills someone? What happens if the manufacturer says that you disabled some safety systems or ignored warnings?
-
-
-
It's more about raising stock prices with big announcements than about really deploying a safe product. That is risky.
-
-
-
There is a race to be the first to provide self-driving cars, and I think competition is good, as long as you deploy things when they are ready, without taking shortcuts that can endanger people and, in the end, even kill the technology.
-