Uber grounded its self-driving cars after an accident in which the algorithms failed to account for a real-world condition. But...
-
-
Convolutional NN stuff is great for object detection and classification, but ultimately it's a controls problem to solve. And that's hard.
-
It's interesting that it's like human drivers in that regard: not really 100% testable or reliable. Needs a mentality change?
-
We have a better understanding of how human learning works.
-
If we say "by the way in this circumstance you should do this" we know roughly how people's driving will change.
-
I don't disagree, but in a lot of cases the behavior change won't happen at all
-
This is why, I think, I haven't seen anybody argue that the complete functionality needs to be verified.
-
Most arguments I've seen are that fewer accidents is "enough".
-
New conversation
-
"...flight is a simpler case than driving"-
@EmilyGorcenski Tesla still refuses to publicly acknowledge it. Just made some cosmetic changes. pic.twitter.com/LaW7bvH9wu
