This is one of those things that would be huge news if circumstances hadn't obsoleted the underlying technology. http://www.dailymail.co.uk/sciencetech/article-3662656/The-AI-Gun-beat-military-s-best-Pilots-hail-aggresive-dynamic-software-losing-repeatedly.html
Nope, just the things with a greater-than-N% chance of killing them. One could quibble about N; spaceflight's is roughly 2.5% per trip.
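For a sense of how fast a per-trip risk like that compounds, here is a minimal sketch assuming independent trips at the ~2.5% figure cited above; the trip counts are illustrative, not from the conversation:

```python
# Hedged sketch: compound an assumed ~2.5% per-trip fatality risk
# over repeated, independent trips. Trip counts are illustrative.
per_trip_risk = 0.025

for trips in (1, 10, 40):
    cumulative = 1 - (1 - per_trip_risk) ** trips
    print(f"{trips} trips: ~{cumulative:.1%} cumulative chance of a fatal trip")
```

At that rate the cumulative risk passes ~22% after 10 trips and ~64% after 40, which is part of why the value of N matters so much.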
-
-
Are we limiting ourselves by requiring a limited risk to life? Or can machine learning get us to a small enough N% chance of death?
-
And are there industries where simulation cannot mitigate risk enough?