It's not that it isn't scary. It's that it doesn't even begin to make sense.
Of course, you can make a case for regulating the *deployment* of AI -- based on fears of algorithmic bias, security, etc. Just like any other tech
New conversation
There was actually a race between public and private enterprises to sequence the human genome, with fears of gene patents still arising...
Do you think that genome sequencing should not be regulated?
A priori thoughts with an a posteriori example. The image is nice, but don't forget to consider the unknown unknowns (military projects)
Bayesian AI gets far less attention nowadays than deep learning and big-data-based AI, but it exists and can be combined with them effectively
End of conversation
New conversation
I'd like to agree with you, but this is a bad example for several reasons...
1. You compare something of the past with a fear about the future
New conversation
I'd be a lot more concerned about CRISPR than Tensorflow
I think the main difference is that people can distinguish genome sequencing from genome editing.
But anyone doing statistics and/or optimisation could be accused of developing SkyNet AI... lol
End of conversation