The amount of shit an NN can memorize in its parameters, and how dumb this makes it given unexpected data, dwarfs the idiocy of legacy AI/ML
Replying to @YossiKreinin
The field's term of art for memorizing shit is "overtraining". It doesn't always happen, but it's definitely a serious hazard.
Replying to @NYarvin
I thought the term was "overfitting", which in its trivial forms is avoided by having separate train & test sets; but when both sets are "biased", tough luck
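A minimal sketch of what a separate test set catches, on made-up noise data (sklearn's MLPClassifier stands in for "an NN"; sizes and seeds are arbitrary choices). Since the labels here are pure noise, any gap between train and test accuracy is pure memorization:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))    # 200 samples, 20 meaningless features
    y = rng.integers(0, 2, size=200)  # labels are pure noise: nothing real to learn

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    # A net with plenty of capacity to memorize 100 training points outright.
    clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)

    print("train accuracy:", clf.score(X_tr, y_tr))  # close to 1.0: memorized
    print("test accuracy: ", clf.score(X_te, y_te))  # near 0.5 (chance): overfit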
Replying to @YossiKreinin
Garbage in, garbage out holds for any sort of learning. What makes neural networks shallow memorizers is that it's hard to train deep ones.
10:20 PM - 6 May 2017
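A minimal sketch of the "shallow memorizer is dumb on unexpected data" point, again fully synthetic: a hypothetical shortcut feature perfectly matches the label during training, so the network leans on it instead of the weak real signal; when the shortcut flips at test time, accuracy collapses below chance:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)

    def make_data(n, shortcut_agrees):
        # Hypothetical setup: feature 0 is a weak genuine signal, feature 1 a
        # shortcut that matches the label in training but flips afterwards.
        y = rng.integers(0, 2, size=n)
        real = y + rng.normal(scale=2.0, size=n)    # noisy genuine signal
        shortcut = y if shortcut_agrees else 1 - y  # perfect, then inverted
        return np.column_stack([real, shortcut]).astype(float), y

    X_tr, y_tr = make_data(1000, shortcut_agrees=True)
    X_te, y_te = make_data(1000, shortcut_agrees=False)

    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
    clf.fit(X_tr, y_tr)

    print("train accuracy:", clf.score(X_tr, y_tr))         # near 1.0
    print("shifted test accuracy:", clf.score(X_te, y_te))  # well below chance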