@YossiKreinin: The amount of shit an NN can memorize in its parameters, and how dumb this makes it given unexpected data, dwarfs the idiocy of legacy AI/ML.
@NYarvin: The field's term of art for memorizing shit is "overtraining". It doesn't always happen, but it's definitely a serious hazard.
@YossiKreinin: I thought it was "overfitting", which in its trivial forms is avoided by having separate train & test sets; but when both are "biased", tough.
@NYarvin: Hmm, yes, "overfitting" seems more common. It's been a while.
12:44 AM - 4 May 2017
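The "trivial forms are avoided by having separate train & test sets" point is easy to see in action. Below is a minimal sketch, assuming scikit-learn and a synthetic dataset (both my choices, nothing from the thread): an unconstrained decision tree memorizes a small noisy training set nearly perfectly, and the held-out test score exposes the gap; capping the tree's depth narrows it.

```python
# Illustrative sketch only: the dataset and model choices are assumptions,
# not anything described in the thread above.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Small, noisy dataset: easy to memorize, hard to generalize from.
X, y = make_classification(
    n_samples=200, n_features=20, n_informative=5,
    flip_y=0.2,  # label noise that a big enough model will happily memorize
    random_state=0,
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Unrestricted tree: enough capacity to fit every training point, noise included.
big = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("big tree   train:", big.score(X_train, y_train))  # ~1.0 (memorized)
print("big tree   test: ", big.score(X_test, y_test))    # noticeably lower

# Depth-limited tree: trades some train accuracy for better generalization.
small = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("small tree train:", small.score(X_train, y_train))
print("small tree test: ", small.score(X_test, y_test))
```

The caveat from the thread still applies: the split only catches this when the test set isn't biased the same way as the training set. If both are drawn from the same skewed source, train and test scores agree and the memorization goes unnoticed.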