The amount of shit an NN can memorize in its parameters, and how dumb this makes it given unexpected data, dwarfs the idiocy of legacy AI/ML
The field's term of art for memorizing shit is "overtraining". It doesn't always happen, but it's definitely a serious hazard.
-
-
I thought it was "overfitting", which in its trivial forms is avoided by keeping separate train & test sets; but when both are "biased", tough luck (see the sketch after this exchange)
-
Hmm, yes, "overfitting" seems more common. It's been a while.
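
A minimal sketch of the "separate train & test sets" point from the thread. The library (scikit-learn), dataset, and model below are illustrative assumptions, not anything the thread specifies; the point is only that a model which memorizes its training split scores far better there than on held-out data.

# Sketch (assumed: scikit-learn). An over-parameterized model that memorizes
# its training data shows a large gap between training and held-out accuracy,
# which is the usual signal of overfitting.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Small noisy dataset so an unconstrained model can memorize it easily.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=0)

# The "separate train & test sets" step: hold out 30% of the data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Unconstrained decision tree: fits (memorizes) the training split almost perfectly.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

train_acc = model.score(X_train, y_train)   # typically close to 1.0
test_acc = model.score(X_test, y_test)      # noticeably lower
print(f"train accuracy: {train_acc:.2f}, test accuracy: {test_acc:.2f}")

# A large train-test gap flags overfitting -- but only if the test set is
# representative; if both splits share the same bias, the gap won't reveal it,
# which is the "when both are biased, tough luck" caveat above.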