The amount of shit an NN can memorize in its parameters, and how dumb that makes it on unexpected data, dwarfs the idiocy of legacy AI/ML
I thought it was called "overfitting", which in its trivial forms is avoided by having separate train & test sets; but when both are "biased", tough. (Sketch of the train/test idea below.)
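A minimal sketch of the trivial case, assuming scikit-learn; the dataset and model are illustrative, not from the thread. A model with enough capacity memorizes random labels perfectly on the training split, and the held-out split exposes it:

```python
# Held-out evaluation catching pure memorization.
# Labels are random, so any train accuracy above chance is memorization.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))        # 20 pure-noise features
y = rng.integers(0, 2, size=500)      # labels independent of X

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# An unconstrained tree can memorize arbitrary labels.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

print("train accuracy:", model.score(X_train, y_train))  # 1.0: memorized
print("test accuracy: ", model.score(X_test, y_test))    # ~0.5: chance
```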
-
Hmm, yes, "overfitting" seems more common. It's been a while.
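And a sketch of the "both biased" caveat raised above, again assuming scikit-learn with a hypothetical leaky feature for illustration: when train and test share the same spurious shortcut, the held-out score looks great and hides the failure, which only shows up on data where the shortcut breaks:

```python
# When train and test share a bias, the test set can't catch it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_biased(n, shortcut_works=True):
    X = rng.normal(size=(n, 5))
    y = rng.integers(0, 2, size=n)
    # In the biased data, feature 0 leaks the label (the shortcut).
    X[:, 0] = y if shortcut_works else rng.integers(0, 2, size=n)
    return X, y

X_train, y_train = make_biased(400)
X_test, y_test = make_biased(400)                          # same bias as train
X_real, y_real = make_biased(400, shortcut_works=False)    # shortcut broken

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("biased test accuracy:", model.score(X_test, y_test))  # ~1.0, looks fine
print("unbiased accuracy:   ", model.score(X_real, y_real))  # ~0.5, chance
```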