Having access to "infinite compute" is absolutely not a necessary factor for doing good research. In more ways than one, the opposite is truehttps://twitter.com/benhamner/status/966823205842243584 …
The trap of compute is the temptation to substitute research for optimization
I guess that as long as a 0.3% improvement on a classification task is considered worth publishing, there is no escape from that.
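As a back-of-the-envelope illustration of why such margins are questionable, here is a minimal sketch of a significance check for a 0.3% accuracy gain on an ImageNet-sized test set. The 76.0% and 76.3% accuracies are hypothetical, and the normal approximation assumes independent predictions; in practice both models score the same images, so a paired test such as McNemar's would be more appropriate.

```python
import math

n = 50_000                    # ImageNet validation set size
p_old, p_new = 0.760, 0.763   # hypothetical accuracies, 0.3% apart

# Standard error of the difference of two proportions,
# assuming the two evaluations are independent.
se = math.sqrt(p_old * (1 - p_old) / n + p_new * (1 - p_new) / n)
z = (p_new - p_old) / se
print(f"z = {z:.2f}")   # ~1.11: well under the usual 1.96 threshold
```

Under these assumptions, a 0.3% gain does not even clear a conventional significance bar on its own, which is the worry behind this complaint.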
Do you feel we have reached a point with ImageNet where we are essentially optimizing for this specific dataset (even the test set)?
I don't actually think that's a fair characterization of the goal of ML optimization/AutoML methods. Some level of model optimization is necessary to find a good model, even when the goal is to use that model to explain the process underlying the data.
Yes, you can go overboard with optimization and massively overfit, but that's why cross-validation, regularization, and related methods are so important as part of the ML optimization process.
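To make that concrete, here is a minimal sketch of the pattern being described: tuning a regularized model with cross-validation so that the optimization itself is guarded against overfitting, and touching the held-out test set only once at the end. It uses scikit-learn; the dataset, pipeline, and parameter grid are illustrative assumptions, not anything from the thread.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# L2-regularized logistic regression; C controls regularization strength.
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
grid = GridSearchCV(
    pipe,
    param_grid={"logisticregression__C": [0.01, 0.1, 1.0, 10.0]},
    cv=5,   # 5-fold cross-validation scores each candidate on held-out folds
)
grid.fit(X_train, y_train)

# The test set is used exactly once, after model selection is finished.
print("best C:", grid.best_params_)
print("test accuracy:", grid.score(X_test, y_test))
```

The design point is that the search over C happens entirely inside cross-validation, so the optimization never sees the final test set.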