Tune hyperparameters to get SOTA on some ML benchmark
https://twitter.com/generativist/status/1412813949012086788
Architecture search? You're thinking small. You could be doing so much more... like new algos search.
Strictly speaking, algos are no longer necessary in Software 2.0.
New conversation
ML benchmarks would definitely be the best contribution to humanity. Can’t think of anything else.
... and comparing the architectures via nested or k-fold cross-validation instead of a single test set :)
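The cross-validation suggestion above can be sketched in code. This is a minimal illustration, assuming scikit-learn; the dataset, the two candidate architectures, and the fold count are placeholders chosen for the example, not anything from the thread:

```python
# Sketch: compare two candidate architectures with k-fold cross-validation
# instead of a single held-out test set. Dataset and hidden-layer shapes
# are arbitrary placeholders.
from sklearn.datasets import load_digits
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
cv = KFold(n_splits=5, shuffle=True, random_state=0)

results = {}
for hidden in [(32,), (64, 32)]:  # two hypothetical architectures
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=300,
                        random_state=0)
    # Mean accuracy across 5 folds, rather than one lucky (or unlucky) split.
    results[hidden] = cross_val_score(clf, X, y, cv=cv).mean()
    print(hidden, round(results[hidden], 3))
```

Nested cross-validation would additionally wrap any hyperparameter tuning inside each outer fold, so the comparison score is never computed on data the tuning saw.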
New conversation
Architecture search? You're thinking small. You could be doing so much more... like field search.
Architecture is like another hyperparameter. An ultraparameter?
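The "ultraparameter" quip above can be taken literally: nothing stops the architecture from sitting in an ordinary hyperparameter grid. A minimal sketch, assuming scikit-learn, with all concrete values chosen only for illustration:

```python
# Sketch: treat the architecture (hidden-layer shape) as one more entry
# in a plain hyperparameter grid, alongside a regularization strength.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

grid = GridSearchCV(
    MLPClassifier(max_iter=200, random_state=0),
    param_grid={
        "hidden_layer_sizes": [(32,), (64, 32)],  # the "ultraparameter"
        "alpha": [1e-4, 1e-3],                    # an ordinary hyperparameter
    },
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_)
```

From the search's point of view the two grid entries are indistinguishable, which is the joke's point: "architecture search" is hyperparameter tuning over a discrete (and in general unbounded) space.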
I guess this depends on the size of the infinity. Can you search architecture space without aleph-one infinite power?
Could somebody please help explain why this would be a bad idea? I always thought this would be great, but I’m ignorant and still learning about the field. Wouldn’t architecture search result in the best models?