Genetic programming learned operations reminiscent of dropout, normalized gradients, and weight averaging when trying to evolve better learning algorithms. Cool work! https://arxiv.org/abs/2003.03384 https://twitter.com/quocleix/status/1237528603564204033
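For a feel of the mechanics, here is a minimal Python sketch of that kind of search: regularized evolution (the scheme the paper builds on) over tiny gradient-update programs, scored on a toy regression task. The op library, program encoding, and fitness function are illustrative stand-ins of mine, not the paper's actual search space, which evolves whole setup/predict/learn programs from scratch.

import numpy as np

rng = np.random.default_rng(0)

# Fixed toy regression task used to score candidate update rules.
X = rng.normal(size=(64, 5))
y = X @ rng.normal(size=5)

# A candidate "learning algorithm" here is just a short list of ops applied
# to the raw gradient before an SGD step. The op names and library are
# illustrative stand-ins, far narrower than the paper's search space.
OPS = {
    "identity":  lambda g: g,
    "normalize": lambda g: g / (np.linalg.norm(g) + 1e-8),   # normalized gradient
    "mask":      lambda g: g * (rng.random(g.shape) > 0.5),  # dropout-like noise
    "clip":      lambda g: np.clip(g, -1.0, 1.0),
    "halve":     lambda g: 0.5 * g,
}

def fitness(program, steps=50, lr=0.3):
    """Train the toy model with the candidate update rule; higher is better."""
    w = np.zeros(5)
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # MSE gradient
        for name in program:                    # apply the evolved ops in order
            grad = OPS[name](grad)
        w -= lr * grad
    return -np.mean((X @ w - y) ** 2)

def mutate(program):
    """Swap one randomly chosen op for a random op from the library."""
    child = list(program)
    child[rng.integers(len(child))] = str(rng.choice(list(OPS)))
    return child

# Regularized evolution: tournament-select a parent from a random sample,
# add a mutated child, and retire the oldest individual.
population = [[str(rng.choice(list(OPS))) for _ in range(3)] for _ in range(20)]
for _ in range(200):
    idx = rng.choice(len(population), size=5, replace=False)
    parent = max((population[i] for i in idx), key=fitness)
    population.append(mutate(parent))
    population.pop(0)

print("best program found:", max(population, key=fitness))

With this library the search is free to rediscover combinations like mask followed by normalize, which is exactly the dropout-and-normalized-gradient flavor the tweet is pointing at.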
-
Grad students still have to compete with gradient descent though! https://twitter.com/shivon/status/1075537179978297344
-
I LOL'd.
-
Maybe in the very narrow case of improving a neural network's performance on a specific dataset. But what about creating a new problem, changing an existing approach, or improving explainability?
-
Grad Student Descent
-
you made me feel better about my knowledge