One interesting thing about the ARC competition is that it highlights how people who use deep learning often have little idea of what deep learning actually does, or of when they should and shouldn't be using it.
Differentiability & minibatch SGD are the strengths of DL: besides making learning practically tractable, the smoothness & continuity of the function & the incrementality of its fitting work great for learning to approximate a latent manifold. But its strengths are also its limits.
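A minimal sketch of the point being made (my illustration, not from the thread): because the loss is smooth in the parameters, each small minibatch step yields a usable gradient, and the fit improves incrementally. All names and values here are illustrative.

```python
# Minibatch SGD on a tiny linear model fitting a smooth 1-D target.
# Smoothness of the loss in (w, b) is what makes each incremental
# update informative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3.0 * X[:, 0] + 0.5 + 0.05 * rng.normal(size=1000)  # smooth target

w, b, lr = 0.0, 0.0, 0.1
for step in range(500):
    idx = rng.integers(0, len(X), size=32)   # sample a minibatch
    xb, yb = X[idx, 0], y[idx]
    err = (w * xb + b) - yb
    # Gradients of the squared error exist everywhere, so every
    # minibatch nudges (w, b) toward the target
    w -= lr * np.mean(err * xb)
    b -= lr * np.mean(err)

print(w, b)  # converges near the true (3.0, 0.5)
```

The same incremental scheme is what scales up to deep networks: only the model and the gradient computation change.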
The whole setup breaks down when you are no longer doing pattern recognition -- when you no longer have a latent manifold (any kind of discrete problem) or no longer have a dense sampling of it. Or when your manifold changes over time.
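To make the breakdown concrete, here is a hedged sketch (my example, not from the thread) of why discrete problems give SGD nothing to work with: with a discrete-valued model, the loss over a continuous parameter is piecewise constant, so the gradient is zero almost everywhere.

```python
# When both target and model are hard thresholds, the loss surface is
# a staircase: flat almost everywhere, so gradient descent gets no
# descent direction even when the parameter is clearly wrong.
import numpy as np

X = np.linspace(-1, 1, 200)
y = (X > 0.3).astype(float)          # discrete target

def loss(theta):
    pred = (X > theta).astype(float)  # discrete model output
    return np.mean((pred - y) ** 2)

theta = 0.0                           # wrong parameter, nonzero loss
eps = 1e-6
grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
print(loss(theta), grad)              # loss > 0, gradient exactly 0
```

No smoothing trick changes the underlying point: there is no latent manifold to slide along, which is why such problems call for search or program synthesis rather than gradient-based fitting.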
I like the intuition here; would you happen to know works that test these ideas experimentally?
IMO, this is what most people don't get, and why there are so many DL projects that struggle. This should be more widely advertised.