Generalization in deep learning is interpolation along a latent manifold (or rather, a learned approximation of one). It has little to do with your model itself and everything to do with the natural organization of your data.
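A minimal sketch of that claim, not from the thread: interpolating between two latent codes of a trained autoencoder tends to stay near the data manifold, while the same straight line taken in raw input space does not. "encoder" and "decoder" here are hypothetical trained halves of an autoencoder (Keras-style models with a predict method); everything below is an illustration under that assumption.

import numpy as np

# Hypothetical trained autoencoder halves (assumed, not from the thread):
# encoder maps inputs to latent codes, decoder maps codes back to inputs.

def latent_interpolate(x_a, x_b, encoder, decoder, steps=8):
    # Embed both endpoints on the learned latent manifold.
    z_a = encoder.predict(x_a[None])[0]
    z_b = encoder.predict(x_b[None])[0]
    # A straight line in latent space approximates a path along the manifold.
    ts = np.linspace(0.0, 1.0, steps)
    zs = np.stack([(1 - t) * z_a + t * z_b for t in ts])
    # Decoded intermediate points tend to look like valid data.
    return decoder.predict(zs)

def pixel_interpolate(x_a, x_b, steps=8):
    # The same straight line in raw input space: intermediate points
    # typically fall off the data manifold (e.g. ghosted, overlaid images).
    ts = np.linspace(0.0, 1.0, steps)
    return np.stack([(1 - t) * x_a + t * x_b for t in ts])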
-
i.e. to generalize. Of course, the term "structure" is completely informal, so it's not accurate here, but that's not a convo for Twitter.
-
That's what I stated. (Your original statement was "it breaks down .. when you no longer have a latent manifold (any kind of discrete problem)"; that's what I disputed.) OTOH, sampling in AlphaGo is not really dense: the number of examples is quite small for a manifold of that dimensionality.