@michael_nielsen: I would bet against current backprop-based techniques in the long run.
https://twitter.com/michael_nielsen/status/669978274701942785
Replying to @michael_nielsen
@fchollet: Information geometry / topology will give us the theoretical foundation for building "optimal" learning algorithms.
Replying to @fchollet
@wxswxs: @fchollet @michael_nielsen Any recommendations there? I'm reading Amari and Nagaoka (slowly).
Replying to @wxswxs
@fchollet: @wxswxs @michael_nielsen That's a great start. Otherwise, there have been a number of interesting ML papers on Riemannian manifolds.
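A minimal sketch of what the information-geometric viewpoint buys in practice: Amari's natural gradient, which preconditions the ordinary gradient by the inverse Fisher information matrix so that each update is steepest descent under the KL-divergence metric rather than in raw parameter coordinates. The model, step size, and iteration count below are illustrative assumptions, not from the thread; the example fits the mean and log standard deviation of a univariate Gaussian.

```python
# Hedged sketch (not from the thread): natural gradient descent on a
# univariate Gaussian N(mu, sigma^2) parameterized by (mu, log sigma).
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=3.0, size=1000)

mu, s = 0.0, 0.0   # parameters: mean and log standard deviation
lr = 0.1           # step size (illustrative)

for _ in range(200):
    sigma = np.exp(s)
    # Gradients of the average negative log-likelihood.
    g_mu = -(data - mu).mean() / sigma**2
    g_s = 1.0 - ((data - mu) ** 2).mean() / sigma**2
    # For the (mu, log sigma) parameterization the Fisher information matrix
    # is diag(1 / sigma^2, 2), so applying its inverse just rescales terms.
    mu -= lr * (sigma**2 * g_mu)
    s -= lr * (g_s / 2.0)

print(f"fitted mu = {mu:.3f}, sigma = {np.exp(s):.3f}")  # ~2.0 and ~3.0
```

In this toy case the natural-gradient update for mu reduces to a step toward the sample mean that is independent of the current sigma, which illustrates the coordinate-invariance that the information-geometry framing is about.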