Yann LeCun makes a very clear and honest statement of why Deep Learning is not enough. It's not enough because there are functions that don't emerge intrinsically from the network. https://twitter.com/ylecun/status/1215286749477384192
Replying to @IntuitMachine
This is not what he said here. Every stepwise constructive or evolutionary process is in a sense an optimization via a gradient-based method, so DL is a sufficiently broad methodology, but you'll still need to set up the system performing the optimization or meta-optimization.
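To make that reading concrete, here is a minimal sketch (the quadratic fitness landscape, population size, and step sizes are all assumed for illustration): an evolution strategy that averages random mutations weighted by their fitness is, in expectation, estimating a gradient, which is one sense in which an evolutionary process is "optimization via a gradient-based method".

```python
import numpy as np

# Toy fitness landscape (assumed for illustration): peak at [3, -2].
target = np.array([3.0, -2.0])

def fitness(theta):
    return -np.sum((theta - target) ** 2)

rng = np.random.default_rng(0)
theta = np.zeros(2)            # current "genome"
sigma, lr, pop = 0.1, 0.02, 50

for _ in range(300):
    # A population of random mutations around the current point.
    eps = rng.standard_normal((pop, theta.size))
    scores = np.array([fitness(theta + sigma * e) for e in eps])
    scores -= scores.mean()    # baseline subtraction reduces estimator variance
    # Fitness-weighted average of mutations: a stochastic estimate of the gradient.
    grad_est = (scores[:, None] * eps).mean(axis=0) / sigma
    theta += lr * grad_est     # selection pressure acts like a gradient step

print(theta)                   # drifts toward [3, -2] with no analytic gradient
```

The caveat in the reply is visible even in this toy: someone still has to set up the fitness function, the mutation scale, and the update rule; only the search itself is automatic.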
Replying to @Plinz @IntuitMachine
So, DL is an architecture AND a methodology AND the mechanism behind all of evolution? Next, you’ll tell me that it also serves to unify the theory of gravity with the standard model of physics!
@GaryMarcus
No, my understanding is that Yann LeCun thinks of DL as automating the optimization of a program for computing a complex function. Depending on the optimization method, the program or its generator function will often be differentiable.
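A minimal sketch of that framing, with the synthetic data, the two-parameter "program", and all hyperparameters assumed for illustration: once a differentiable form and a loss are fixed, gradient descent automates writing the program's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data from an unknown process y = 2.5x - 1 plus noise (assumed for illustration).
x = rng.uniform(-1, 1, size=200)
y = 2.5 * x - 1.0 + 0.05 * rng.standard_normal(200)

# A differentiable two-parameter program: predict(x) = w * x + b.
w, b, lr = 0.0, 0.0, 0.1

for _ in range(500):
    err = (w * x + b) - y              # residuals of the current program
    w -= lr * 2 * np.mean(err * x)     # analytic gradient of MSE w.r.t. w
    b -= lr * 2 * np.mean(err)         # ... and w.r.t. b

print(w, b)  # the optimizer "writes" the program: w -> ~2.5, b -> ~-1.0
```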
This Tweet is unavailable.
Yes, DL is basically compositional function approximation. Before DL, ML was mostly limited to shallow models. For instance, end-to-end training for speech recognition or game playing was considered to be beyond the reach of ML.
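A sketch of what "compositional function approximation" buys you, with the architecture, learning rate, and iteration count assumed for illustration: XOR cannot be represented by a single shallow linear map, but a composition of two simple differentiable functions, trained end-to-end by backpropagating through the whole composition, fits it.

```python
import numpy as np

rng = np.random.default_rng(2)

# XOR is not linearly separable, so no shallow linear model can fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A composition of two simple functions: sigmoid(tanh(X W1 + b1) W2 + b2).
W1 = rng.standard_normal((2, 4))
b1 = np.zeros(4)
W2 = rng.standard_normal((4, 1))
b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)           # first function in the composition
    p = sigmoid(h @ W2 + b2)           # second; the whole stack is trained jointly
    dz2 = (p - y) / len(X)             # cross-entropy gradient at the output
    dh = (dz2 @ W2.T) * (1 - h ** 2)   # backpropagate through the composition
    W2 -= lr * (h.T @ dz2)
    b2 -= lr * dz2.sum(axis=0)
    W1 -= lr * (X.T @ dh)
    b1 -= lr * dh.sum(axis=0)

print(p.round(2).ravel())  # approaches [0, 1, 1, 0]
```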
This Tweet is unavailable.
There are many ways of doing that beyond regression, such as Monte Carlo, evolutionary algorithms, enumeration, etc.
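A sketch of two of those gradient-free routes on an assumed toy objective: plain Monte Carlo sampling and brute-force enumeration both locate the minimum without any derivatives; the trade-off is that both scale badly as the dimension grows.

```python
import numpy as np

rng = np.random.default_rng(3)

def loss(theta):  # toy objective (assumed): minimum at [3, -2]
    return np.sum((theta - np.array([3.0, -2.0])) ** 2)

# Monte Carlo: sample candidates uniformly at random, keep the best.
samples = rng.uniform(-5, 5, size=(10_000, 2))
mc_best = min(samples, key=loss)

# Enumeration: exhaustively scan a coarse grid.
grid = [np.array([a, b])
        for a in np.arange(-5, 5, 0.1)
        for b in np.arange(-5, 5, 0.1)]
enum_best = min(grid, key=loss)

print(mc_best, enum_best)  # both land near [3, -2] without a single gradient
```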