That is not what he said here. Every stepwise constructive or evolutionary process is, in a sense, an optimization via a gradient-based method, so DL is a sufficiently broad methodology; but you'll still need to set up the system that performs the optimization or meta-optimization.
-
Replying to @Plinz @IntuitMachine
So, DL is an architecture AND a methodology AND the mechanism behind all of evolution? Next, you'll tell me that it also serves to unify the theory of gravity with the Standard Model of physics!
@GaryMarcus
-
No, my understanding is that Yann LeCun thinks of DL as automating the optimization of a program for computing a complex function. Depending on the optimization method, the program or its generator function will often be differentiable.
-
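The view above — DL as automating the optimization of a differentiable program — can be sketched minimally. This is an illustrative toy (plain gradient descent fitting a line), not anyone's actual implementation; the function and learning rate are assumptions for the example:

```python
import numpy as np

# Toy illustration: "optimizing a program" here is just fitting the
# parameters (w, b) of a differentiable function y = w*x + b by
# gradient descent on mean squared error.
def fit_line(xs, ys, lr=0.01, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        pred = w * xs + b
        err = pred - ys
        # Gradients of mean squared error w.r.t. w and b
        dw = 2.0 * np.dot(err, xs) / n
        db = 2.0 * err.sum() / n
        w -= lr * dw
        b -= lr * db
    return w, b

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0  # target: w = 2, b = 1
w, b = fit_line(xs, ys)
```

Because the function is differentiable in its parameters, the optimizer only needs gradients — the same principle deep learning scales up to millions of parameters.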
This Tweet is unavailable.
-
Yes, DL is basically compositional function approximation. Before DL, ML was mostly limited to shallow models. For instance, end-to-end training for speech recognition or game playing was considered outside the reach of ML.
-
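"Compositional function approximation" can be made concrete with a sketch: a deep model is just a composition of simple parameterized functions. The weights below are random and illustrative, not trained:

```python
import numpy as np

# Sketch: a "deep" model as a composition of simple functions —
# two affine layers with a nonlinearity in between.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

def layer1(x):
    return np.tanh(W1 @ x + b1)   # first simple function

def layer2(h):
    return W2 @ h + b2            # second simple function

def deep_model(x):
    return layer2(layer1(x))      # the composition is the "depth"

y = deep_model(np.array([1.0, -0.5, 0.3]))
```

A shallow model would stop at one such function; depth comes from stacking them, and end-to-end training adjusts all the layers' parameters jointly.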
I accept that, but now I have a provocative question: is deep learning possible outside the context of artificial neural networks?
-
Yes, of course. CNNs are just the most popular area right now. I am wondering about a paradigm where each individual unit is an evolving reinforcement learner that utilizes an arbitrary shader program.
-
This Tweet is unavailable.
-
A small piece of code that runs in parallel with many others on a GPU.
-
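The shader definition above — one small function run in parallel over many data elements — can be loosely illustrated in plain Python. This is an analogy only: the toy "shader" function and its gamma value are invented for the example, and the loop stands in for what a GPU would do concurrently:

```python
import numpy as np

# Loose analogy for a shader: one small function applied
# independently to every element; a GPU would run these in parallel.
def shade(pixel):
    # toy per-element "shader": clamp to [0, 1], then gamma-correct
    return np.clip(pixel, 0.0, 1.0) ** (1.0 / 2.2)

pixels = np.linspace(-0.2, 1.2, 8)
out = np.array([shade(p) for p in pixels])  # conceptually parallel
```

The key property is that each call is independent of the others, which is what makes the per-unit massive parallelism possible.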
This Tweet is unavailable.
no