Yann LeCun makes a very clear and honest statement of why Deep Learning is not enough. It's not enough because there are functions that don't emerge intrinsically from the network. https://twitter.com/ylecun/status/1215286749477384192
-
Does that mean no priors? Isn't convolution a handcrafted prior? Wasn't the structure of AlexNet handcrafted?
-
Yes, but the difference is that DL is moving the handcrafting up one layer: classical AI mostly relied on writing algorithms; current AI mostly relies on writing algorithms that discover algorithms. Perhaps the next wave will be fueled by meta-learning, i.e. 3rd-order programming.
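To make the "handcrafted prior vs. discovered algorithm" point concrete, here is a minimal sketch (assuming PyTorch; the model, shapes, and hyperparameters are illustrative only and not taken from the thread). The convolutional structure, i.e. the prior of locality and translation equivariance as in AlexNet, is written by hand, while gradient descent discovers the filter weights.

```python
# Minimal sketch (assumed PyTorch; toy shapes). The *architecture* -- a
# handcrafted convolutional prior -- is specified by the designer; the
# *filter weights* are discovered from data by the optimizer.
import torch
import torch.nn as nn

class TinyConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Handcrafted prior: local, weight-shared convolutional filters.
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(8 * 14 * 14, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))

# "An algorithm that discovers an algorithm": gradient descent fills in the
# convolution weights that the handcrafted architecture leaves unspecified.
model = TinyConvNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(4, 1, 28, 28)        # dummy batch of 28x28 images
y = torch.randint(0, 10, (4,))       # dummy labels
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```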
New conversation
Would you classify biological inspirationalists as part of Deep Learning, even if they long ago strayed from the perceptron in favor of a more temporal model?

-
Arguably, most of the ideas that enabled present-day DL were not novel. DL practitioners sometimes reinvented and sometimes repurposed solutions from earlier AI research, econometrics, cybernetics, signal processing, physics, psychology, and neuroscience.
End of conversation