Can deep neural networks learn everything and reproduce any cognitive ability? No matter your opinion on the question, @GaryMarcus's counter-argument is thought-provoking. The 2001 book is a great (but longer) read. https://medium.com/@GaryMarcus/the-deepest-problem-with-deep-learning-91c5991f5695
Replying to @apeyrache
1/ I'll preface this short reply thread with the following: I'm responding to the linked Medium post, not @GaryMarcus's 2001 book, which I haven't had a chance to read yet (though it's on my continuously expanding list).
2/ What is deep learning? Is Yoshua calling for hybrid systems that utilize non-deep, traditional symbol-processing mechanisms? No, he's not. Many people, including @GaryMarcus, have a tendency to associate deep learning with its most prevalent form, but it is a far more general approach.
3/ If you read the early papers (e.g. from @ylecun and Yoshua http://www.iro.umontreal.ca/~lisa/bib/pub_subject/language/pointeurs/bengio+lecun-chapter2007.pdf), you see that DL is a research *program* with two key elements: (1) learning is better, so avoid built-in assumptions if you can; (2) use hierarchical, distributed representations trained end-to-end.
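(A toy sketch, mine not theirs, just to make both elements concrete in PyTorch: each layer's distributed representation is learned rather than hand-designed, and a single loss trains the whole hierarchy end-to-end. All sizes are arbitrary.)

```python
import torch
import torch.nn as nn

# A hierarchy of learned, distributed representations: no hand-built features.
model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),  # low-level features
    nn.Linear(256, 64), nn.ReLU(),   # mid-level features
    nn.Linear(64, 10),               # task readout
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))  # dummy batch
loss = loss_fn(model(x), y)
optimizer.zero_grad()
loss.backward()   # one end-to-end step: the output error reshapes
optimizer.step()  # every level of the hierarchy at once
```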
4/ Supervised training of NNs with backprop fits into that program, but does not define it. What Yoshua has been saying, as in the interview @GaryMarcus quoted, is that we want things like causal reasoning, but we want to *learn* it using distributed, hierarchical systems.
5/ My reaction to @GaryMarcus's call for hybrid systems in that essay is that he doesn't sufficiently recognize that good hybrid systems will (1) represent P and Q in a distributed fashion, (2) *learn* P and Q and the form of their relationship. That's surely how brains do it.
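(To sketch what I mean, a hypothetical toy in PyTorch; every name and size here is mine, not anything Gary proposed: P and Q get learned, distributed embeddings, and the form of their relationship is itself a learned function, with everything trained jointly by gradient descent.)

```python
import torch
import torch.nn as nn

VOCAB = {"P": 0, "Q": 1}  # symbolic tokens

class LearnedRelation(nn.Module):
    """P and Q as learned distributed vectors, plus a learned relation over them."""
    def __init__(self, n_symbols: int = 2, dim: int = 16):
        super().__init__()
        self.embed = nn.Embedding(n_symbols, dim)  # distributed codes, learned
        self.relation = nn.Sequential(             # the *form* of the relation, learned
            nn.Linear(2 * dim, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, a, b):
        pair = torch.cat([self.embed(a), self.embed(b)], dim=-1)
        return self.relation(pair).squeeze(-1)     # logit: does the relation hold?

model = LearnedRelation()
a, b = torch.tensor([VOCAB["P"]]), torch.tensor([VOCAB["Q"]])
logit = model(a, b)  # train with nn.BCEWithLogitsLoss on observed (P, Q) facts
```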
6/ That's not to say that you don't build in some stuff - structure matters, and @GaryMarcus is right that hybrid approaches are likely needed, but what form of hybrid?
7/ IMO, the answer is not to tack on traditional symbolic approaches to deep conv nets, or whatever, but to build structure into ANNs, either through things like relational networks (http://papers.nips.cc/paper/7082-a-simple-neural-network-module-for-relational-reasoning.pdf) or graph approaches (https://arxiv.org/pdf/1806.01261.pdf).
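(For concreteness, here's a minimal sketch of the relational-network module from that first link, Santoro et al. 2017, as I read it: a shared MLP g scores every ordered pair of object embeddings, the sum over pairs is permutation-invariant, and a second MLP f reads out the answer. Layer sizes and names are my own illustrative choices.)

```python
import torch
import torch.nn as nn

class RelationModule(nn.Module):
    """Sketch of a relation network: f(sum over pairs of g(o_i, o_j))."""
    def __init__(self, obj_dim: int, hidden: int = 128, out_dim: int = 10):
        super().__init__()
        self.g = nn.Sequential(  # relation function, shared across all pairs
            nn.Linear(2 * obj_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.f = nn.Sequential(  # readout over the aggregated relations
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, objects):
        # objects: (batch, n, dim) -- e.g. cells of a conv feature map
        b, n, d = objects.shape
        oi = objects.unsqueeze(2).expand(b, n, n, d)
        oj = objects.unsqueeze(1).expand(b, n, n, d)
        pairs = torch.cat([oi, oj], dim=-1).reshape(b, n * n, 2 * d)
        pooled = self.g(pairs).sum(dim=1)  # permutation-invariant sum over pairs
        return self.f(pooled)

module = RelationModule(obj_dim=64)
logits = module(torch.randn(32, 10, 64))  # ten 64-d "objects" -> (32, 10) logits
```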
Fin/ A lot of what @GaryMarcus says is reasonable, but (1) failures of supervised-trained conv nets are not an argument against the general DL program, and (2) we should solve the problems he's highlighting, but using learning and distributed reps when possible. That's Yoshua's takeaway.
Replying to @tyrell_turing @apeyrache
On (1), what do you see as within and outside the scope of the general DL program? On (2), agreed we should use learning a lot, just not for everything (see my arXiv paper on AlphaGo and innateness). On distributed representations: I am agnostic, not opposed.
Replying to @GaryMarcus
IMO, the DL program includes any approach that follows the principles Yann and Yoshua lay out in that 2007 paper: (1) learn as much as possible, (2) use hierarchical, distributed representations.
We agree on a lot, but why remain agnostic on distributed reps? They're useful for generalization and constraint satisfaction. Moreover, they are what the brain uses. Add in the fact that high-dimensional systems have potent learning algorithms, and I see no reason not to embrace them wholeheartedly.