2/ What is deep learning? Is Yoshua calling for hybrid systems that utilize non-deep, traditional symbol-processing mechanisms? No, he's not. Many people, including @GaryMarcus, have a tendency to associate deep learning with its most prevalent form, but it is a far more general approach.
3/ If you read the early papers (e.g. from @ylecun and Yoshua, http://www.iro.umontreal.ca/~lisa/bib/pub_subject/language/pointeurs/bengio+lecun-chapter2007.pdf), you see DL is a research *program* with two key elements: 1) learning is better than hand-coding, so avoid built-in assumptions if you can; 2) use hierarchical, distributed representations trained end-to-end.
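For concreteness, here is a minimal sketch of element (2): a small hierarchy of distributed (vector) representations trained end-to-end by backprop, with nothing task-specific built in. The XOR task, layer sizes, and learning rate below are illustrative assumptions, not anything taken from the linked chapter.

```python
import numpy as np

# Minimal sketch: learned distributed representations, trained end-to-end.
rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)   # output layer

for step in range(5000):
    h = np.tanh(X @ W1 + b1)                    # learned distributed representation
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # predicted probability
    # Backprop end-to-end (cross-entropy loss, so the output error is p - y).
    dout = (p - y) / len(X)
    dW2 = h.T @ dout; db2 = dout.sum(0)
    dh = (dout @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 1.0 * grad                     # plain gradient descent

print(np.round(p, 2))  # should approach [0, 1, 1, 0] without a built-in XOR rule
```

The hidden layer's activity pattern is the distributed representation; it is discovered by gradient descent rather than designed.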
4/ Supervised training of NNs with backprop fits into that program, but does not define it. What Yoshua has been saying, as in the interview @GaryMarcus quoted, is that we want things like causal reasoning, but we want to *learn* it using distributed, hierarchical systems.
5/ My reaction to @GaryMarcus's call for hybrid systems in that essay is that he doesn't sufficiently recognize that good hybrid systems will (1) represent P and Q in a distributed fashion, and (2) *learn* P and Q and the form of their relationship. That's surely how brains do it.
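To make (1) and (2) concrete, a toy sketch under stated assumptions (the same-parity relation, the bilinear form, and all sizes are made up for illustration, not a claim about any particular model): the symbols are bare indices, their content lives in learned distributed embeddings, and the form of their relationship is itself learned from examples.

```python
import numpy as np

# Toy sketch: distributed symbol codes plus a learned relation between them.
rng = np.random.default_rng(0)

n_symbols, dim = 8, 4
emb = rng.normal(scale=0.3, size=(n_symbols, dim))   # distributed symbol codes (learned)
M = rng.normal(scale=0.3, size=(dim, dim))           # learned relation
b = 0.0

# Toy ground truth: relation(P, Q) holds iff P and Q have the same parity.
target = np.array([[1.0 if (i - j) % 2 == 0 else 0.0
                    for j in range(n_symbols)] for i in range(n_symbols)])

lr = 1.0
for step in range(20000):
    scores = emb @ M @ emb.T + b                     # score(P, Q) = e_P^T M e_Q + b
    p = 1.0 / (1.0 + np.exp(-scores))
    d = (p - target) / target.size                   # logistic-loss gradient
    gM = emb.T @ d @ emb
    gemb = d @ emb @ M.T + d.T @ emb @ M             # gradients reach the symbol codes too
    gb = d.sum()
    M -= lr * gM; emb -= lr * gemb; b -= lr * gb

print(np.round(p, 2))  # should move toward the same-parity pattern in `target`
```

The point of the sketch is only that nothing symbolic is hard-coded: both the representations of P and Q and the relation between them come out of training.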
6/ That's not to say that you don't build in some stuff - structure matters, and @GaryMarcus is right that hybrid approaches are likely needed. But what form of hybrid?
7/ IMO, the answer is not to tack traditional symbolic approaches onto deep conv nets, or whatever, but to build structure into ANNs, either through things like relational networks (http://papers.nips.cc/paper/7082-a-simple-neural-network-module-for-relational-reasoning.pdf) or graph approaches (https://arxiv.org/pdf/1806.01261.pdf).
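As an example of building such structure in, here is a minimal numpy sketch of the relational-network form from the first link, RN(O) = f_phi( sum over object pairs of g_theta(o_i, o_j) ). Only that form is taken from the paper; the object count, dimensions, MLP sizes, and random inputs below are illustrative assumptions, and there is no training loop or task.

```python
import numpy as np

# Minimal forward pass of a relational-network-style module.
rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a small MLP given layer sizes."""
    return [(rng.normal(scale=0.1, size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    """Apply the MLP: ReLU between layers, linear output."""
    for W, b in params[:-1]:
        x = np.maximum(0.0, x @ W + b)
    W, b = params[-1]
    return x @ W + b

n_objects, obj_dim, rel_dim, out_dim = 6, 8, 16, 4
g_params = init_mlp([2 * obj_dim, 32, rel_dim])   # g_theta: processes one pair of objects
f_params = init_mlp([rel_dim, 32, out_dim])       # f_phi: maps summed relations to an output

def relational_network(objects):
    """Sum g_theta over all ordered object pairs, then apply f_phi."""
    pair_sum = np.zeros(rel_dim)
    for i in range(len(objects)):
        for j in range(len(objects)):
            pair_sum += mlp(g_params, np.concatenate([objects[i], objects[j]]))
    return mlp(f_params, pair_sum)

objects = rng.normal(size=(n_objects, obj_dim))   # a set of distributed object vectors
print(relational_network(objects))                # one output over all pairwise relations
```

The structural commitment here is the pairwise sum: the module is forced to consider relations between all object pairs, while what those relations are is still left to learning.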
This is key - I suspect that these and similar approaches (which are sociologically associated with "deep learning" folks) address many of Gary's concerns about compositionality and symbols, but I typically don't see them recognized or mentioned in Gary's commentaries.
Right, I think there's some sociology going on here that's not helpful. Most people agree about where we want to be and about the need for hybrid approaches.
Exactly. This conversation would be more productive if specific, recent technical proposals were discussed (or new models were proposed). I was encouraged to see @egrefen's work mentioned explicitly, but framing it as some renegade exception that proves the rule seems unnecessary.
It is somewhat unusual, but I have pointed to several others in my writing this year, at multiple points.
@LittleBimble and @egrefen were the most relevant, though, to the specific point I was making.