Major news re #deeplearning & its limits: Yoshua Bengio’s lab confirms a key conclusion of Marcus 2001 and 2018: deep learning isn’t efficient enough with data to cope with the compositional nature of language. https://export.arxiv.org/abs/1810.08272
Replying to @GaryMarcus @dmonett
"We put forward strong evidence that current deep learning methods are not yet sufficiently sample efficient when it comes to learning a language with compositional properties." Key word is yet, Gary.
Replying to @roydanroy @dmonett
I have literally been hearing this reply for 20 years, since Marcus 1998. At what point do we acknowledge that we might need other techniques, too, rather than deep learning alone?
Replying to @GaryMarcus @dmonett
Is there a falsifiable claim we can test in, say, five years?
I agree. Does @GaryMarcus' claim go beyond "neural networks are not enough"? What then is he proposing? Is it 'symbolic machinery'? Is the claim that 'symbolic machinery' cannot be simulated by neural networks?
No, my whole 2001 book is about how symbols might be emulated in neural machinery. But we need more structure.
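For context on "symbols emulated in neural machinery": one classic mechanism discussed in that literature is Smolensky-style tensor-product binding, where role and filler vectors are bound by an outer product and approximately unbound later. A hedged sketch with NumPy; the role/filler names are illustrative, and this is not necessarily the specific machinery the book endorses:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 64

# Random high-dimensional vectors for roles (grammatical slots)
# and fillers (symbols).
roles = {r: rng.standard_normal(dim) for r in ("agent", "patient")}
fillers = {f: rng.standard_normal(dim) for f in ("mary", "john")}

# Bind each filler to its role via an outer product, then superpose:
# a single distributed encoding of both role assignments.
memory = (np.outer(roles["agent"], fillers["mary"])
          + np.outer(roles["patient"], fillers["john"]))

def unbind(role_vec):
    """Approximately recover the filler bound to a role."""
    probe = role_vec @ memory  # contract over the role dimension
    scores = {name: probe @ vec / (np.linalg.norm(probe) * np.linalg.norm(vec))
              for name, vec in fillers.items()}
    return max(scores, key=scores.get)

# Recovery is approximate but reliable, because random high-dimensional
# vectors are nearly orthogonal.
print(unbind(roles["agent"]))    # -> 'mary'
print(unbind(roles["patient"]))  # -> 'john'
```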