Major news re #deeplearning & its limits: Yoshua Bengio’s lab confirms a key conclusion of Marcus 2001 and 2018: deep learning isn’t efficient enough with data to cope with the compositional nature of language. https://export.arxiv.org/abs/1810.08272
-
-
I will point out that a recent paper: https://arxiv.org/pdf/1810.10531.pdf … reveals that the semantic symbolic representations you seek are indeed already present in existing neural networks, thereby falsifying your theory. Comments?
-
I have seen people fit requirements to their design. That seems to be the case with AI as well. It's as if a generic graph/mathematical data structure has been developed and we try to fit common-sense logic onto it. What's needed is to design a data structure that represents common sense inherently. 1/2