Exactly what I have been arguing for two decades (e.g., The Algebraic Mind) and in my 2018 critical appraisal of deep learning. cc @stanfordnlp https://twitter.com/weihua916/status/1025137101040508931
-
Replying to @GaryMarcus @stanfordnlp
And of course the way you extrapolate is to change the representation so that extrapolation becomes interpolation. "No extrapolation without representation"
-
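The "extrapolation becomes interpolation" point above can be sketched in a few lines. This is a toy illustration of mine, not anything the participants posted: a 1-D interpolator trained on one period of sin(x) fails far outside its training range, but re-representing the input as x mod 2π (the period is hand-supplied, which is exactly the representational knowledge at issue) maps the query back inside the training data, so the same learner now interpolates.

```python
import numpy as np

# Toy task: predict sin(x) far outside the training range.
x_train = np.linspace(0.0, 2 * np.pi, 200, endpoint=False)
y_train = np.sin(x_train)
x_test = 100.0                      # far beyond the training interval

# Raw representation: interpolate in x itself. np.interp clamps queries
# outside the training range, so the prediction is stuck at the boundary.
pred_raw = np.interp(x_test, x_train, y_train)

# Changed representation: phi(x) = x mod 2*pi maps the test point back
# inside the training range -- extrapolation has become interpolation.
phi_test = x_test % (2 * np.pi)
pred_phi = np.interp(phi_test, x_train, y_train)

print(pred_raw, pred_phi, np.sin(x_test))
```

The point of the sketch: the learner itself is unchanged; only the input representation moved the query into the convex hull of the training data.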
And to quote Les Valiant: "Representation precedes learning." https://www.researchgate.net/publication/223211826_Robust_logics
-
Replying to @AvilaGarcez @tdietterich and others
Representation: that's what deep learning is all about.
-
Replying to @ylecun @AvilaGarcez and others
Learned convolutional filters are certainly a representational advance. Word embeddings certainly capture some important regularities. What else would you consider to be interesting learned representations?
-
Replying to @tdietterich @ylecun and others
The compositional semantics of human language
-
Replying to @GaryMarcus @ylecun and others
Has the learning of compositional semantics been demonstrated?
-
By children, every day.