Last week: Deep learning is on the verge of understanding natural language. Problem of tracking world change over time solved 3 years ago. This week: Nobody ever said statistical language models understand anything. https://twitter.com/tdietterich/status/1191756945226522624
-
Even on the limited-vocabulary bAbI tasks, what happens when two or more tasks are combined? I would say the first sign of understanding will come when a system can pass any of the bAbI tasks, individually or combined, with the same accuracy.
-
That said, the bAbI tasks test very limited aspects of only a few senses. For example, task 14 (time reasoning) only tests the ability to understand before/after or time spans like morning or evening. Understanding time involves far more complex aspects than that.
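To make that concrete, here is a minimal sketch (the story, the name Julie, and the TIME_ORDER mapping are made-up illustrations, not taken from the bAbI data): a bAbI task-14 style before/after question can be answered with nothing more than a hand-coded ordering over a handful of time expressions, which is why passing it says little about general temporal understanding.

# Illustrative sketch of a bAbI task-14 style item (all names and data made up).
# A fixed ordering over a few time expressions is enough to answer
# "Where did X go after the Y?" questions.

TIME_ORDER = {"yesterday": 0, "this morning": 1, "in the afternoon": 2, "this evening": 3}

# Story: (time expression, person, location) triples.
story = [
    ("yesterday", "Julie", "school"),
    ("in the afternoon", "Julie", "park"),
    ("this evening", "Julie", "cinema"),
]

def where_after(person, place):
    # Sort the person's events by the hand-coded time order ...
    events = sorted((e for e in story if e[1] == person), key=lambda e: TIME_ORDER[e[0]])
    # ... then return the location visited right after the given place.
    for i, (_, _, loc) in enumerate(events):
        if loc == place and i + 1 < len(events):
            return events[i + 1][2]
    return None

print(where_after("Julie", "park"))  # prints: cinema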
New conversation -
-
I wonder if one issue is that really simple details are seldom written down. If we had a corpus of all the things parents teach toddlers (maybe children's TV programming?), more of those fundamental concepts could be resolved statistically.
-
DL, alone or not, is the worst way to approach AGI. The brain discovers unknown functions in sensory data. Unlike DL, it doesn't optimize an already-known function. Besides, there isn't enough room in the brain to store representations for even a few days of experience, let alone a lifetime.