Great post, but shouldn't we talk about "obstacles" rather than "limits"? I mean, nothing seems to indicate that the answers will be non-DL.
interesting analysis by Chollet. I agree w/ most of it, but he doesn't discuss efforts to add memory and reasoning to deep nets
Memory and reasoning (and more) are addressed in tomorrow's follow-up post :)
New conversation
Deep: "Need to move from straightforward input-to-output mappings to reasoning and abstraction", cos ANNs can only do local generalization.
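The local-generalization point can be seen in a toy experiment (a hypothetical sketch, not from the thread: a tiny hand-rolled tanh MLP fit on y = x² over x in [0, 1]). Inside the training range it interpolates fine; far outside it the saturated tanh units plateau, so the prediction can't follow the target.

```python
import numpy as np

# Hypothetical illustration: train a 1-hidden-layer tanh net on y = x^2
# over x in [0, 1] with plain full-batch gradient descent.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 64).reshape(-1, 1)
y = X ** 2

W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                          # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)        # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def f(x):
    return (np.tanh(np.array([[x]]) @ W1 + b1) @ W2 + b2).item()

inside = abs(f(0.5) - 0.25)   # x inside the training range: small error
outside = abs(f(5.0) - 25.0)  # x far outside it: tanh saturates, large error
```

The net's output is bounded by the (small) output weights once every tanh unit saturates, so at x = 5 it cannot come anywhere near 25 — exactly the "breaks outside the training data" behavior under discussion.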
How about relation neural nets that learn to operate in a permutation invariant way? Aren't those better with transfer learning?
"Show them anything that deviates from their training data, and they will break in the most absurd ways." Yet many people have the gall to claim this or that DL-based technique has "super-human" performance.
New conversation
Also, by equipping a neural net with a domain-specific simulator, it would be possible to simplify what the neural net has to learn.
Many different things can be done -- there's a whole unexplored world out there. I have some thoughts as well; you'll see tomorrow.
New conversation
But what you call a "limitation" amounts to an ill-posed problem. Solutions can still be found in some weak sense.