I recently challenged Hinton and @YLecun to stand by the idea expressed in these words from their 2015 Nature paper, by making a public bet. Neither would. I stand by my assertion that it is precisely this that holds back deep learning, except for brave work by people like @egrefen. https://twitter.com/MLWave/status/1065310317935304704
Replying to @GaryMarcus @ylecun
I mean, a lot of DL research involves trying to produce models which generalise better by encoding priors about the domain, or inductive biases, into architectures and learning methods. From ConvNets to Capsules, there are plenty of examples of this...
Convolution is a great innate prior — defined in advance to work over all instances of a variable.
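The point both tweets gesture at can be made concrete. Below is a minimal sketch, assuming nothing beyond NumPy, of the prior that convolution builds in: one shared kernel is slid over every position, so a feature detected in one place is detected everywhere by construction, with no need to relearn it per location. The function name conv1d and the toy signals are illustrative, not anyone's actual code.

```python
import numpy as np

def conv1d(signal: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid 1-D cross-correlation: the same small kernel applied at every position."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel for i in range(len(signal) - k + 1)])

pattern = np.array([1.0, -1.0, 1.0])   # a "feature" to detect
kernel = pattern                        # detector tuned to that feature

x = np.zeros(20)
x[3:6] = pattern                        # the feature appears early in one input...
y = np.zeros(20)
y[14:17] = pattern                      # ...and in a shifted copy in another

# The shared weights respond identically wherever the feature occurs:
print(np.argmax(conv1d(x, kernel)))     # 3
print(np.argmax(conv1d(y, kernel)))     # 14

# A fully connected layer spanning the same receptive fields would need an
# independent weight vector per position (18 x 3 parameters here) and would
# have to relearn the feature at each location; the convolution hard-codes
# translation equivariance with just the 3 kernel parameters.
```

In that sense the prior is "defined in advance to work over all instances of a variable": the weight sharing is fixed before any data are seen, rather than learned.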