if nobody bites on question 3, it will make my next Medium post a lot easier to write :) https://twitter.com/GaryMarcus/status/1209640096900812800
-
Replying to @GaryMarcus
3. Anything that doesn't operate over a vector space is not deep learning.
-
Replying to @EliSennesh @GaryMarcus
3. RCN + CHMMs (Vicarious), Bayesian Program Learning (Lake), most probabilistic programming, pure spectral/PCA methods for extracting features, most evolutionary programming/GAs, certain “hybrid systems” like recent Tenenbaum work... whatever doesn’t rely on backprop as the core engine of learning.
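To make the "pure spectral/PCA" item concrete, here is a minimal sketch of feature extraction where the "learning" is just an eigendecomposition of the covariance matrix, with no backprop anywhere. Plain NumPy; the data shape and number of components are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))            # 500 samples, 20 raw dimensions (illustrative)

Xc = X - X.mean(axis=0)                   # center the data
cov = (Xc.T @ Xc) / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)    # spectral decomposition

# Top-k principal components = eigenvectors with the largest eigenvalues.
k = 5
components = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
features = Xc @ components                # low-dimensional features, learned without gradients
```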
-
we can use backprop to train CHMMs; EM just works better than backprop in this case. Also, with query training, we are starting to use backprop for training pieces of RCN. IMO it is too all-encompassing to say that using gradients = DL, when most of the ideas about structure came from elsewhere.
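For the EM-vs-backprop contrast above, here is a minimal sketch of EM (Baum-Welch) for a plain discrete HMM, not Vicarious's CHMM code. A CHMM adds structure (each observed symbol owns a block of hidden "clone" states), but the same E-step/M-step pattern applies. Every name and number below is illustrative; the point is that the M-step is a closed-form re-estimation from expected counts, with no gradients involved.

```python
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iters=50, seed=0):
    """One plain-HMM EM loop: E-step via forward-backward, closed-form M-step."""
    rng = np.random.default_rng(seed)
    # Random row-stochastic initialization of transitions T, emissions E, prior pi.
    T = rng.random((n_states, n_states)); T /= T.sum(1, keepdims=True)
    E = rng.random((n_states, n_symbols)); E /= E.sum(1, keepdims=True)
    pi = np.full(n_states, 1.0 / n_states)
    n = len(obs)
    for _ in range(n_iters):
        # E-step: scaled forward (alpha) and backward (beta) recursions.
        alpha = np.zeros((n, n_states)); scale = np.zeros(n)
        alpha[0] = pi * E[:, obs[0]]
        scale[0] = alpha[0].sum(); alpha[0] /= scale[0]
        for t in range(1, n):
            alpha[t] = (alpha[t - 1] @ T) * E[:, obs[t]]
            scale[t] = alpha[t].sum(); alpha[t] /= scale[t]
        beta = np.ones((n, n_states))
        for t in range(n - 2, -1, -1):
            beta[t] = (T @ (E[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
        gamma = alpha * beta                      # posterior over states at each step
        gamma /= gamma.sum(1, keepdims=True)
        xi = np.zeros((n_states, n_states))       # expected transition counts
        for t in range(n - 1):
            x = alpha[t][:, None] * T * (E[:, obs[t + 1]] * beta[t + 1])[None, :]
            xi += x / x.sum()
        # M-step: closed-form re-estimation from expected counts, no gradients.
        pi = gamma[0]
        T = xi / xi.sum(1, keepdims=True)
        for k in range(n_symbols):
            E[:, k] = gamma[obs == k].sum(0)
        E /= E.sum(1, keepdims=True)
    return pi, T, E

obs = np.array([0, 1, 0, 2, 1, 0, 0, 2, 1, 1])    # toy symbol sequence
pi, T, E = baum_welch(obs, n_states=3, n_symbols=3)
```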
-
Replying to @dileeplearning @AdamMarblestone and others
Yes, it is like the Borg. Anything that uses SGD is assimilated as Deep Learning. If a race becomes assimilated into the Borg, does that race also become part of the Borg? Of course!!!
-
“You will be assimilated.” Merry Christmas.
-
Replying to @AdamMarblestone @IntuitMachine and others
I don’t mind it at all :) merry Christmas to you too!