Late 2016, deep learning is mature enough that we can start exploring much more ambitious ideas. We have a solid foundation to build upon.
I disagree. I'm going to subscribe to Jordan's view that statistics, not biological plausibility, is what's driving today's models.
Agreed. E.g. streaming normalization, or Hinton's fast weights?
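As a rough illustration of the fast-weights idea mentioned here (Ba, Hinton et al., 2016, "Using Fast Weights to Attend to the Recent Past"): alongside the slow weights, a fast matrix accumulates decaying outer products of recent hidden states and is applied in a short inner loop. Below is a minimal NumPy sketch; all sizes, constants, and the omission of layer normalization are simplifying assumptions, not details from the thread.

```python
import numpy as np

# Minimal sketch of fast weights: in addition to the slow recurrent
# weights W, keep a fast-changing matrix A that is a decaying sum of
# outer products of recent hidden states, acting as short-term memory.
rng = np.random.default_rng(0)
hidden, inputs = 32, 8
W = rng.normal(0, 0.1, (hidden, hidden))   # slow recurrent weights
C = rng.normal(0, 0.1, (hidden, inputs))   # slow input weights
A = np.zeros((hidden, hidden))             # fast weights, start empty
lam, eta = 0.95, 0.5                       # decay rate and fast learning rate

h = np.zeros(hidden)
for x in rng.normal(size=(10, inputs)):    # a toy input sequence
    # fast weights decay and absorb the outer product of the current state
    A = lam * A + eta * np.outer(h, h)
    # preliminary next state from slow weights only
    h_pre = W @ h + C @ x
    # a few inner steps let A "attend to the recent past"
    hs = np.tanh(h_pre)
    for _ in range(3):
        hs = np.tanh(h_pre + A @ hs)
    h = hs

print(h.shape)  # (32,)
```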
We might end up getting a lot of cool stuff (besides a lot of stuff that won't go beyond toy problems), so I look forward to that :)
@fchollet I'm so glad I'm reading this right before the Star Trek end credits :) pic.twitter.com/wcCW8hrnZ9
This Tweet is unavailable.
aw, not necessarily! one can go up instead of down: there's higher-level organization to investigate too :)
And it's gonna look like RNNs inside convolutions, RNNs inside RNNs, RNNs estimating RNN parameters, RNNs that spike, Poisson VAEs.
RNNs that subtly change the training set to fit expectations?
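One concrete reading of "RNNs estimating RNN parameters" is a hypernetwork-style setup (cf. Ha et al., 2016), where a small RNN emits, at each step, the recurrent weight matrix used by a main RNN. A minimal NumPy sketch under that assumption; the shapes and the linear decoding from hyper state to weights are illustrative choices, not from the thread.

```python
import numpy as np

# Toy hypernetwork-style RNN: h_hyper evolves with its own (fixed) slow
# weights, and is decoded at every step into the recurrent weight matrix
# W_main that the main RNN uses for that step.
rng = np.random.default_rng(1)
d_main, d_hyper, d_in = 16, 8, 4

Wh = rng.normal(0, 0.1, (d_hyper, d_hyper))    # hyper-RNN recurrent weights
Uh = rng.normal(0, 0.1, (d_hyper, d_in))       # hyper-RNN input weights
P = rng.normal(0, 0.01, (d_main * d_main, d_hyper))  # hyper state -> main weights
U = rng.normal(0, 0.1, (d_main, d_in))         # main RNN input weights (fixed)

h_hyper = np.zeros(d_hyper)
h_main = np.zeros(d_main)
for x in rng.normal(size=(10, d_in)):          # a toy input sequence
    h_hyper = np.tanh(Wh @ h_hyper + Uh @ x)
    # decode the hyper state into this step's main recurrent matrix
    W_main = (P @ h_hyper).reshape(d_main, d_main)
    h_main = np.tanh(W_main @ h_main + U @ x)

print(h_main.shape)  # (16,)
```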
"Inspired" is the key word. No one is talking about copy+paste brain = Skynet. Lots to learn about organizing principles at different resolutions.