Implementing fully connected nets, convnets, RNNs, backprop and SGD from scratch (using pure Python, numpy, or even JS) and training these models on small datasets is a great way to learn how neural nets work. Invest time to gain valuable intuition before jumping onto frameworks. https://twitter.com/dennybritz/status/961829329985400839
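A minimal sketch of the kind of from-scratch exercise the tweet describes (not from the thread itself): a two-layer fully connected net with hand-written backprop and plain SGD in pure numpy, fit to a toy regression task. The architecture, learning rate, and dataset here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = sin(x) on [-pi, pi].
X = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(X)

# Two-layer MLP: 1 -> 32 -> 1, tanh hidden activation.
W1 = rng.normal(0, 0.5, size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, size=(32, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(2001):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)           # (256, 32)
    pred = h @ W2 + b2                 # (256, 1)
    loss = np.mean((pred - y) ** 2)

    # Backward pass: chain rule, written out by hand.
    d_pred = 2 * (pred - y) / len(X)   # dL/dpred
    dW2 = h.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T
    d_hpre = d_h * (1 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_hpre
    db1 = d_hpre.sum(axis=0)

    # Plain SGD update (full batch here, for simplicity).
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    if step % 500 == 0:
        print(f"step {step}: loss {loss:.4f}")
```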
-
It's a good thing that the next generation is moving up the abstraction stack; telling students to go back to the start is not a good learning strategy, in my opinion.
-
100% agree. So much time is wasted relitigating the earlier levels of abstraction when we could be moving on to bigger and better things.
-
It's fundamental that you implement by hand at least backprop, plus a routine that checks the distance between your analytic gradient and a numerical estimate. And that you solve a real problem! I always point this out every time I talk to someone who is approaching neural nets.
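One common way to implement the check this reply describes (a sketch, not the replier's code): compare the analytic gradient against a central finite-difference estimate and report their relative distance. The `loss_fn` interface and the quadratic-loss example are hypothetical stand-ins.

```python
import numpy as np

def numerical_grad(loss_fn, params, eps=1e-5):
    """Central differences: (f(p + eps) - f(p - eps)) / (2 * eps), per entry."""
    grad = np.zeros_like(params)
    for i in range(params.size):
        orig = params.flat[i]
        params.flat[i] = orig + eps
        plus = loss_fn(params)
        params.flat[i] = orig - eps
        minus = loss_fn(params)
        params.flat[i] = orig  # restore the original value
        grad.flat[i] = (plus - minus) / (2 * eps)
    return grad

# Example: for L(w) = sum(w**2), the analytic gradient is 2*w.
w = np.random.default_rng(1).normal(size=(3, 2))
analytic = 2 * w
numeric = numerical_grad(lambda p: np.sum(p ** 2), w)

# Relative distance; roughly < 1e-6 usually means the backprop code is right.
rel_err = np.linalg.norm(analytic - numeric) / (
    np.linalg.norm(analytic) + np.linalg.norm(numeric))
print(f"relative error: {rel_err:.2e}")
```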