Implementing fully connected nets, convnets, RNNs, backprop, and SGD from scratch (in pure Python, NumPy, or even JS) and training these models on small datasets is a great way to learn how neural nets work. Invest the time to gain valuable intuition before jumping into frameworks. https://twitter.com/dennybritz/status/961829329985400839
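To make the "from scratch" exercise concrete, here is a minimal sketch of the kind of thing being recommended: a two-layer fully connected net with hand-derived backprop and plain full-batch SGD, trained on XOR in pure NumPy. The architecture, learning rate, and step count are illustrative assumptions, not anything prescribed in the tweet.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny dataset: XOR, the classic sanity check for a net with a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units; squared-error loss.
# (Illustrative sizes and hyperparameters, not prescribed by the tweet.)
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for step in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations, shape (4, 8)
    out = sigmoid(h @ W2 + b2)    # predictions, shape (4, 1)

    # Backward pass: the chain rule, applied layer by layer.
    d_out = (out - y) * out * (1 - out)   # dL/dz at the output layer
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1 - h)    # dL/dz at the hidden layer
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0)

    # Plain SGD update (full batch, since the dataset is tiny).
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print(out.round(2))  # should approach [[0], [1], [1], [0]]
```

Working through the backward pass by hand like this is exactly where the intuition comes from: every gradient line is the chain rule made explicit.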
-
I think it's a combination of both.
@DavidDuvenaud and his colleagues taught me how to write neural nets from scratch using Python + autograd, and that gave me an intuition of the internals. Applying it to problems gave me an intuition of what it could do. I needed both!
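For readers unfamiliar with autograd (the HIPS library that David Duvenaud co-authored), here is a minimal sketch of the style this reply describes: you write only the forward pass in NumPy syntax, and autograd's grad derives the backward pass for you. The toy data, architecture, and hyperparameters below are illustrative assumptions, not from the thread.

```python
import autograd.numpy as np            # NumPy API, wrapped for differentiation
import autograd.numpy.random as npr
from autograd import grad

# Toy regression problem (illustrative, not from the thread): fit y = sin(x).
x = np.linspace(-3, 3, 50).reshape(-1, 1)
y = np.sin(x)

npr.seed(0)
params = [npr.randn(1, 16) * 0.5, np.zeros(16),   # hidden layer
          npr.randn(16, 1) * 0.5, np.zeros(1)]    # output layer

def predict(params, inputs):
    W1, b1, W2, b2 = params
    h = np.tanh(np.dot(inputs, W1) + b1)
    return np.dot(h, W2) + b2

def loss(params):
    return np.mean((predict(params, x) - y) ** 2)

# You write only the forward pass; autograd derives the backward pass.
loss_grad = grad(loss)

for step in range(2000):
    grads = loss_grad(params)
    params = [p - 0.1 * g for p, g in zip(params, grads)]

print(loss(params))  # mean squared error should be close to zero
```

Compared with the fully manual NumPy version above, this sits one rung up the abstraction ladder: you still see the model's internals, but the gradient bookkeeping is automated.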
-
Plenty of people are proficient with NNs without ever having implemented their own -- they know how they work but have never dealt with the implementation details. Ten years from now this will be 90% of us, the same way current SWEs know how an OS works but have never built their own toy OS.
-
It's a good thing that the next generation is moving up the abstraction stack; telling students to go back to the start is not a good learning strategy, in my opinion.