Implementing fully connected nets, convnets, RNNs, backprop, and SGD from scratch (using pure Python, numpy, or even JS) and training these models on small datasets is a great way to learn how neural nets work. Invest time to gain valuable intuition before jumping onto frameworks. https://twitter.com/dennybritz/status/961829329985400839
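As a rough sketch of the kind of exercise the tweet describes (not the author's own code; the task, shapes, and learning rate are invented for illustration): a one-hidden-layer net with hand-derived backprop and plain SGD, in pure numpy.

```python
# A minimal from-scratch example in the spirit of the tweet: one hidden layer,
# manual backprop, plain SGD, toy data. All hyperparameters are illustrative.
import numpy as np

rng = np.random.RandomState(0)          # fixed seed: one of the "details that matter"

# Toy binary classification: two Gaussian blobs.
X = np.vstack([rng.randn(50, 2) + 2, rng.randn(50, 2) - 2])
y = np.hstack([np.ones(50), np.zeros(50)]).reshape(-1, 1)

# Parameters of a 2-16-1 network.
W1 = rng.randn(2, 16) * 0.1; b1 = np.zeros(16)
W2 = rng.randn(16, 1) * 0.1; b2 = np.zeros(1)

lr = 0.5
for step in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # sigmoid output

    # Binary cross-entropy loss (mean over the batch).
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

    # Backward pass, derived by hand.
    dlogits = (p - y) / len(X)                     # dL/d(pre-sigmoid logits)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = dlogits @ W2.T
    dpre = dh * (1 - h ** 2)                       # tanh' = 1 - tanh^2
    dW1 = X.T @ dpre; db1 = dpre.sum(0)

    # SGD update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```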
-
Understanding is great, until you hit a bug. Going down to the bottom is a reminder of how many *implementation details* actually matter: random seeds, tie-breaking in max-pool, whether the gradient is correct (a finite-difference eps tolerance that passes for one layer can still fail once layers are chained), and others.
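To make the gradient-check caveat concrete, here is a minimal centered-difference check against a hand-derived gradient; the function, eps, and tolerance are my own illustrative choices, not from the thread.

```python
# Minimal finite-difference gradient check. A tolerance that passes for a
# single layer can fail when layers are chained and errors compound.
import numpy as np

def f(w):
    # A small chained computation standing in for a deep net.
    return np.sum(np.tanh(np.tanh(w) * 3.0) ** 2)

def analytic_grad(w):
    a = np.tanh(w)
    b = np.tanh(3.0 * a)
    # Chain rule through both tanh layers.
    return 2.0 * b * (1 - b ** 2) * 3.0 * (1 - a ** 2)

def numeric_grad(w, eps=1e-5):
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w); d.flat[i] = eps
        g.flat[i] = (f(w + d) - f(w - d)) / (2 * eps)  # centered difference
    return g

w = np.random.RandomState(0).randn(5)
ga, gn = analytic_grad(w), numeric_grad(w)
rel_err = np.abs(ga - gn) / np.maximum(1e-8, np.abs(ga) + np.abs(gn))
print("max relative error:", rel_err.max())  # compare against a tolerance, e.g. 1e-7
```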
-
Examples: SVD is only deterministic up to sign (the singular values are square roots of eigenvalues, but the signs of the singular vectors depend on the implementation), border filling in convolution, bias initialization, numerical stability of running mean/variance (http://www.dtic.mil/dtic/tr/fulltext/u2/a133112.pdf), numerical stability of losses in prob/log space, etc.
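On the running mean/variance point (the linked report is the classic paper on computing sample variance): the naive sum-of-squares formula suffers catastrophic cancellation, while an online Welford-style update stays stable. A sketch with made-up data:

```python
# Naive sum-of-squares variance vs. an online Welford-style update.
# Data with a large offset triggers catastrophic cancellation in the
# naive formula; the values here are invented for illustration.
import numpy as np

x = np.float32(1e6) + np.random.RandomState(0).randn(10000).astype(np.float32)

# Naive: var = E[x^2] - E[x]^2. Both terms are ~1e12 and nearly equal,
# so float32 cancellation destroys the result.
naive = np.mean(x ** 2) - np.mean(x) ** 2

# Welford's online algorithm: one pass, numerically stable.
mean, m2 = np.float32(0.0), np.float32(0.0)
for n, xi in enumerate(x, start=1):
    delta = xi - mean
    mean += delta / n
    m2 += delta * (xi - mean)
welford = m2 / len(x)

print("naive:      ", naive)                        # wildly wrong, often negative
print("welford:    ", welford)                      # close to the true variance (~1)
print("float64 ref:", x.astype(np.float64).var())   # reference answer
```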
-
Would you say it's hands-on learning?
-
Looking at old papers by Hinton and Schmidhuber, I'd disagree that they didn't have good intuitions. They had perfectly amazing intuitions, possibly better ones, because they couldn't as easily make the pixels fly in GANs and had to use their brains for a bit before engineering.
-
I wrote my first neural nets (actually in C) as a student in 2009. It taught me more about C than about NNs. It took me a few more years until I started understanding what neural nets do and what makes them useful. The key was better tools + application to many real-world datasets.
-
I think it's a combination of both.
@DavidDuvenaud and his colleagues taught me how to write neural nets from scratch using Python + autograd, and that gave me an intuition of the internals. Applying it to problems gave me an intuition of what it could do. I needed both!
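For context, the "Python + autograd" approach refers to the HIPS autograd library (https://github.com/HIPS/autograd), which differentiates ordinary numpy code. A rough sketch of that workflow; the toy task, network shape, and learning rate are my own, not from the thread:

```python
# Write the forward pass in (wrapped) numpy; autograd derives the gradients.
import numpy
import autograd.numpy as np   # thinly wrapped numpy that autograd can trace
from autograd import grad

def init_params(rng, sizes):
    return [(0.1 * rng.randn(m, n), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    for W, b in params[:-1]:
        x = np.tanh(np.dot(x, W) + b)
    W, b = params[-1]
    return np.dot(x, W) + b

def loss(params, x, y):
    return np.mean((mlp(params, x) - y) ** 2)

rng = numpy.random.RandomState(0)
params = init_params(rng, [2, 16, 1])
x = rng.randn(64, 2)
y = np.sin(x[:, :1])                      # toy regression target

loss_grad = grad(loss)                    # d(loss)/d(params), built automatically
for step in range(200):
    g = loss_grad(params, x, y)
    params = [(W - 0.1 * gW, b - 0.1 * gb)
              for (W, b), (gW, gb) in zip(params, g)]

print("final loss:", loss(params, x, y))
```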
-
Which framework is that? I want to learn!!
-
Abstraction is absolutely necessary to be able to experiment quickly & to reason about more complex things. Most of mathematics is also about creating abstractions. However, all abstractions are leaky (https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/) and sooner or later they bite you in unmentionable places.
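One concrete way the abstraction leaks, tying back to the "losses in prob/log space" point upthread: the textbook softmax formula overflows for large logits, while the standard max-shift (log-sum-exp) trick computes the same function and stays finite. The logit values below are invented for illustration.

```python
# How the "it's just softmax" abstraction leaks: the textbook formula
# overflows for large logits; shifting by the max gives the same result
# but keeps every exponent <= 0.
import numpy as np

logits = np.array([1000.0, 1001.0, 1002.0])

naive = np.exp(logits) / np.exp(logits).sum()      # exp(1000) overflows -> nan

shifted = logits - logits.max()                    # same softmax, shifted logits
stable = np.exp(shifted) / np.exp(shifted).sum()   # every exp(shifted) is <= 1

print(naive)    # [nan nan nan] (with an overflow RuntimeWarning)
print(stable)   # [0.09003057 0.24472847 0.66524096]
```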