Many people in engineering believe that to understand something, it is necessary and sufficient to have a low-level mathematical description of that thing: that you need to "know the math behind it". In nearly all cases, it is neither sufficient nor at all necessary. Far from it.
(coming from someone who had to implement backprop a lot in the past, first in C, then in Matlab, then in Numpy)
In addition, if you have the right mental model for something, it is generally easy to work out the algorithmic details on your own when you need them, at least down to a level where you can roll out a working implementation (& it becomes trivial if you can just look up details)
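As a concrete illustration of "rolling out a working implementation from the mental model": here is a minimal backprop sketch in NumPy, hand-deriving the chain rule for a tiny two-layer network on a toy regression task. The architecture, data, and hyperparameters are all illustrative choices, not anything canonical.

```python
import numpy as np

# Toy data: 64 samples, 3 features, linear target (illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = (X @ np.array([1.0, -2.0, 0.5]))[:, None]

# Two-layer MLP parameters
W1 = rng.normal(scale=0.1, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(500):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    loss = np.mean((pred - y) ** 2)

    # backward pass: just the chain rule, layer by layer
    d_pred = 2 * (pred - y) / len(X)   # dL/dpred
    dW2 = h.T @ d_pred; db2 = d_pred.sum(0)
    d_h = d_pred @ W2.T
    d_pre = d_h * (1 - h ** 2)         # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_pre; db1 = d_pre.sum(0)

    # gradient step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Everything above falls out of knowing that backprop is "apply the chain rule backwards through the forward pass"; the details are recoverable from that one idea.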
Similar to how, say, you can always reinvent the Pythagorean theorem on the fly if you think about geometry through the lens of vector products, or how you don't need to memorize the quadratic formula if you understand what an equation is and the general process for solving them
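Both reinventions mentioned here are a couple of lines each. A sketch, using standard inner-product identities and completing the square:

```latex
% Right triangle: legs a, b as orthogonal vectors (a \cdot b = 0),
% hypotenuse a + b. Then:
\|a+b\|^2 = (a+b)\cdot(a+b) = \|a\|^2 + 2\,a\cdot b + \|b\|^2
          = \|a\|^2 + \|b\|^2.

% Likewise the quadratic formula falls out of completing the square:
ax^2 + bx + c = 0
\;\Rightarrow\; \left(x + \tfrac{b}{2a}\right)^2 = \tfrac{b^2 - 4ac}{4a^2}
\;\Rightarrow\; x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.
```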
Wouldn't it be useful to understand it if you wanted to use deep learning to make a GAN, but you can't use backpropagation because you're dealing with non-differentiable things, or something like a spiking network?
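(For context on the question: when the objective is non-differentiable, one standard gradient-free route, not mentioned in the thread itself, is a black-box method such as a simple evolution strategy. A hedged sketch, with an illustrative piecewise-constant objective on which backprop would be useless:)

```python
import numpy as np

rng = np.random.default_rng(1)
target = np.array([1.0, -2.0, 3.0])

def loss(theta):
    # piecewise-constant objective: its gradient is zero almost everywhere,
    # so backprop gives no signal here
    return np.sum(np.abs(np.round(theta) - target))

theta = np.zeros(3)
sigma, lr, n_pop = 0.5, 0.3, 50   # illustrative hyperparameters

for _ in range(300):
    # evaluate the objective at randomly perturbed parameter vectors
    noise = rng.normal(size=(n_pop, 3))
    rewards = np.array([-loss(theta + sigma * e) for e in noise])
    # normalize rewards, then weight each noise direction by its reward:
    # this estimates a gradient of the Gaussian-smoothed objective
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
    theta += lr / (n_pop * sigma) * (noise.T @ rewards)
```

The point is that optimization without backprop is possible, but the mental model of "follow a (smoothed) gradient signal" still does the heavy lifting.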
True. But coding up backprop does give you insight into how it's supposed to work. That knowledge can be useful in unexpected ways as you work your way through the mental models of how deep learning works.
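(One concrete example of the kind of insight that comes from coding backprop yourself: checking a hand-derived gradient against a finite-difference estimate. The function and shapes below are illustrative.)

```python
import numpy as np

def f(w, x):
    return np.tanh(x @ w).sum()

def grad_f(w, x):
    # hand-derived: d/dw sum(tanh(x @ w)) = x.T @ (1 - tanh(x @ w)^2)
    return x.T @ (1 - np.tanh(x @ w) ** 2)

rng = np.random.default_rng(0)
w = rng.normal(size=4)
x = rng.normal(size=(10, 4))

# central finite differences, one coordinate at a time
eps = 1e-6
numeric = np.array([
    (f(w + eps * np.eye(4)[i], x) - f(w - eps * np.eye(4)[i], x)) / (2 * eps)
    for i in range(4)
])
# numeric and grad_f(w, x) should agree to many decimal places
```

Gradient checking like this is exactly the habit you pick up from having implemented backprop by hand at least once.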