
alright just for fun: AMA but only about math, will attempt to speed-explain stuff with as few symbols and equations as possible and see what happens (esp happy to field questions about stuff that seems basic to you and that you feel like you should've gotten a long time ago!)
This Tweet was deleted by the Tweet author.
there's a half-joke i wish i had a canonical version to quote that is like "mathematicians only understand linear algebra, and they do everything else by reducing it to linear algebra" and a lot of it is that sort of thing i think
the nice thing about neural networks, as i understand it, is that 1) they're defined by a bunch of continuous parameters in a reasonably straightforward way, 2) it's reasonably straightforward to compute the gradient of the loss with respect to those parameters, and 3) therefore we can do gradient descent to optimize them
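to make points 1-3 concrete, here's a minimal sketch (mine, not from the thread): a two-parameter "network", its gradient via the chain rule, and plain gradient descent. the specific model, target function, and learning rate are all illustrative choices.

```python
import math

# a tiny "network" with two continuous parameters: y_hat = w2 * tanh(w1 * x)
def predict(w1, w2, x):
    return w2 * math.tanh(w1 * x)

# gradient of the squared-error loss (pred - y)^2, worked out by the chain rule
def grads(w1, w2, x, y):
    h = math.tanh(w1 * x)
    err = predict(w1, w2, x) - y
    dw2 = 2 * err * h                     # loss -> prediction -> w2
    dw1 = 2 * err * w2 * (1 - h * h) * x  # loss -> prediction -> tanh -> w1
    return dw1, dw2

# samples of a target the model can represent exactly: y = 0.5 * tanh(2x)
data = [(x / 10, 0.5 * math.tanh(2 * x / 10)) for x in range(-10, 11)]

# gradient descent: repeatedly nudge each parameter against its gradient
w1, w2, lr = 1.0, 1.0, 0.1
for _ in range(2000):
    for x, y in data:
        dw1, dw2 = grads(w1, w2, x, y)
        w1 -= lr * dw1
        w2 -= lr * dw2
```

real networks have millions of parameters and use automatic differentiation instead of hand-derived gradients, but the loop is the same shape: evaluate, differentiate, step.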
the rest of this is irresponsible speculation about something i am not at all an expert in: i think it's mostly an empirical fact lacking strong theoretical justification at this point that neural networks work as well as they do? unclear