Deep Learning is not just multilayer neural networks; it refers to compositional function approximation in general. Chains of normalized weighted sums happen to be surprisingly useful and easy to work with, but ultimately we want to learn ANY kind of efficient algorithm.
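(A minimal sketch of what "chains of normalized weighted sums" means concretely: each layer computes a weighted sum of its inputs and squashes it, and the network is just a composition of such layers. The function names, layer sizes, and the choice of tanh as the normalizer are all hypothetical, not from the thread.)

```python
import numpy as np

def layer(x, W, b):
    """One normalized weighted sum: an affine map squashed into [-1, 1] by tanh."""
    return np.tanh(W @ x + b)

def mlp(x, params):
    """Compose layers: f = f_L ∘ ... ∘ f_1, a chain of normalized weighted sums."""
    for W, b in params:
        x = layer(x, W, b)
    return x

# Hypothetical toy chain: 2 -> 8 -> 8 -> 1, with random weights for illustration.
rng = np.random.default_rng(0)
params = [
    (rng.standard_normal((8, 2)), np.zeros(8)),
    (rng.standard_normal((8, 8)), np.zeros(8)),
    (rng.standard_normal((1, 8)), np.zeros(1)),
]
print(mlp(np.array([0.5, -0.3]), params))
```

The point of the tweet is that this particular composition (affine map plus squashing) is just one convenient family; "deep learning" in the broader sense is about learning any efficiently composable function.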
Replying to @Plinz
All models are wrong. Some models are useful. Gluing together just the useful bits of a bunch of models produces a Franken-model that is less wrong. Welcome to the age of evolved splines.
Replying to @arigesher
I don't yet understand why all models are wrong. Models of mathematics are surprisingly often correct, and every implementation has a ground truth to which a model can, in principle, be isomorphic.
Replying to @Plinz
Math is a purely abstract domain: its models are the truth. (They probably shouldn't be called models.) For everything else, models must fall short, or they would be the thing itself. A model is, by definition, an incomplete representation of some larger phenomenon.
Hm. I understand a model as a function formed by an observer to predict the behavior of a domain. In the narrow sense, models describe invariances. Understood like this, models of mathematics and physics are both models, and both can be exact or approximate.