Usual reminder: when I've been saying for the past 5+ years that deep learning is interpolative, I don't mean it does linear interpolation in the original encoding space (which would be useless). It does interpolation on a low-dimensional manifold embedded in the encoding space.
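A minimal numerical sketch of that distinction, under toy assumptions: the "manifold" here is just a unit circle pushed into 100 dimensions by a hypothetical random linear map `A`, and `embed` / `off_manifold_error` are illustrative names, not anything from the thread.

```python
import numpy as np

# Toy illustration of "interpolation on a low-dimensional manifold":
# a 1-D curve (a circle) embedded in a 100-D encoding space.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 2))  # random embedding of the circle's plane

def embed(theta):
    """Map a manifold coordinate theta to the 100-D encoding space."""
    return A @ np.array([np.cos(theta), np.sin(theta)])

p, q = embed(0.0), embed(np.pi / 2)

# Linear interpolation in the raw encoding space falls off the manifold:
mid_ambient = 0.5 * (p + q)

# Interpolation along the manifold coordinate stays on it:
mid_manifold = embed(np.pi / 4)

def off_manifold_error(x):
    """Distance from the unit circle after projecting back to its 2-D plane."""
    c, *_ = np.linalg.lstsq(A, x, rcond=None)
    return abs(np.linalg.norm(c) - 1.0)

print(off_manifold_error(mid_ambient))   # ~0.29: straight-line midpoint cuts inside the circle
print(off_manifold_error(mid_manifold))  # ~0.00: manifold midpoint stays on the circle
```

The straight-line midpoint between two circle points lands strictly inside the circle (useless, per the tweet), while interpolating the manifold coordinate itself stays on the curve.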
-
-
Not really. There are different settings, of course. One cool setting is these neural differential-equation solvers (three-body problem, etc.). I think they only solve problems with a local condition number <= that in the training data set.
-
For example, if those Navier-Stokes or three-body interpolators could guarantee error bounds in regions where the local condition numbers are greater than any they ever saw, that would be huge. I would definitely call that generalization.
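One way to make "local condition number" concrete is the spectral norm of the system's Jacobian at a given state: how strongly a small state error gets amplified. A hedged sketch, assuming a hypothetical inverse-square two-body interaction as a stand-in for the right-hand side of such a solver (`local_sensitivity` and `pairwise_force` are illustrative names, not an actual three-body integrator):

```python
import numpy as np

def local_sensitivity(f, x, eps=1e-6):
    """Spectral norm of the Jacobian of f at x, estimated by finite
    differences: how strongly a small state perturbation is amplified.
    Used here as a simple proxy for the 'local condition number'."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    J = np.empty((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return np.linalg.norm(J, 2)

# Hypothetical stand-in for a solver's right-hand side: an inverse-square
# interaction between two point bodies (not a real three-body system).
def pairwise_force(r):
    d = r[3:] - r[:3]                  # separation vector between the bodies
    return d / np.linalg.norm(d) ** 3  # inverse-square interaction term

well_separated = np.array([0., 0., 0., 10., 0., 0.])  # easy regime
near_collision = np.array([0., 0., 0., 0.1, 0., 0.])  # ill-conditioned regime

print(local_sensitivity(pairwise_force, well_separated))  # tiny: errors stay damped
print(local_sensitivity(pairwise_force, near_collision))  # huge: errors blow up
```

The near-collision state is orders of magnitude worse conditioned than the well-separated one, which is exactly the regime where a learned solver trained only on benign states would have nothing comparable to interpolate from.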