A model compresses a state space by capturing a set of invariances that predict the variance among the states. Its free parameters define the model's latent space, which should ideally correspond exactly to the variability: the non-invariant (i.e., unexplained) remainder of the state.
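A minimal sketch of this idea, using PCA as one concrete (assumed, not from the original note) instance: the principal direction plays the role of the model's free parameter / latent space, and the reconstruction residual is the unexplained remainder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "states" that vary mostly along one hidden direction, plus small noise.
direction = np.array([3.0, 1.0]) / np.sqrt(10.0)
states = rng.normal(size=(500, 1)) * direction + 0.05 * rng.normal(size=(500, 2))

# Fit a 1-D latent space via SVD (classic PCA).
centered = states - states.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
latent_axis = vt[0]                    # the model's free parameter: a single direction
codes = centered @ latent_axis         # compressed (latent) representation of each state
reconstruction = np.outer(codes, latent_axis)
residual = centered - reconstruction   # the non-invariant, unexplained remainder

explained = 1.0 - residual.var() / centered.var()
print(f"variance explained by the 1-D latent space: {explained:.3f}")
```

If the latent space is well chosen, almost all of the variance across states is captured and the residual is nearly pure noise.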
It seems that universal computation and Bayes are at the root of modeling and learning.
-
-
OK, the layers in learning/modeling are: computation + Bayes <--> languages that can be used to reason <--> reasoning <--> perceptual models. Coherence among these layers means you have a proper model?
-
Understanding means we think we have found a mapping to something we think we can already compute. (We rarely prove any of that to ourselves.)
End of conversation
New conversation -
-
-
P.S. If you ever teach an online course on the nature of things, I'm super into buying it!