A model compresses a state space by capturing a set of invariances that predict the variance in the states. Its free parameters define the latent space of the model and should ideally correspond exactly to the variability: the non-invariant (i.e. unexplained) remainder of the state.
Most of our models are perceptual. Proofs in perceptual models are conducted by establishing global coherence among all perceived features: the states of the free parameters are propagated along their relationship functions to each other until a stable configuration is reached.
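That relaxation idea can be sketched in a few lines (my own toy illustration, not code from the thread): each free parameter is repeatedly rewritten by its relationship functions to the other parameters, and the "proof" succeeds when the configuration stops changing.

```python
# Toy sketch: perceptual "proof" as relaxation to a stable configuration.
# Each parameter is recomputed from the others via its relationship
# function until no parameter changes any more (global coherence).

def relax(params, relations, max_iters=100, tol=1e-9):
    """Propagate parameter states along their relations to a fixed point."""
    for _ in range(max_iters):
        new = {k: f(params) for k, f in relations.items()}
        if all(abs(new[k] - params[k]) < tol for k in params):
            return new  # stable configuration reached
        params = new
    return params

# Two "features" constrained to agree on one observed quantity:
# x should equal (y + observation) / 2, and y should equal x.
obs = 4.0
relations = {
    "x": lambda p: (p["y"] + obs) / 2,
    "y": lambda p: p["x"],
}
state = relax({"x": 0.0, "y": 0.0}, relations)
# both parameters settle near 4.0, the coherent interpretation of obs
```

The fixed point here (x = y = 4) is the unique configuration in which every relation is simultaneously satisfied, which is what "global coherence" means in this sketch.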
Reasoning is not just an alternative to perception. It is a toolbox of algorithms used to repair perceptual models that don't achieve full coherence.
New conversation
Ok, I'm not smart enough for this thread, but I want to understand! Could you give an example? Sorry if this is a dumb request.
Discovered invariance in the data = the structure of the model (a set of variables with value ranges and a set of computational relationships between them).
Discovered variance in the data = the set of values of those variables that will explain most of the observations I am making.
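A minimal concrete instance of that split (my own example, not from the thread): for points generated by a line, the discovered invariance is the model structure y = a*x + b, and the discovered variance is the fitted values of a and b that explain the observations.

```python
# Invariance = the structure "y = a*x + b" (variables and their
# computational relationship); variance = the fitted values of a and b.

def fit_line(xs, ys):
    """Least-squares estimate of slope a and intercept b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b  # the values that explain the observations

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]        # observations generated by y = 2x + 1
a, b = fit_line(xs, ys)      # recovers a = 2, b = 1
```

The structure is fixed in advance (the invariance); only the two free parameters vary across data sets (the variance).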
New conversation
2) Here's a probably-wrong paraphrase of the above: models are proper (truer?) when they reason about the invariances they use. This reasoning is itself a model, and so also depends on 'reasons', which are "the foundations of its semantics". Um, but what are those?
It seems that universal computation and Bayes are at the root of modeling and learning.
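As a toy illustration of the Bayesian half of that claim (my example, with hypothetical numbers): learning as belief updating reduces to one formula, applied over and over as evidence arrives.

```python
# Bayes' rule: P(H|E) = P(E|H) P(H) / (P(E|H) P(H) + P(E|~H) P(~H)).
# The numbers below are made up for illustration.

def bayes_update(prior, likelihood, likelihood_not):
    """Posterior belief in hypothesis H after observing evidence E."""
    num = likelihood * prior
    return num / (num + likelihood_not * (1 - prior))

# Start agnostic (prior 0.5); E is 3x likelier under H than under ~H.
posterior = bayes_update(prior=0.5, likelihood=0.9, likelihood_not=0.3)
# 0.45 / (0.45 + 0.15) = 0.75
```

Feeding each posterior back in as the next prior is the basic loop by which a Bayesian model learns from a stream of observations.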
New conversation
Ok, now I am stuck here! (No pressure to respond, I don't mean to use up all your valuable thought cycles!) I think I have two issues. 1) What is an invariance? (Is it a mathematical description of a state space that stays the same for all possible input patterns?)
#naivequestion
An invariance is an aspect of the data that does not change. For instance, all even numbers have in common that their lowest bit is 0. All Russian words have in common that none contains an h. All cells contain carbon.
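The first two examples can be written directly as predicates that hold over every member of the data set (the sample Russian words are my own, standing in for the whole language):

```python
# An invariance as a predicate that is true of every item in the data.

def lowest_bit_zero(n):
    return n & 1 == 0

evens = [0, 2, 4, 100, 2**31]
assert all(lowest_bit_zero(n) for n in evens)   # holds for every even number

# Russian is written in Cyrillic, so no word contains the Latin letter "h".
russian_words = ["собака", "кошка", "хлеб"]     # sample words only
assert all("h" not in w for w in russian_words)
```

A model that has discovered such an invariance no longer needs to store that aspect of the data; only the parts that still vary remain to be encoded.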
New conversation
(like how abstract terms such as 'state space' or 'latent space' would apply to a particular model)