Equilibrium is a qualitative interpretation that adds relationships on top of the physical events. Without that interpretation, the physical events would be only unrelated facts rather than a problem in search of solutions.
The interactions between physical events obey laws of nature that we try to describe in abstract causal models. Why and how a system with such a property emerged is an important question for discovering the key principles needed to understand and build artificial intelligence.
We say intelligence is the capacity to solve problems in many environments. Phrasing it as the mechanism of a computing system's equilibrium implies, e.g., that it isn't an optimization task (as in DL today). Which equilibrium? How? Are the current tasks in this theory of intelligence?
Replying to @Abel_TorresM @Plinz
I would go further and say that even the capacity to qualify an event as a 'problem', and to attempt to cause events that qualify as 'solving' it, requires intelligence.
Right. It's just that animals are capable of solving problems without conceptualizing them, so the abstraction required for reasoning does not depend on symbolic language, even though it benefits a lot from it.
Replying to @Abel_TorresM @S33light
Not sure about that. Planning requires discretization of sequences of events, objects and scenes in ways that are essentially conceptual (just not linguistic).
Correct. I abused 'conceptualization' to mean abstractions we can describe with words. Animals can manipulate abstract representations to solve problems. Moreover, our linguistic constructs are ultimately converted into those to acquire meaning (the reason pure NLP is empty).
Replying to @Abel_TorresM @Plinz
What I'm looking at I think is even more primitive than abstraction. Prior to representation, there must be a presentation of perceptual content such that direct intervention is experienced as possible and desirable. Intelligence as a refinement of will.
Replying to @S33light @Abel_TorresM
No, there is no "presentation". Before representation, there are only patterns. Representation creates the structure, as a way to explain the patterns.
Replying to @Plinz @Abel_TorresM
Patterns are the presentation. Before I can say that a red octagon is a 'sign' that represents the obligation to STOP, I have to be presented with a red octagon. It need not even be considered a pattern; it is a directly visible presence.
Again, no. The octagon requires establishing shapes over gestalts over edges over adjacency relations. Each layer has to be constructed in a learning process and is not immediately given.
Replying to @Plinz @Abel_TorresM
The gestalts have to be presented first before anything can be learned about them. The gestalt is visible and it need not represent anything at all.
Replying to @S33light @Abel_TorresM
From the perspective of your brain, the patterns on the retina are not even ordered before it learns co-occurrence statistics. After learning a lot of those, it can get a 2D map, but not a gestalt yet. Try and code it, you'll see.
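The "try and code it" claim above can be sketched. The following is my own minimal construction, not code from the thread: a tiny simulated "retina" whose pixel ordering is scrambled (the brain doesn't know it), shown many randomly placed blobs. From pairwise activity correlations alone, classical MDS recovers a 2-D map of the sensor, up to rotation and reflection; no gestalt, just topology from co-occurrence statistics. All names here (`side`, `emb`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
side = 8
n = side * side
coords = np.array([(i, j) for i in range(side) for j in range(side)], float)

# The sensor's true layout is hidden behind a random permutation.
perm = rng.permutation(n)

# Stimuli: Gaussian blobs at random positions on the grid.
samples = []
for _ in range(3000):
    c = rng.uniform(0, side - 1, size=2)
    d2 = ((coords - c) ** 2).sum(axis=1)
    samples.append(np.exp(-d2 / 4.0))
X = np.array(samples)[:, perm]          # activity with scrambled pixel order

# Co-occurrence statistics: correlated pixels are treated as "close".
corr = np.corrcoef(X.T)
dist = np.sqrt(np.maximum(0.0, 2.0 * (1.0 - corr)))

# Classical MDS: double-center the squared dissimilarities, take the
# top-2 eigenvectors as a 2-D embedding.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (dist ** 2) @ J
vals, vecs = np.linalg.eigh(B)
emb = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))

# Sanity check: embedding distances should track true pixel distances.
true_d = np.linalg.norm(coords[perm][:, None] - coords[perm][None, :], axis=2)
emb_d = np.linalg.norm(emb[:, None] - emb[None, :], axis=2)
iu = np.triu_indices(n, 1)
r = np.corrcoef(true_d[iu], emb_d[iu])[0, 1]
print("layout recovery correlation:", round(float(r), 2))
```

The point the tweet makes survives in the sketch: what comes out is only a spatial ordering of sensors, learned from statistics; nothing in it segments scenes, binds edges into shapes, or delivers a gestalt.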