Intelligence emerges over regulation problems. General intelligence may emerge over regulation problems that are so hard that they require modeling the general conditions of language and existence.
Replying to @Plinz
It seems to me that 'problems' emerge from intelligence. Without intelligence, I don't think there could be any problems, just events that result in outcomes that are unevaluated.
Replying to @S33light
Yes, without intelligence you will die before you encounter interesting problems. Intelligence is a toolset that carries you into a zone in which you should be dead. If it is not yours, it is someone else’s.
Replying to @Plinz
What could cause an event to become a problem though except some degree of intelligence?
A problem is the lack of equilibrium in a system. The equilibrium could be found through random fluctuations (e.g. evolution). If the system can capture and process information, it can compute ways to achieve equilibrium: that's intelligence. General intelligence is just intelligence.
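The contrast drawn here can be sketched in code. This is a toy illustration, not anyone's actual model: the set-point, step sizes, and function names are all invented for the example. Both strategies reach equilibrium; the difference is that one relies on blind variation with selective retention, while the other uses captured information (the measured error) to compute its move.

```python
import random

TARGET = 20.0  # desired equilibrium, e.g. a set-point temperature (illustrative)

def random_search(state, steps=1000):
    """Evolution-style search: random fluctuations, keeping whatever
    happens to reduce the distance to equilibrium."""
    for _ in range(steps):
        candidate = state + random.uniform(-1.0, 1.0)
        if abs(candidate - TARGET) < abs(state - TARGET):
            state = candidate  # blind variation, selective retention
    return state

def regulate(state, steps=50, gain=0.5):
    """Information-using regulation: measure the disequilibrium and
    compute a correction from it, as a thermostat does."""
    for _ in range(steps):
        error = TARGET - state  # captured information about the problem
        state += gain * error   # computed step toward equilibrium
    return state
```

Both converge, but `regulate` gets there in a handful of steps because it processes information about the system's state, whereas `random_search` needs many trials — one way to read the tweet's distinction between evolution and intelligence.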
Replying to @Abel_TorresM @Plinz
Equilibrium is a qualitative interpretation that adds relationships on top of the physical events. The physical events would only be unrelated facts rather than a problem in search of solutions.
The interactions between physical events obey laws of nature that we try to describe in abstract causal models. Why and how a system with such a property emerged is an important question for discovering the key principles needed to understand and build artificial intelligence.
We say intelligence is the capacity to solve problems in many environments. Phrasing it as the mechanism for computing a system's equilibrium implies, e.g., that it isn't an optimization task (as in DL today). Which equilibrium? How? Are the current tasks within this theory of intelligence?
Replying to @Abel_TorresM @Plinz
I would go further and say that even the capacity to qualify an event as a 'problem' and to be able to attempt to cause events that are qualified as 'solving' it requires intelligence.
Right. It's just that animals are capable of solving problems without conceptualizing them; so the abstraction required for reasoning does not depend on symbolic language, even though it benefits a lot from it.
Not sure about that. Planning requires discretization of sequences of events, objects and scenes in ways that are essentially conceptual (just not linguistic).
Correct. I abused 'conceptualization' to mean abstractions we can describe with words. Animals can manipulate abstract representations to solve problems. Moreover, our linguistic constructs are ultimately converted into those representations to acquire meaning (the reason pure NLP is empty).
Replying to @Abel_TorresM @Plinz
What I'm looking at I think is even more primitive than abstraction. Prior to representation, there must be a presentation of perceptual content such that direct intervention is experienced as possible and desirable. Intelligence as a refinement of will.