Symbolic and connectionist AI don't really refer to different ways of making computers compute, but to different political camps among people who don't know how to build AI yet.
Replying to @Plinz
Do you expect a supercategory that deflates the soi-disant differences between the two, or a more orthogonal rejection?
Replying to @jpt401
Intelligence is the creation of models, that is, function approximation (often in the service of some regulation). Some functions are discrete and low dimensional enough to yield to symbolic operators, and many are not, and we have not discovered all the right approximators yet.
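As a parenthetical illustration of the two regimes described above (a toy Python sketch; the XOR rule, the random 10-dimensional target, and the tanh-feature least-squares fit are hypothetical stand-ins, not anything specified in the thread):

    import numpy as np

    # Discrete and low-dimensional: XOR yields to an exact symbolic operator.
    def xor_symbolic(a: int, b: int) -> int:
        return (a + b) % 2  # a closed-form rule; nothing is approximated

    # Continuous and higher-dimensional: fall back on a generic approximator.
    # Random tanh features plus a least-squares readout stand in for
    # "some approximator"; the target function is an arbitrary placeholder.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(500, 10))        # 10-dimensional inputs
    y = np.sin(X @ rng.normal(size=10)) + 0.1 * rng.normal(size=500)

    W = rng.normal(size=(10, 200))                    # random projection
    Phi = np.tanh(X @ W)                              # nonlinear features
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # fitted linear readout

    print("XOR(1, 0) =", xor_symbolic(1, 0))
    print("approximator train MSE:", np.mean((Phi @ coef - y) ** 2))

The first function collapses to an exact symbolic rule; the second is only reachable in practice through some generic approximator whose form is chosen by convention rather than derived.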
I don't think that the search for function approximators should be done manually, it is itself a problem that yields to automatic function approximation. Eventually, AI might be the result of the automation of the search for meta-learners (which for our own minds was evolution).
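A deliberately crude sketch of that automation, assuming nothing beyond the tweet itself: an outer loop that searches over learners (here a one-parameter family of ridge regressors scored across random tasks) rather than over a single function. Blind random search stands in for whatever meta-learner would do this properly.

    import numpy as np

    rng = np.random.default_rng(1)

    def make_task():
        """A random regression task: one 'world' the learner has to model."""
        w = rng.normal(size=5)
        X = rng.normal(size=(100, 5))
        y = X @ w + 0.1 * rng.normal(size=100)
        return X, y

    def make_learner(ridge):
        """A one-parameter family of learners (ridge-regularised least squares)."""
        def fit_predict(X, y, X_test):
            A = X.T @ X + ridge * np.eye(X.shape[1])
            coef = np.linalg.solve(A, X.T @ y)
            return X_test @ coef
        return fit_predict

    # Outer loop: search over learners, scoring each by how well it models a
    # distribution of tasks -- a toy stand-in for automating the search for
    # function approximators.
    best_score, best_ridge = np.inf, None
    for ridge in 10.0 ** rng.uniform(-4, 2, size=20):    # blind random search
        learner = make_learner(ridge)
        errs = []
        for _ in range(10):
            X, y = make_task()
            pred = learner(X[:80], y[:80], X[80:])
            errs.append(np.mean((pred - y[80:]) ** 2))
        if np.mean(errs) < best_score:
            best_score, best_ridge = np.mean(errs), ridge

    print("learner selected by the outer search: ridge =", best_ridge)

Replacing the blind outer search with something that itself learns is what pushes the recursion one level further, toward automating the search for meta-learners.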
Replying to @Plinz
Very well, then have you any opinion on which current techniques are patrilineal to the future lowest-rung, manually constructed seed approximator? AIXI / Gödel Machine, perhaps?
Replying to @jpt401
No, AIXI is by itself useless as an implementation spec, because it does not tell you how to efficiently converge on efficiently computable functions.
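For reference, Hutter's AIXI action rule in its usual textbook form (my transcription; no formula appears in the thread):

    $$a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big(r_k + \cdots + r_m\big) \sum_{q\,:\,U(q,\,a_{1:m}) = o_{1:m} r_{1:m}} 2^{-\ell(q)}$$

Here U is a universal Turing machine, q ranges over every program whose outputs are consistent with the interaction history, ℓ(q) is program length, and m is the horizon. Both the expectimax and the sum over all programs are incomputable, which is the sense in which the definition specifies what to approximate but not how to converge on efficiently computable models.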
Replying to @Plinz
Agreed, I reference it as characteristic of a style of approach to AGI, a concrete example being the compression algorithm challenges.
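(For context, standard information theory rather than anything claimed in the thread: an arithmetic coder driven by a predictive model $P$ encodes data $x$ in roughly $-\log_2 P(x)$ bits, so shrinking the compressed size of a corpus is, up to the cost of describing the model itself, the same objective as predicting the corpus well. That equivalence is why compression contests get used as a proxy for progress on modelling.)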
Does the same criticism apply to Goedel Machines, or Schmidhuber's curiosity metric?
No, the Gödel machine seems to be correct. We just need to find the starting state.
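As context for the curiosity metric raised above, a toy rendering of Schmidhuber's compression-progress idea: the intrinsic reward at each step is the drop in coding cost of the history after the model updates. The Gaussian predictor, the data stream, and the bookkeeping below are my own minimal assumptions, not anything from the thread.

    import numpy as np

    def coding_cost(history, mean, var=1.0):
        """Bits to encode `history` under a fixed-variance Gaussian predictor
        centred at `mean` (negative log2-likelihood)."""
        h = np.asarray(history, dtype=float)
        return np.sum(0.5 * np.log2(2 * np.pi * var)
                      + (h - mean) ** 2 / (2 * var * np.log(2)))

    rng = np.random.default_rng(2)
    stream = rng.normal(loc=3.0, scale=1.0, size=50)   # observations from the world

    history, mean = [], 0.0
    for x in stream:
        history.append(x)
        old_cost = coding_cost(history, mean)          # cost under the old model
        mean = float(np.mean(history))                 # learning step: refit the model
        new_cost = coding_cost(history, mean)          # cost under the improved model
        curiosity = old_cost - new_cost                # compression progress >= 0
        # Early on the model improves a lot (large reward); once the stream is
        # well modelled, progress -- and with it curiosity -- decays toward zero.

    print("final-step curiosity reward:", curiosity)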