And I would argue most of the brain is devoted to this kind of adaptive control of behavior. And it would be super cool to build AI like that - as artificial agents...
-
Replying to @WiringTheBrain @TonyZador
But is that what AI folks want? Or are they after the ability to perform abstract reasoning and other really intellectual operations better than humans can?
2 replies 0 retweets 2 likes -
Replying to @WiringTheBrain @ylecun
Indeed, what AI folk want is abstract reasoning. They tried jumping straight to it under the Minsky Symbolic AI program, which failed. Maybe we can jump straight to human cognition with ML, but I'm skeptical. I think we need to pass through mouse intelligence first; from there it's a short jump to human.
5 replies 2 retweets 9 likes -
Replying to @TonyZador @WiringTheBrain
a. It's premature to say "symbolic AI" failed (it used < 0.01% of current compute), when deep learning could have been dismissed in the same way in 2009. b. It ignores hybrid neurosymbolic models. c. It's not that short a leap, inasmuch as the vast majority of mammals did not evolve language.
2 replies 1 retweet 3 likes -
Replying to @GaryMarcus @WiringTheBrain
Yes, most animals didn't evolve language. But language evolved in a blink, ~200 kyr (or ~1 Myr, if you think Neanderthals and Denisovans had language). And population sizes were small and generations long, which suggests it's a very easy evolutionary step for apes to get language.
1 reply 0 retweets 4 likes -
Replying to @TonyZador @WiringTheBrain
- It's not that easy, or more primates would be talking; the adaptive advantage is likely huge - but yes, the only way to evolve language quickly is to already have a genome packed chock-full of innate tools that are good for cognition. That was the key point of The Birth of the Mind.
2 replies 0 retweets 2 likes -
Replying to @GaryMarcus @TonyZador
So, then what are those elements of cognition that brains have that AI doesn't? Is it the architecture for predictive processing? Is it that they incorporate value and meaning through experience?
2 replies 0 retweets 1 like -
Replying to @WiringTheBrain @TonyZador
Again, my opening bid is the list of 10 things here: https://arxiv.org/abs/1801.05667
1 reply 0 retweets 2 likes -
Replying to @GaryMarcus @TonyZador
Okay, these all seem useful: pic.twitter.com/KSTcTnLpCo
3 replies 1 retweet 5 likes -
Replying to @WiringTheBrain @GaryMarcus
Side note: "translational invariance" is a fallacy brought about by a fundamental misunderstanding of attentional processes and context handling.
1 reply 0 retweets 1 like
Hunh? Translational invariance is routinely implemented, naturally and straightforwardly, by convolution, based on good empirical evidence initially gathered by @ylecun.
There may, of course, be other ways to do it.
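[For readers outside the field, the technical point here is that a convolutional layer applies the same filter at every position, so a shifted input produces a correspondingly shifted feature map (translation equivariance), and a subsequent pooling step collapses position to give approximate translation invariance. Below is a minimal NumPy sketch of that idea; it is not from the thread, and the 1-D signal, kernel values, and helper conv1d are made up purely for illustration.]

```python
# Sketch (illustrative only): translation equivariance from convolution,
# translation invariance after a global max-pool.
import numpy as np

def conv1d(signal, kernel):
    """'Valid' 1-D cross-correlation: slide the same kernel over every position."""
    n = len(signal) - len(kernel) + 1
    return np.array([np.dot(signal[i:i + len(kernel)], kernel) for i in range(n)])

kernel = np.array([1.0, -1.0, 1.0])          # an arbitrary "feature detector"
x = np.zeros(12)
x[3:6] = [2, 5, 2]                           # a bump at position 3
x_shifted = np.roll(x, 4)                    # the same bump shifted right by 4

y, y_shifted = conv1d(x, kernel), conv1d(x_shifted, kernel)

# Equivariance: the feature map shifts along with the input.
print(np.argmax(y), np.argmax(y_shifted))    # peak positions differ by exactly 4

# Invariance: a global max-pool throws away position, so the pooled
# response is identical for the original and the shifted input.
print(y.max(), y_shifted.max())              # same value either way
```

[Running this prints the same peak value for both inputs but peak positions that differ by the shift: the convolution alone is only equivariant, and invariance appears once pooling discards position, which is the sense in which ConvNets are said to "implement" translational invariance.]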
-
Replying to @GaryMarcus @WiringTheBrain
That it works in implementations with near-zero resemblance to natural neural networks isn't an argument in its favor. Given the constraints of biological evolution and development, I believe the empirical evidence has been misread.
0 replies 0 retweets 0 likes