Sure, but you do not understand Riemann sums innately - you learned about them. Humans clearly have some innate ability to learn about abstract systems that other species may not, but it is the ability to learn that is innate, not the concepts and relations themselves.
Replying to @tyrell_turing @neurograce and
That's irrelevant though. Humans didn't evolve to learn those things, so there's unique innate machinery that's been repurposed. E.g., permission schemas for abduction, or machinery for reasoning spatially with a graph-based representation (easily confused for grids) for symbolic reasoning,
Replying to @sir_deenicus @tyrell_turing and
coupled with whatever allows us to learn recursive grammars and combinatorially compose atomic concepts. Once you have that, getting to a system that can derive Newton's Laws by looking at how things fall is a comparatively short step.
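Not part of the thread, but the "recursive grammars and combinatorially compose atomic concepts" claim can be made concrete with a toy sketch: a tiny context-free grammar (all symbols and vocabulary here are invented for illustration) whose recursion lets a handful of atomic terminals generate unboundedly many distinct expressions.

```python
import random

# Hypothetical toy grammar: NP can embed a VP, which can embed an NP,
# so finite atoms ("dog", "chases", ...) compose into unboundedly many phrases.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive branch
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["chases"], ["sees"]],
}

def expand(symbol: str, depth: int = 4) -> list:
    """Randomly expand a symbol into terminals; depth bounds the recursion."""
    if symbol not in GRAMMAR:
        return [symbol]                       # terminal: an atomic concept
    rules = GRAMMAR[symbol]
    if depth <= 0:                            # at the bound, take the shortest rule
        rule = min(rules, key=len)
    else:
        rule = random.choice(rules)
    out = []
    for sym in rule:
        out.extend(expand(sym, depth - 1))
    return out
```

Calling `expand("S", 3)` a few times yields different sentences built from the same five atoms; raising the depth bound grows the space of possible compositions without adding any new vocabulary.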
Replying to @sir_deenicus @tyrell_turing and
I think, for example, the fact that the cerebellum and motor cortex are recruited even for mathematical reasoning - and one may quibble over how "innate" is defined - suggests that at least the basis and initial stages work by analogy with innate capabilities.
Replying to @sir_deenicus @tyrell_turing and
I'm saying: the brain does things it didn't evolve to do, such as math or sewing. It's much more likely that existing nearby machinery was repurposed and then optimized in humans to do, or pick up, recursive abstract reasoning than that the capability is learned from scratch.
Replying to @sir_deenicus @neurograce and
Uh huh... I don't see how anything I said in this thread contradicts anything you've said here. My point is only: (1) there are some structural priors, but (2) we learn a lot, (3) we use distributed representations, and (4) this all fits with the general DL research program.
Replying to @tyrell_turing @neurograce and
More a difference of degree than a contradiction. I think instead high precision is needed to go from a pushdown automaton to a Turing machine. For example, what is the architecture that allows small toddlers, but not adult chimps, to pick up language? I'm doubtful we learn a lot, because developmental trajectories and common fallacies are too widely shared.
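The pushdown-automaton-to-Turing-machine gap mentioned here is a standard formal-language fact, sketched below (the function names are mine, not from the thread): a single stack suffices to recognize a^n b^n, but a^n b^n c^n lies beyond any pushdown automaton, so recognizing it requires strictly more machinery - here simulated with ordinary counting, as a Turing machine could do.

```python
def accepts_anbn(s: str) -> bool:
    """Recognize a^n b^n with one counter (stack depth) - PDA-expressible."""
    stack = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # an 'a' after a 'b' breaks the pattern
                return False
            stack += 1
        elif ch == "b":
            seen_b = True
            stack -= 1
            if stack < 0:       # more b's than a's so far
                return False
        else:
            return False
    return stack == 0

def accepts_anbncn(s: str) -> bool:
    """Recognize a^n b^n c^n - context-sensitive, beyond any single-stack PDA."""
    n = len(s) // 3
    return len(s) % 3 == 0 and s == "a" * n + "b" * n + "c" * n
```

The point of the contrast: the second recognizer needs to compare three counts at once, which a single stack cannot do, so the jump from the first class of machine to the second is qualitative, not a matter of more of the same.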
Replying to @sir_deenicus @tyrell_turing and
I think if all humans share the same limitations and strengths and develop in the same sequence, the lack of degrees of freedom strains what's meant by "learn". Specifically, I think existing unlearned machinery underlies our ability in, say, math. We learn to learn, fine-tune, and learn facts and rules, but
Replying to @sir_deenicus @tyrell_turing and
the underlying manipulations allowing for proofs and derivations are unlearned. The same underlying machinery is likely used in math, dance, art, writing, etc. Candidates are foraging specializations, permission schemas (see: http://bactra.org/reviews/hhnt-induction/), and human-specific grammar-learning specializations.
Replying to @sir_deenicus @tyrell_turing and
I think a lot of the things AI researchers are trying to get AI to do still fall within the realm of things non-human primates and other animals can do. So I'm not sure how relevant human-specific skills are at the moment.
Even some of the stuff nonhumans can do is pretty impressive, e.g. tool use by crows, or primates swinging between branches in dynamically changing environments.
Replying to @GaryMarcus @neurograce and
The dynamics of fish swimming upstream is also pretty cool.