@GaryMarcus' nativist argument against statistical learning AI seems to rest on the premise that evolution is fundamentally different from a statistical learning paradigm.
This is false (evolution IS a form of statistical learning), but I am grateful that Gary makes that point.
Replying to @Plinz @GaryMarcus
I would argue biological evolution is not a form of statistical learning, since there is no learning, only pure selection among random changes in the genomic code and the interactions of biomolecules, entirely determined by the specific dependencies and constraints of the environment.
Replying to @Derya_ @GaryMarcus
Learning is the search for better function approximations, and so is evolution. Gradient descent works only when you can identify a gradient. Otherwise you are stuck with evolutionary methods.
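The claim that pure selection among random changes still performs a search for better function approximations can be illustrated with a minimal sketch (all names and parameters here are illustrative, not from the thread): a (1+1) evolution strategy that does nothing but mutate randomly and keep the fitter candidate, yet still approximates an optimum — even of a function whose gradient is undefined at the solution.

```python
import random

def evolve(f, x0, sigma=0.1, steps=2000, seed=0):
    """(1+1) evolution strategy: mutate, then keep the child only if it
    scores at least as well -- pure selection among random changes,
    no gradient information used anywhere."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(steps):
        child = x + rng.gauss(0.0, sigma)  # random mutation
        fc = f(child)
        if fc <= fx:                       # selection: the fitter variant survives
            x, fx = child, fc
    return x

# Approximate the minimizer of |x - 3|, a function with no gradient
# at its optimum, where plain gradient descent is ill-defined.
best = evolve(lambda x: abs(x - 3.0), x0=0.0)
```

Selection alone drives `best` toward 3.0: each accepted random change is, by construction, a better approximation, which is exactly the sense in which evolutionary methods search without a gradient.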
Replying to @Plinz @GaryMarcus
That’s my point: in biological evolution there is no search for better approximations, only selection out of many, many random possibilities, and only on the basis of selective survival advantage, not necessarily good solutions...
Replying to @Derya_ @GaryMarcus
I think of evolution of minds mostly as a search for meta learning systems. I suspect that the human mind is the first fully general function approximator (we are the first to use externalizable Turing complete languages), and I agree with Gary that evolution fixes many priors.
Replying to @Plinz @GaryMarcus
With that I agree. The brain and the immune system are meta learning systems, as they can store memories and learn from experience.
Meta learning implies that a system can figure out how to learn things by itself.