@GaryMarcus' nativist argument against statistical learning AI seems to rest on the premise that evolution is fundamentally different from a statistical learning paradigm.
This is false (evolution IS a form of statistical learning), but I am grateful that Gary makes that point.
Replying to @Plinz @GaryMarcus
I would argue biological evolution is not a form of statistical learning, since there is no learning at all: only pure selection among random changes in the genomic code and interactions of biomolecules, shaped entirely by the specific dependencies and constraints of the environment.
Replying to @Derya_ @GaryMarcus
Learning is the search for better function approximations, and so is evolution. Gradient descent works only when you can identify a gradient. Otherwise you are stuck with evolutionary methods.
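The distinction drawn above, that gradient descent needs an identifiable gradient while evolutionary methods need only mutation and selection, can be illustrated on a toy objective. The quadratic below and both optimizers are illustrative sketches, not anything from the thread:

```python
import random

def f(x):
    """Toy objective: a quadratic with its minimum at x = 3."""
    return (x - 3.0) ** 2

def gradient_descent(x=0.0, lr=0.1, steps=200):
    # Usable only because we know the gradient: f'(x) = 2 * (x - 3).
    for _ in range(steps):
        x -= lr * 2.0 * (x - 3.0)
    return x

def one_plus_one_es(x=0.0, sigma=0.5, steps=2000, seed=0):
    # Gradient-free (1+1) evolution strategy: mutate at random and
    # keep the child only if it is at least as good. This is pure
    # selection among random variations; no gradient is ever computed.
    rng = random.Random(seed)
    for _ in range(steps):
        child = x + rng.gauss(0.0, sigma)
        if f(child) <= f(x):
            x = child
    return x

print(gradient_descent())   # converges to ~3.0
print(one_plus_one_es())    # also approaches ~3.0, without a gradient
```

Both reach the same optimum; the evolutionary method just pays for its gradient-blindness with many more evaluations, which mirrors the point that selection alone can still perform function approximation.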
Replying to @Plinz @GaryMarcus
That’s my point: in biological evolution there is no search for better approximations, only selection among a vast number of random possibilities, and only on the basis of a selective survival advantage, not necessarily good solutions...
Replying to @Derya_ @GaryMarcus
I think of evolution of minds mostly as a search for meta learning systems. I suspect that the human mind is the first fully general function approximator (we are the first to use externalizable Turing complete languages), and I agree with Gary that evolution fixes many priors.
What do you mean by using Turing-complete languages? Every physical system, including the human brain, is bounded; there is no Turing machine in there.
The problems we want to solve usually don’t need a lot of resources. Elementary physics, general AI, good governance, etc., are not very many orders of magnitude beyond the reach of even unaugmented brains.