I thought @GaryMarcus aspired to be the new Dreyfus ("What DeepNets cannot do", "What DeepNets still cannot do" etc), but now he is aiming to be the new Searle: "A rule learned by a Deep Learning system is not really a rule." #AAAI2018
Replying to @Plinz @GaryMarcus
hmm... wonder if we can argue that "the black box" represents a subjective experience ;p
Replying to @sd_marlow @GaryMarcus
Gary Marcus is not arguing that AI cannot have subjective experience or be generally intelligent. He just claims that it must be GOFAI to do so :) Every true AI must receive its soul from a conscious creator who lovingly handwired its symbolic reasoning!
Replying to @Plinz @GaryMarcus
I meant that we lack an objective understanding of how the algorithms really work, and Searle might argue we can't because they're subjective to the network. Marcus has said that DL could extract those symbols (making them objective). GOFAI or GOHOME ;p
Replying to @sd_marlow @GaryMarcus
That has not been true for a long time now. We know how these algorithms work to approximate functions, and what classes of functions they approximate, and how they are represented.
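A minimal sketch of that point (an illustrative example, not from the thread): with NumPy and hand-chosen weights, a one-hidden-layer ReLU network is an explicit, inspectable piecewise-linear function rather than an opaque box; this one computes |x| exactly as relu(x) + relu(-x).

```python
import numpy as np

# Hand-chosen parameters for a tiny 1-input, 2-hidden-unit, 1-output ReLU net.
W1 = np.array([[1.0], [-1.0]])   # hidden weights: unit 1 sees x, unit 2 sees -x
b1 = np.zeros(2)                 # hidden biases
W2 = np.array([[1.0, 1.0]])      # output weights: sum the two hidden units
b2 = np.zeros(1)                 # output bias

def relu(z):
    return np.maximum(z, 0.0)

def net(x):
    # x: array of shape (n,) -> outputs of shape (n,)
    h = relu(x[:, None] @ W1.T + b1)   # hidden activations, shape (n, 2)
    return (h @ W2.T + b2).ravel()     # linear readout

xs = np.linspace(-2.0, 2.0, 9)
assert np.allclose(net(xs), np.abs(xs))  # the learned "rule" here is just f(x) = |x|
```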
Replying to @Plinz @GaryMarcus
Life can't be reduced to a utility function, one algorithm to rule them all. As people poke at these systems, human minds become part of the feedback loop. The functions, arrays, whatever, have no innate ability to alter themselves using that kind of external reference.
Life is just cells gobbling up negentropy. Cells are interesting, but not mysterious. Minds are universal computational machines in the service of regulation problems of groups of cells in the context of optimizing for inclusive fitness under conditions of multilevel selection.