I know @garymarcus is arguing for making smarter AI. But the alternative, to "greatly restrict what the machines are allowed to do," deserves serious consideration. Do we need machines with general intelligence in a world of billions of people who possess such intelligence? https://twitter.com/nytopinion/status/1170155408394194944
Bee-level intelligence would indeed be a major advance, but I don't think bee intelligence is *general*. Artificial general intelligence doesn't have to be human-like, but it does have to be general (i.e. not narrow).
The folks who make a machine that can autonomously handle problem-solving and novelty at bee level... those are the folks to bet on. We could argue about what 'general' means, I suppose. But I'd rather see empirical work targeting bee level instead! Call me old school.
I'm unconcerned with definitions. If a bee-level machine is a goal of developer X, I'd buy that, because 99% of the same problems would be solved, and human-level AGI would largely be a matter of scaling. I guess I'm not in the 'we' you speak of. Never mind.
New conversation
Generality implies universality, not a hybrid kludge. There's a reason that the brain uses millions of cortical columns and that the same type of cortical column is found in all sensory cortices. A single universal principle/mechanism is used to handle all sensory modalities.