Generality in AI (which is a matter of degree, not an absolute) is a very important problem. Yet, strangely, the people who say they work on "AGI" don't seem to be interested in understanding the problem of generality, and instead focus on achieving task-specific skill by scaling up deep learning
This is making it nearly impossible to have a conversation on this topic. Many people now think of "intelligence" as an intrinsic property of an algorithm, synonymous with "power over the world", that can be tweaked to reach arbitrarily high values. Like the height of a building
And many people are, accordingly, waiting for some kind of philosopher's stone, or lamp genie, of infinite intelligence and therefore infinite power. They don't quite know how to create it, but presumably all you need to do is figure out the correct series of incantations
Never mind that current AI algorithms have very close to zero flexibility and breadth today. We are at the "task-specific skills" stage (i.e. local generalization), and will stay there until we directly tackle the question of engineering broad, reusable cognitive abilities
... or that the 80 or so definitions of intelligence floating around are mostly worthless as a standard.
