There are theoretical impediments that even "general methods" cannot circumvent. If you search for a common point of two parallel lines, you can have the best search-and-learning method in the world and you still won't find one. ML confronts such impediments on the way to AGI. #Bookofwhy
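The parallel-lines point can be made concrete: cast as a linear system, two parallel lines yield a singular matrix, so no solver, however powerful, can return an intersection that does not exist. A minimal sketch (the particular lines and the use of NumPy are illustrative, not from the thread):

```python
import numpy as np

# Two parallel lines: y = 2x + 1 and y = 2x + 3,
# rewritten as -2x + y = 1 and -2x + y = 3.
A = np.array([[-2.0, 1.0],
              [-2.0, 1.0]])
b = np.array([1.0, 3.0])

try:
    np.linalg.solve(A, b)    # "search" for a common point
    found = True
except np.linalg.LinAlgError:
    found = False            # singular system: no intersection exists

print(found)  # False — no amount of search produces a point that isn't there
```

The failure is structural, not computational: more data or compute applied to the same system cannot change the answer.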
That's the whole debate, isn't it? Is the human ability to learn things like mathematics, geometry, formal logic, informal reasoning, and science the result of some ensemble of general search and/or learning processes operating in the brain, or something else?
nobody doubts search and learning are important, but those are so broad as to encompass almost everything that has been or will be tried. shocking as it may sound, http://rebooting.ai strongly endorses learning, though of a form that is more compatible w abstract knowledge.
Wouldn't the disagreement between your Rebooting AI thesis and Richard Sutton's Bitter Lesson thesis be that Sutton believes the answer is to keep working on general learning methods like reinforcement learning and that you believe the answer is to pivot to symbol manipulation?
Replying to @strangecosmos @GaryMarcus and
I don't think Sutton's argument is that if you add 1 million times more data and compute to existing machine learning techniques you'll get AGI. I think it's that researchers should work toward ML techniques that can efficiently leverage 1 million times more data and compute.
Why speculate on what Sutton means or meant? Do you @strangecosmos believe that any ML technique can solve any of the toy problems in #Bookofwhy or Primer, given all the data in the world, and no information besides data?
I can’t say what ML techniques will exist in 50 years, so I don’t know. I think Sutton’s essay (http://www.incompleteideas.net/IncIdeas/BitterLesson.html) is pretty clear. I’m not aware of any ML expert who thinks no further fundamental research progress is needed to get to AGI, just more data and compute.
If by "ML techniques" you mean any future algorithm, then you are justified in saying "I don't know". But if by "ML techniques" you mean algorithms based on data only, we can tell even today that the answer is NO. We can't compute a 3-dim volume from a 2-dim shadow.
#Bookofwhy
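One way to see the shadow analogy concretely: projection is a many-to-one mapping, so no algorithm can invert it from the projected data alone. A minimal sketch (the box dimensions are invented for illustration):

```python
# Two boxes with identical overhead "shadows" but different volumes.
# The mapping box -> shadow is many-to-one, so it cannot be inverted
# from shadow data alone, no matter how much of it is collected.
box_a = (2.0, 3.0, 1.0)   # width, depth, height
box_b = (2.0, 3.0, 5.0)

def shadow(box):
    w, d, _h = box
    return (w, d)          # overhead projection discards height

def volume(box):
    w, d, h = box
    return w * d * h

assert shadow(box_a) == shadow(box_b)   # same data (shadow)
assert volume(box_a) != volume(box_b)   # different answers (volume)
```

Two distinct worlds produce the same observations, so observations alone cannot distinguish them; some assumption beyond the data is required.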
I don’t think there is a good formal definition of “learning” that captures what people like Sutton and LeCun mean by the term. The way I try to understand the debate is that it’s about how much knowledge should be baked into a system versus learned from data or experience.
Replying to @strangecosmos @yudapearl and
Depends what you mean by "knowledge". "Mechanisms" might be more accurate. The analogy I see is to a toolkit full of tools, each w/ specific uses, which can be combined in many different ways to solve a wide array of tasks. "Knowledge" does not seem to me the clearest word for this!
the argument of http://Rebooting.AI is that we need both. we go through many, many detailed examples.
Replying to @GaryMarcus @jprwg and
It's fallacious to claim that we need prior knowledge about the world to achieve AGI. All the assumptions we need should be in the sensing and learning mechanisms. The only prior knowledge babies are born with has to do with things like mating and eating. They learn the rest.
