Bingo. So many people think they can do NLU (or robotics, or even driving) without having a commonsense knowledge database, and I just don’t see how that’s remotely possible. https://twitter.com/punkstrategy/status/1071560512662515712
Replying to @GaryMarcus
Why a "database"? How is all of the incredible work on embeddings not storage and representation of knowledge? Just because it's in a form that we humans can't directly interpret (like, need I point out, is true for our OWN brain's knowledge "database") doesn't mean it's garbage
Replying to @bensprecher
a. Did I say it was garbage, or just not adequate for the task? b. Have a look at BERT, the best embeddings Google can buy, which still can’t do Winograd Schemas or discourse comprehension. c. "Knowledge base" would have been the better term; I don’t literally mean SQL, etc.
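A hedged sketch of how one might probe a masked language model on a Winograd-style sentence, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (neither is referenced in the thread): mask the ambiguous referent slot and compare the model's scores for the two candidate nouns.

```python
# Illustrative probe only; library, checkpoint, and example sentence are my
# assumptions, not the specific evaluation the tweet refers to.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Winograd-style schema: the correct referent ("trophy") requires commonsense
# about sizes, not surface statistics.
sentence = "The trophy would not fit in the suitcase because the [MASK] was too big."
candidates = ["trophy", "suitcase"]

inputs = tokenizer(sentence, return_tensors="pt")
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_index]

for word in candidates:
    word_id = tokenizer.convert_tokens_to_ids(word)
    print(f"{word}: {logits[word_id].item():.2f}")
# If "suitcase" scores at or above "trophy", the model has missed the
# commonsense inference the schema is designed to test.
```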
Replying to @GaryMarcus
BERT is achieving state-of-the-art performance on a broad range of tests, but not, you are correct, on GLUE's WNLI, due (at least as implied in the paper) to an issue with the construction of that data set. Taking a step back, I think you are looking for logical, symbolic 1/
Replying to @bensprecher @GaryMarcus
operations on identifiable entities to underpin "true" NLU. Assuming I understand you, why couldn't that logic instead occur as learned NN transformations on embedded representations of concepts and entities? It may require an order of magnitude more training data. /2
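A toy sketch of the proposal in this tweet, with assumed names, dimensions, and a synthetic relation (none of it from the thread): a small network learns a relation purely as a transformation in embedding space, mapping each entity's vector toward the vector of its related entity.

```python
# Toy illustration only: whether this scales to real "logic" is exactly the
# point under debate in the thread.
import torch
import torch.nn as nn

torch.manual_seed(0)
dim, n_entities = 32, 100

# Fixed "pretrained" entity embeddings (stand-ins for learned word vectors).
entities = torch.randn(n_entities, dim)

# Synthetic ground-truth relation: entity i is related to entity (i + 1) % n.
pairs = [(i, (i + 1) % n_entities) for i in range(n_entities)]
src = entities[[a for a, _ in pairs]]
tgt = entities[[b for _, b in pairs]]

# The "relation" is just an MLP acting on embedding vectors.
relation = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, dim))
opt = torch.optim.Adam(relation.parameters(), lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(relation(src), tgt)
    loss.backward()
    opt.step()

# Evaluate: is the nearest neighbour of relation(e_x) the correct target?
with torch.no_grad():
    preds = relation(src)
    nearest = torch.cdist(preds, entities).argmin(dim=1)
targets = torch.tensor([b for _, b in pairs])
accuracy = (nearest == targets).float().mean().item()
print(f"nearest-neighbour accuracy: {accuracy:.2%}")
```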
See the argument I made in chapter 3 of The Algebraic Mind re: the training space, echoed in Bengio’s recent BabyAI paper. You won’t get enough data.