Seems like you're moving the goalposts here. It's coherent with respect to entities, often over whole paragraphs. Certainly that falls into the camp of things "deep learning shouldn't do". Obviously it's not solving the common-sense problem, but it's hard to argue that it's not a step forward.
If I did the same with n-grams where n > 50, yielding similar local coherence, would you insist that I count that as progress? We may not have had the data or compute in the 1960s, but we have long known that local quasi-coherence can be captured that way, without real understanding.
- 9 more replies
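A minimal sketch of the n-gram point above (a toy word-level Markov chain; the function names and the tiny corpus are illustrative, not from the thread): a high-order n-gram model mostly replays spans it has already seen, so its output looks locally coherent while encoding no understanding at all.

```python
import random
from collections import defaultdict

def build_ngram_model(tokens, n):
    """Map each (n-1)-word context to the words that followed it in the corpus."""
    model = defaultdict(list)
    for i in range(len(tokens) - n + 1):
        context, nxt = tuple(tokens[i:i + n - 1]), tokens[i + n - 1]
        model[context].append(nxt)
    return model

def generate(model, n, length=30, seed=None):
    """Sample text by repeatedly choosing a continuation seen after the current context."""
    rng = random.Random(seed)
    context = rng.choice(list(model.keys()))
    out = list(context)
    for _ in range(length):
        choices = model.get(tuple(out[-(n - 1):]))
        if not choices:  # dead end: this context never appeared in the training text
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# Toy corpus (illustrative). With n large relative to the corpus, the model
# mostly regurgitates verbatim spans -- "local coherence" for free.
corpus = ("the trophy does not fit in the suitcase because it is too big "
          "the suitcase is small and the trophy is big").split()
model = build_ngram_model(corpus, n=3)
print(generate(model, n=3, seed=0))
```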
New conversation
Then why is it getting decent Winograd scores (a test you originally made me aware of when pointing out BERT's shortcomings)? I asked if one could just throw 10x more data at it and you indicated that wasn't feasible. Are we there yet? https://twitter.com/bensprecher/status/1071651722181849088?s=19
Decent, not great, because of some modest correlations? Hard for me to say much without seeing any of the data and without access to the model. Would be interesting to see if it asymptotes or improves with, e.g., 10x or more data.
- 2 more replies
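For context on what a Winograd item looks like and how a masked language model can be scored on one, here is a small sketch. It assumes the Hugging Face `transformers` fill-mask pipeline with `bert-base-uncased` and uses the standard trophy/suitcase schema; scoring by pronoun substitution is just one common probe, not necessarily how the scores discussed above were produced.

```python
from transformers import pipeline

# One classic Winograd schema: the correct referent of "it" flips
# when "big" is swapped for "small".
template = "The trophy doesn't fit in the suitcase because the [MASK] is too {adj}."
candidates = ["trophy", "suitcase"]

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for adj, expected in [("big", "trophy"), ("small", "suitcase")]:
    # Restrict the model's choices to the two candidate referents and
    # see which one it assigns the higher probability.
    scores = fill_mask(template.format(adj=adj), targets=candidates)
    best = max(scores, key=lambda r: r["score"])["token_str"].strip()
    print(f"adj={adj!r}: model prefers {best!r} (expected {expected!r})")
```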
New conversation
"..simultaneously close and not close to Searle's room." We have gone quantum...it's now Schroedinger's room. PS: have you written about your thoughts on Searles room?
- 1 more reply
New conversation
How can we be "close and not close to Searle's room"? It's either a program, and thus the Chinese room, or it's not. Seems like a program to me.
Your comments raise the question: does an intelligent being need to have understanding?