It is remarkable. However, while Searle would have loved it, there is no understanding going on here. (Sigh. IMHO an intelligent system really has to build models of the world, of others, and of itself for that... otherwise you are just regurgitating symbols.)
New conversation
Gary, my reasoning: the model has 1.5B parameters, roughly 5 GB, and they trained it on 40 GB of text in which there are obviously some repetitions. The degree of compression is small; the model may very well be almost memorizing everything.
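A rough back-of-the-envelope check of that compression argument (a sketch only; the 32-bit-per-parameter storage and the ~40 GB corpus figure are assumptions taken from the thread, not exact numbers):

```python
# Back-of-the-envelope check of the compression argument.
# Assumptions: 1.5B parameters stored as 32-bit floats, ~40 GB of training text.
params = 1.5e9               # GPT-2 parameter count
bytes_per_param = 4          # 32-bit float (assumption)
model_size_gb = params * bytes_per_param / 1e9   # ~6 GB of weights

corpus_gb = 40.0             # reported size of the training text

compression_ratio = corpus_gb / model_size_gb    # roughly 6-7x
print(f"model ~ {model_size_gb:.1f} GB, corpus ~ {corpus_gb:.0f} GB, "
      f"ratio ~ {compression_ratio:.1f}x")
```

Whether one uses the tweet's ~5 GB estimate or ~6 GB for 32-bit weights, the corpus is less than an order of magnitude larger than the model, and web text is highly redundant, so substantial memorization is at least plausible.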
I'd agree with that. There does seem to be some articulation, i.e. insertion of noun phrases, but with a training text that large it is very hard to be sure this isn't entirely "pastiche".
End of conversation
New conversation
It's somewhat similar to melody generation (work by @francoispachet, @boblsturm, @douglas_eck): phrases which have a coherent local structure but that don't seem to be going anywhere at a higher level, whereas people and musicians seem to plan ahead, even when improvising.
Does that count as plagiarism?