. @ilyasut By cherry-picking examples, The Economist piece misrepresented GPT-2's capabilities. 2 questions for you:
1. Will you speak up to clarify the record?
2. When you say the interview below "makes sense," do you mean that GPT-2 understands language? If not, why not? https://twitter.com/ilyasut/status/1199036860934193152
I don't think you nailed it, but you are close. GPT-2 is deep, but not deep enough. It connects tokens over large distances, but it cannot yet converge on a unified model. It is unclear whether a large enough repository of text is sufficient, but with video it may work.