say more?
But also because it represents a real advance -- among other things, it uses Transformer architectures, which are not conventional neural nets and arguably have much more symbolic inductive biases.
Oddly enough, though, the Transformer came out of trying to improve performance on relatively conventional NLP tasks / language modeling, which neglect commonsense grounding and world-model building.
Likewise, many important unsupervised learning methods evolved from training on ImageNet. Indeed, we got out of an AI winter because of ImageNet.