"Ever bigger steam engines burning ever more coal is the superior approach to build faster trains", they said. More data + scaling up simple algos will get you better models, obviously. But there are diminishing returns -- meanwhile, new approaches can be transformational
But in 2050, imagine how much more text data we'll have and how many layers we can train.
Jokes aside, things will need to change seriously to get to the sci-fi levels everyone thinks are around the corner.
To think we can sum up the meaning of a word with 300 numbers, found just by checking which words appear next to it, is kind of absurd.
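For a concrete, if toy, picture of what that tweet is describing, here is a rough sketch (not from the thread): build a co-occurrence matrix over a tiny made-up corpus and reduce it with SVD, so each word's "meaning" becomes a handful of numbers derived purely from its neighbours. The corpus, the window size, and the 3-dimensional target (instead of the 300 mentioned in the tweet) are all illustrative assumptions.

```python
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Vocabulary and a symmetric co-occurrence count matrix (window of 1 word).
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if i != j:
                counts[idx[w], idx[sent[j]]] += 1

# Truncated SVD: keep the top-k singular directions as dense word vectors.
U, S, _ = np.linalg.svd(counts)
k = 3  # the systems discussed in the thread keep ~300 dimensions
vectors = U[:, :k] * S[:k]

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

# Compare a content word with a distributionally similar word and with a function word.
print("cat vs dog:", cosine(vectors[idx["cat"]], vectors[idx["dog"]]))
print("cat vs on: ", cosine(vectors[idx["cat"]], vectors[idx["on"]]))
```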
New conversation
This Tweet is unavailable.
Studies of the source of speedup across many areas have shown that roughly half of the improvement is due to hardware and half to software. In some cases, algorithmic advances have been much more important than hardware. pic.twitter.com/Zh3BOHsHra
I think the breakthrough might come more at the theoretical level. Most of NLP is driven by the distributional hypothesis at the moment. Similar low-level research might yield fruit, instead of stacking more effort at the higher levels.
No, we aren't (re: being close to maxing out the data). A few dozen gigs were used for GPT-2.
True that. GPT-2 was trained on 8M documents. Wayback Machine hosts 388B web pages.
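A quick back-of-the-envelope check of the gap implied by the two tweets above (a sketch, not from the thread; the only inputs are the 8M and 388B figures quoted there):

```python
# Rough ratio between the Wayback Machine's archive and GPT-2's training set,
# using only the figures quoted in the two tweets above.
gpt2_docs = 8e6        # documents in GPT-2's training corpus (per the tweet)
wayback_pages = 388e9  # web pages in the Wayback Machine (per the tweet)

ratio = wayback_pages / gpt2_docs
print(f"~{ratio:,.0f}x more pages than GPT-2 saw")  # roughly 48,500x
```

Page counts are not a direct measure of usable training text (quality filtering and deduplication would cut this down heavily), but the gap is several orders of magnitude either way.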
End of conversation
New conversation
This, I do think, because understanding human language serves no purpose for AI yet. I mean, what could the loss function be for "understanding" human language?
Most of our language modeling is based on cross-entropy, i.e., recovering the same distribution of words as the original corpus. Even a seq2seq model just finds the most probable answer given the distribution of the input sequence.
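A minimal sketch of what that objective looks like (illustrative numbers only, not from the thread): the loss is just the negative log-probability the model assigns to the token the corpus actually contains, so minimising it pushes the model toward the corpus's word distribution rather than toward any notion of "understanding".

```python
import numpy as np

vocab = ["the", "cat", "sat", "mat"]      # toy vocabulary
logits = np.array([1.2, 0.3, 2.1, -0.5])  # model's scores for the next token
target = vocab.index("sat")               # the corpus says the next token is "sat"

# Softmax over the vocabulary, then negative log-likelihood of the observed token:
# the cross-entropy loss used for language models and seq2seq decoders.
probs = np.exp(logits - logits.max())
probs /= probs.sum()
loss = -np.log(probs[target])

print(f"p(next = 'sat') = {probs[target]:.3f}, cross-entropy = {loss:.3f}")
```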
End of conversation
New conversation
@LordSomen True. I feel AI is so much more than its current applications.