"Ever bigger steam engines burning ever more coal is the superior approach to build faster trains", they said. More data + scaling up simple algos will get you better models, obviously. But there are diminishing returns -- meanwhile, new approaches can be transformational
-
At this point we are pretty close to training our language models on the entirety of text corpora that humanity has produced so far. Their language understanding capabilities are still close to zero.
-
Let's hear some speculation! Could it still be powered by Keras?
-
On the other hand, it's 2019 and I'm still on a Unix-like operating system written in C.
-
All hail genetic algorithms and reinforcement learning!
-
These approaches are even less data-efficient than SGD.
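A toy illustration of that data-efficiency gap (a sketch, not anyone's benchmark): on a one-dimensional quadratic, gradient descent exploits the gradient signal directly, while a simple (1+λ) evolution strategy — standing in here for gradient-free evolutionary search — spends a whole population of loss evaluations per generation. Every function name and hyperparameter below is made up for the example.

```python
import random

def f(w):
    """Toy quadratic loss with its minimum at w = 3."""
    return (w - 3.0) ** 2

def grad_f(w):
    """Analytic gradient of f."""
    return 2.0 * (w - 3.0)

def sgd(w=0.0, lr=0.1, tol=1e-6):
    """Gradient descent: one gradient evaluation per step."""
    steps = 0
    while f(w) > tol:
        w -= lr * grad_f(w)
        steps += 1
    return w, steps

def evolution(w=0.0, sigma=0.1, pop=10, tol=1e-6, seed=0):
    """(1+lambda) evolution strategy: `pop` loss evaluations per generation."""
    rng = random.Random(seed)
    evals = 0
    loss = f(w)
    while loss > tol:
        candidates = [w + rng.gauss(0.0, sigma) for _ in range(pop)]
        best_loss, best = min((f(c), c) for c in candidates)
        evals += pop
        if best_loss < loss:  # keep the parent if no child improves
            loss, w = best_loss, best
    return w, evals

w_sgd, n_sgd = sgd()
w_es, n_es = evolution()
print(f"SGD gradient steps: {n_sgd}, ES loss evaluations: {n_es}")
```

Both reach the same optimum, but the evolution strategy burns an order of magnitude more loss evaluations, because it only ever sees scalar fitness values, never the gradient.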
-
Couldn't agree more. A mad rush toward ever more data and customized hardware, all built on 20-25-year-old methods with obvious, potentially fatal flaws...
-
We could be stuck in this local minimum for a long time.
-
You are right
-
Sounds like yet another revolution in ML. But what alternative to SGD and stacks of layers might there be in the 2050s? Sorry if it's a silly question.