this was a joke, but it is also true: what DeepMind did here took very little compute compared to what is commonly being done with text, and from what I can tell nothing new on the architecture side. all these tools were around; it "just" took good engineering. https://twitter.com/alth0u/status/1333498820508807169
-
haha, maybe so if you prune the space down to "applied ML", but i think there are lots of theory advances left in deep RL, novel architectures, etc. some of these will become clearer with the next compounding of compute power
-
i continue to believe that the interplay between cognitive neuroscience and deep learning will be one of the great flywheels of 21st-century science, and i'm not sure we're close to the end