"Generating Sequences with RNNs" by Graves https://arxiv.org/abs/1308.0850 This paper blew my mind when it came out, showing that it was possible to generate plausible text and handwriting with RNNs. Includes the predecessors of attention, Adam, etc... (2/5)
WaveNet by van den Oord et al. https://arxiv.org/abs/1609.03499 Until this came out I don't think most of us expected that we'd be able to generate raw waveforms with deep networks anytime soon. The results were surprisingly good and the architecture remains influential. (3/5)
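To make that architecture concrete, here is a minimal PyTorch sketch (a toy version of my own, not the paper's implementation) of the dilated causal convolutions at WaveNet's core: each layer doubles its dilation, so the receptive field grows exponentially with depth while every output sample depends only on past samples.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalDilatedConv1d(nn.Module):
    """One causal, dilated conv layer: output at time t sees only inputs <= t."""
    def __init__(self, channels, dilation):
        super().__init__()
        self.dilation = dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size=2, dilation=dilation)

    def forward(self, x):                   # x: (batch, channels, time)
        x = F.pad(x, (self.dilation, 0))    # left-pad only, to preserve causality
        return self.conv(x)

# Dilations 1, 2, 4, ..., 128 give a receptive field of 256 samples in 8 layers.
stack = nn.Sequential(*[CausalDilatedConv1d(32, 2 ** d) for d in range(8)])
waveform = torch.randn(1, 32, 16000)    # toy stand-in for one second of 16 kHz audio
out = stack(waveform)                   # same length as the input, strictly causal
```

The real model builds gated activations, residual and skip connections, and a softmax over mu-law-quantized samples on top of this backbone.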
"Learning to Generate Reviews and Discovering Sentiment" by Radford et al. https://arxiv.org/abs/1704.01444 A simple and surprising result (thresholding a neuron in an unsupervised LM could classify sentiment accurately) that helped kicked off the transfer learning craze in NLP. (4/5)
"Implicit Autoencoders" by Makhzani https://arxiv.org/abs/1805.09804 A deeper cut, this paper (loosely speaking) proposes a VAE where both the "reconstruction" and "regularization" terms are replaced with adversarial losses. Impressive results on disentangling style and content. (5/5)
New conversation
Don't you think the papers you listed were from authors who were already famous, so people followed their work? For a new student at a not-so-famous lab this is not an option; no one is going to read their arXiv preprint.
Mostly agree; arXiv-only is a privilege often exercised only by those who are not under pressure to publish at conferences. I was making a different point, though: there have been great arXiv-only papers, so if your paper gets rejected, it doesn't mean it won't be influential!
End of conversation
New conversation
Layer Normalization, by Jimmy Ba, Jamie Kiros, and Geoffrey Hinton (https://arxiv.org/abs/1607.06450). Crucial for the Transformer to work, among many others!
Good one! I thought this was at NeurIPS for some reason.
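For reference, the whole method fits in a few lines; here is a minimal NumPy sketch: normalize each example across its own features (rather than across the batch, as batch norm does), then apply a learned gain and bias.

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    """x: (batch, features); gain, bias: learned parameters of shape (features,)."""
    mu = x.mean(axis=-1, keepdims=True)   # statistics are computed per example...
    var = x.var(axis=-1, keepdims=True)   # ...so nothing depends on the batch size
    return gain * (x - mu) / np.sqrt(var + eps) + bias

x = np.random.randn(2, 8)
out = layer_norm(x, gain=np.ones(8), bias=np.zeros(8))
print(out.mean(axis=-1), out.std(axis=-1))  # roughly 0 and 1 for each row
```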
New conversation
DCGAN. I was really surprised when I went looking for its BibTeX entry and couldn't find a proceedings publication.
DCGAN was actually at ICLR 2016! https://iclr.cc/archive/www/doku.php?id=iclr2016:main.html