How about applying this in NLP?
In fact, I once generated many sentences using a GAN, but the results were not as illustrative as this.
The anime GAN is @gwern's work, but I have also attempted to generate text with GANs. Empirically, I think SeqGAN, which trains G by policy gradient, works better than WGAN, which makes G differentiable, when generating long sentences with a large vocabulary.
Sequence generation with GANs is still a big open problem, AFAIK. It might not be necessary, though, since self-supervised prediction works so well, as GPT-2 strikingly reminded us yesterday. (I also have a theory that prediction loss + RL finetuning, as in https://arxiv.org/abs/1706.03741, would work.)
I agree. Transformer LMs and VAE-based LMs seem more stable and higher-quality than GANs.
Great! I'll try that.
End of conversation
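The SeqGAN approach mentioned in the exchange above — training the generator G by policy gradient because sampled tokens are discrete and non-differentiable — can be sketched as a minimal REINFORCE loop. Everything below is an illustrative assumption, not code from the thread: the per-position policy table stands in for a real sequence model, and the toy reward stands in for a discriminator's score.

```python
import numpy as np

# Minimal sketch of the SeqGAN idea: G samples discrete tokens, a scalar
# reward on the finished sequence (here a toy stand-in for D's score)
# drives a REINFORCE / policy-gradient update of G's parameters.

rng = np.random.default_rng(0)
VOCAB, SEQ_LEN = 4, 6
logits = np.zeros((SEQ_LEN, VOCAB))  # per-position categorical policy

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_sequence():
    """Sample one token sequence, recording what the update needs."""
    tokens, steps = [], []
    for t in range(SEQ_LEN):
        p = softmax(logits[t])
        tok = rng.choice(VOCAB, p=p)
        tokens.append(tok)
        steps.append((t, tok, p))
    return tokens, steps

def reward(tokens):
    # Toy reward standing in for D(sequence): fraction of tokens equal to 0.
    return sum(tok == 0 for tok in tokens) / SEQ_LEN

def train(n_steps=2000, lr=0.5):
    baseline = 0.0  # moving-average baseline for variance reduction
    for _ in range(n_steps):
        tokens, steps = sample_sequence()
        r = reward(tokens)
        baseline = 0.9 * baseline + 0.1 * r
        adv = r - baseline
        for t, tok, p in steps:
            # d/d logits[t] of log pi(tok | t) = onehot(tok) - p
            grad = -p
            grad[tok] += 1.0
            logits[t] += lr * adv * grad
    # Average reward of the trained policy over fresh samples.
    return float(np.mean([reward(sample_sequence()[0]) for _ in range(200)]))

final = train()
print(final)  # rises toward 1.0 as the policy learns to maximize the reward
```

The key point, matching the tweet, is that the reward is computed only on the completed sequence and flows to G through the log-probability of the sampled tokens, so no gradient ever needs to pass through the discrete sampling step.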
New conversation
This is what happens when an eldritch being tries to take a form your mind can comprehend.
@MVBras93, ENDLESS WAIFUS
THE GREATEST INVENTION IN MANKIND HISTORY!
New conversation
Wow, that was a trip.
Konosuba! My favorite animay!
Yeah, me too!
End of conversation
New conversation
@Squid2L DROP EVERYTHING
Evolution of best girl?
End of conversation
New conversation
Elon gonna like this one