I swear there will be a proper PDF with math galore for NoGAN eventually. I do think it has a lot of potential to make training better, so for now we can at least tweet and blog about it! https://twitter.com/remykarem/status/1135183936227749888
Do you mean underfitting visually or by the training numbers? More training doesn’t get better results on the pretraining (afaik). The dull results are as good as it gets.
-
It just looks like the loss keeps dropping dramatically if you run for more epochs. I'm going to experiment more this week. I haven't had a chance to dig in thoroughly yet; I just ran through one of your notebooks a couple of times last night to get a feel for what's going on.
NoGAN - a transfer learning technique in GANs:
G - generator
D - discriminator
1. Pretrain G using perceptual loss
2. Save generated imgs from pretrained G
3. Pretrain D as binary classifier
4. Train both G & D in a normal GAN setting
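The four phases above can be sketched in PyTorch. This is a minimal toy sketch, not DeOldify's actual code: the tiny linear models, tensor sizes, and the use of an L1 loss as a stand-in for the perceptual loss are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins for the generator G and discriminator D (hypothetical sizes).
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 8))
D = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
bce = nn.BCEWithLogitsLoss()

real = torch.randn(32, 8)    # stand-in for real target images
inputs = torch.randn(32, 8)  # stand-in for G's inputs (e.g. grayscale images)

# 1. Pretrain G alone (no D involved). L1 here stands in for a perceptual loss.
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
for _ in range(50):
    opt_g.zero_grad()
    loss = nn.functional.l1_loss(G(inputs), real)
    loss.backward()
    opt_g.step()

# 2. Save generated images from the pretrained G.
with torch.no_grad():
    fakes = G(inputs)

# 3. Pretrain D as a plain binary classifier: real vs. the saved generations.
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
for _ in range(50):
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fakes), torch.zeros(32, 1))
    d_loss.backward()
    opt_d.step()

# 4. Finally, train both G and D together in a standard GAN loop,
#    with both networks warm-started from their pretraining.
for _ in range(10):
    opt_d.zero_grad()
    d_loss = (bce(D(real), torch.ones(32, 1))
              + bce(D(G(inputs).detach()), torch.zeros(32, 1)))
    d_loss.backward()
    opt_d.step()

    opt_g.zero_grad()
    g_loss = bce(D(G(inputs)), torch.ones(32, 1))
    g_loss.backward()
    opt_g.step()
```

The point of the ordering is that by the time the adversarial phase (step 4) starts, G already produces plausible outputs and D already separates them from real images, so the unstable early portion of GAN training is skipped.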