I swear there will be a proper pdf with math galore for NoGAN eventually. I do think it has a lot of potential to make training better, so for now we can at least tweet and blog about it! https://twitter.com/remykarem/status/1135183936227749888
-
-
Awesome! I’ll try that. One more question while I’ve got you: why only 1 epoch each on the pre-training? Looks like it’s really underfitting; wondering if that was by design.
-
Do you mean underfitting visually or by the training numbers? More training doesn’t get better results on the pretraining (afaik). The dull results are as good as it gets.
New conversation -
-
-
Looks like it was a sneaky CUDA OOM error causing a cell to fail that I didn't notice, because the error was in between a bunch of the "ONNX export failed on ATen operator mv" warnings. Training is going smoother now that I can actually repeat the repeatable process!

-
I should have kept my big mouth shut. As soon as I tweeted that, CUDA ran out of memory again!
End of conversation
New conversation -
-
-
I'm not sure I understand what you mean. How do I check which ones work? I've got a list of weights like this from run 0, and I'm not sure what the _#### at the end represents: pic.twitter.com/xtfZO3MM7o
-
You visualize them. You can plot sample images from them using one of the visualization notebooks, modifying it accordingly to read the model checkpoint. Like I said: a PITA. I don’t have any other way of figuring out the stopping point yet.
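A small sketch of the bookkeeping half of that workflow: gathering the saved weights in numeric order by their trailing _#### suffix, so you can step through them one at a time in a notebook. The file names below are hypothetical (made up to mimic the "_#### at the end" pattern in the question); the real checkpoint names in the project may differ.

```python
import re
import tempfile
from pathlib import Path

# Hypothetical checkpoint names mimicking the "_#### at the end" pattern.
folder = Path(tempfile.mkdtemp())
for n in (1500, 500, 1000):
    (folder / f"ColorizeGen_run0_{n}.pth").touch()

def checkpoints_in_order(folder):
    """Return .pth files sorted numerically by their trailing _#### suffix."""
    def key(path):
        m = re.search(r"_(\d+)\.pth$", path.name)
        return int(m.group(1)) if m else -1
    return sorted(folder.glob("*.pth"), key=key)

ordered = [p.name for p in checkpoints_in_order(folder)]
# Naive lexicographic sorting would put _1000 before _500; the numeric key
# keeps the checkpoints in actual training order.
```

From there, each checkpoint would be loaded into the generator in turn and a grid of sample outputs plotted, as the reply above describes.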
End of conversation
New conversation -
-
-
So need to save this thread to come back to
-
NoGAN - a transfer learning technique in GANs:
G - generator
D - discriminator
1. Pretrain G using perceptual loss
2. Save generated imgs from pretrained G
3. Pretrain D as binary classifier
4. Train both G & D in a normal GAN setting
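The four-step schedule above can be sketched end to end in a toy setting. Everything here is a deliberately minimal stand-in, not DeOldify's implementation: linear "networks" on 4-dim vectors, plain MSE in place of the perceptual loss, and logistic regression for the discriminator. The point is only the ordering of the phases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for images: 4-dim vectors; the "true" colorizer is linear.
A_true = rng.normal(size=(4, 4))
X = rng.normal(size=(256, 4))   # inputs (think: grayscale crops)
Y = X @ A_true.T                # targets (think: color crops)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 1. Pretrain G with a non-adversarial loss (MSE here, standing in for
#    the perceptual loss).
G = 0.1 * rng.normal(size=(4, 4))
for _ in range(200):
    pred = X @ G.T
    G -= 0.05 * 2 * (pred - Y).T @ X / len(X)
pretrain_mse = float(np.mean((X @ G.T - Y) ** 2))

# 2. Save the generated "images" from the pretrained G.
fake = X @ G.T

# 3. Pretrain D as a plain real-vs-fake binary classifier on those saves.
D_w, D_b = np.zeros(4), 0.0
data = np.vstack([Y, fake])
labels = np.concatenate([np.ones(len(Y)), np.zeros(len(fake))])
for _ in range(200):
    p = sigmoid(data @ D_w + D_b)
    D_w -= 0.1 * data.T @ (p - labels) / len(data)
    D_b -= 0.1 * float(np.mean(p - labels))

# 4. Short joint GAN phase: alternate D updates with non-saturating
#    G updates (push D(fake) toward "real").
for _ in range(50):
    fake = X @ G.T
    data = np.vstack([Y, fake])
    p = sigmoid(data @ D_w + D_b)
    D_w -= 0.1 * data.T @ (p - labels) / len(data)
    D_b -= 0.1 * float(np.mean(p - labels))
    p_fake = sigmoid(fake @ D_w + D_b)
    d_fake = -(1.0 - p_fake)[:, None] * D_w[None, :] / len(fake)
    G -= 0.05 * d_fake.T @ X
```

In this toy the linear G fits the data almost exactly during step 1, which is the whole idea: most of the work happens under the stable non-adversarial loss, and the GAN phase in step 4 is kept short.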