I swear there will be a proper pdf with math galore for NoGAN eventually. I do think it has a lot of potential to make training better, so for now we can at least tweet and blog about it! https://twitter.com/remykarem/status/1135183936227749888
Replying to @citnaj
I will get round to trying it at some point! Is it set up as a Jupyter notebook in the DeOldify GitHub?
Replying to @CpnTaters
NoGAN training is, correct. Just make sure to read the instructions: it's not 100% top-down execution.
Replying to @citnaj @CpnTaters
I ran into a few errors but I’m pretty sure it’s because I was doing something wrong. Where’s the best place to post for discussion/pointers once I inevitably get stuck? Github issue?
Replying to @braddwyer @CpnTaters
If it's something you think is useful for others, sure that'd probably be the best place. Otherwise feel free to dm me!
Replying to @citnaj @CpnTaters
I’ve been having trouble getting the repeatable GAN process to run the second time. Still working on investigating, but I’ll follow up later this week.
Replying to @braddwyer @CpnTaters
I carried over a habit from fastai notebooks that may not be great for this case: you have to re-run some cells carefully before doing the repeat. Namely, you'll need to make sure data_gen is initialized at 64px again. That part is dumb and I know it.
Replying to @citnaj @CpnTaters
If I recall correctly it was that when I switched that part from 0 to 1 it wasn’t finding the 0 checkpoint it had tried to save.
Replying to @braddwyer @CpnTaters
Ohh that’s right. So what you need to do is look through all the saved checkpoints for the set of weights that works, select it, and rename it to the name expected here. This is the pain-in-the-ass part of NoGAN at this point. A larger batch size helps a lot with stabilization.
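The checkpoint-selection step above can be sketched as follows. Every filename here is a hypothetical stand-in, not DeOldify's actual naming scheme; substitute whatever your save callback wrote and whatever name the next training cell loads.

```python
import shutil
from pathlib import Path

# Hypothetical checkpoint directory and filenames (assumptions for
# illustration only).
ckpt_dir = Path("models")
ckpt_dir.mkdir(exist_ok=True)

# Stand-in for a checkpoint the GAN round saved (created here so the
# sketch runs end to end).
best = ckpt_dir / "GanLearner_0_epoch2.pth"
best.write_bytes(b"demo-weights")

# Copy (rather than rename) the best-looking checkpoint to the filename
# the next round expects to load, keeping the original around.
target = ckpt_dir / "GanLearner_1.pth"
shutil.copy(best, target)
```

Copying instead of renaming means you can still go back and pick a different checkpoint if the next round destabilizes.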
Replying to @citnaj @CpnTaters
Awesome! I’ll try that. One more question while I’ve got you: why only 1 epoch each on the pre-training? Looks like it’s really underfitting; wondering if that was by design.
Also keep in mind: I use ImageNet (and am adding more datasets). That one epoch now contains 2M pictures.
Replying to @citnaj @CpnTaters
That makes sense; I have a lot less training data right now (although I’m running a script to segment it now, so I should get ~5-10x more). If I wanted to start from your endpoint rather than training from scratch, are those weights available in the repo? Or only the final ones?
Replying to @braddwyer @CpnTaters
Yes the pretrained weights are in the repo as well. That should work well for you.
NoGAN - a transfer learning technique in GANs:
G - generator
D - discriminator
1. Pretrain G using perceptual loss
2. Save generated imgs from pretrained G
3. Pretrain D as binary classifier
4. Train both G & D in a normal GAN setting
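The four steps above can be sketched as a minimal PyTorch training loop. This is an illustrative toy, not DeOldify's implementation: the tiny convnets stand in for the real U-Net generator and critic, plain L1 loss stands in for the VGG-based perceptual loss, and all sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins for the real networks (sizes are arbitrary assumptions).
G = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(8, 3, 3, padding=1))           # grayscale -> color
D = nn.Sequential(nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Flatten(), nn.Linear(8 * 8 * 8, 1))   # real/fake logit

gray  = torch.rand(4, 1, 16, 16)   # toy input batch
color = torch.rand(4, 3, 16, 16)   # toy target batch
ones, zeros = torch.ones(4, 1), torch.zeros(4, 1)

# 1. Pretrain G with a perceptual-style loss (L1 stands in for the
#    feature loss used in practice).
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
for _ in range(5):
    opt_g.zero_grad()
    F.l1_loss(G(gray), color).backward()
    opt_g.step()

# 2. Save generated images from the pretrained G.
with torch.no_grad():
    fakes = G(gray)

# 3. Pretrain D as a binary real-vs-generated classifier.
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
for _ in range(5):
    opt_d.zero_grad()
    (F.binary_cross_entropy_with_logits(D(color), ones) +
     F.binary_cross_entropy_with_logits(D(fakes), zeros)).backward()
    opt_d.step()

# 4. Brief joint GAN training starting from the pretrained weights.
for _ in range(3):
    opt_d.zero_grad()
    d_loss = (F.binary_cross_entropy_with_logits(D(color), ones) +
              F.binary_cross_entropy_with_logits(D(G(gray).detach()), zeros))
    d_loss.backward()
    opt_d.step()

    opt_g.zero_grad()
    g_loss = F.binary_cross_entropy_with_logits(D(G(gray)), ones)
    g_loss.backward()
    opt_g.step()
```

The point of the schedule is that by step 4 both networks start from sensible weights, so the adversarial phase only needs a short, fragile window of training rather than carrying the whole process.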