Here's something a lot of people still don't know: the latest DeOldify doesn't use GANs anymore. And I'm not being cute with terminology: NoGAN isn't used either. We needed something more production-worthy and controllable, and it just wasn't cutting it. 1/
It was a good thing I got past my ego (after some time) and listened to @jeremyphoward and @fastdotai when they said they were getting "better than GAN" results using perceptual loss in super resolution :) 3/
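For readers unfamiliar with the idea: a perceptual loss compares images in the feature space of a fixed network rather than pixel by pixel, which tends to reward perceptually convincing detail instead of blurry pixel-wise averages. This is a minimal sketch of the concept only, not DeOldify's actual training code; the feature extractor `phi` here is a hypothetical stand-in (a fixed random linear map), where a real system would use frozen activations from a pretrained network such as VGG.

```python
import numpy as np

def phi(img, W):
    # Hypothetical stand-in feature extractor: a fixed linear map.
    # In practice this would be frozen activations from a pretrained
    # network (e.g. VGG), kept constant during training.
    return W @ img.ravel()

def perceptual_loss(pred, target, W):
    # Mean squared error in feature space rather than pixel space.
    return float(np.mean((phi(pred, W) - phi(target, W)) ** 2))

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 8 * 8))        # fixed "feature" weights
target = rng.standard_normal((8, 8))        # toy ground-truth image
pred = target + 0.1 * rng.standard_normal((8, 8))  # toy model output

print(perceptual_loss(pred, target, W))     # small but nonzero
print(perceptual_loss(target, target, W))   # exactly zero
```

The training objective then minimizes this feature-space distance (often combined with other terms) instead of relying on an adversarial discriminator.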
I'm trying to straddle the line between maintaining a competitive advantage and giving back to the community that has given us so much. It's tough to make the call sometimes. 4/
End of conversation
New conversation
Yes, same observation we made on image translation, style transfer, and super resolution applications in our BMVC 2019 paper https://arxiv.org/abs/1908.00274
It's a small space of choices. Saying what it's not is almost isomorphic to saying what it is. If it's not GANs, then it's some variant of VQ-VAE. True?
Ha, I have to be careful not to say too much, but I left clues elsewhere.
End of conversation
New conversation
Even knowing there's an alternative out there is very valuable, because it's a big commitment of effort to go looking for one.
Yeah, that's what I figure. Also, it bears repeating: fastai is ahead of the game :)
End of conversation