This is a cool way to build highly specific prior knowledge into neural nets: via highly specific loss functions https://arxiv.org/abs/1609.05566. Don't just maximize the likelihood of the data or predict the next frame; build every constraint you know into the loss and make the NN satisfy them.
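A minimal sketch of that idea (my own illustrative toy, not an example from the paper): alongside the usual data-fitting loss, add a penalty for violating a constraint we know must hold. Here we denoise a falling object's height track, where physics says the discrete second difference of the heights must equal -g*dt².

```python
import numpy as np

# Toy problem: recover a free-fall trajectory from noisy observations.
# Loss = ||h - obs||^2 + lam * ||second_diff(h) + g*dt^2||^2
# The second term is the "constraint" loss: constant acceleration -g.
rng = np.random.default_rng(0)
g, dt, T = 9.8, 0.1, 50
t = np.arange(T) * dt
true_h = 100.0 - 0.5 * g * t**2           # exact free fall
obs = true_h + rng.normal(0.0, 2.0, T)    # noisy observations

h = obs.copy()          # estimate, initialized at the data
lam, lr = 10.0, 0.05    # constraint weight, step size (illustrative)

for _ in range(2000):
    # Gradient of the data term: stay close to observations.
    g_data = 2.0 * (h - obs)
    # Gradient of the constraint term: penalize physics violations.
    resid = h[:-2] - 2.0 * h[1:-1] + h[2:] + g * dt**2
    g_phys = np.zeros_like(h)
    g_phys[:-2] += 2.0 * resid
    g_phys[1:-1] -= 4.0 * resid
    g_phys[2:] += 2.0 * resid
    h -= (lr / T) * (g_data + lam * g_phys)
```

After the loop, `h` is both closer to the true trajectory than the raw observations and much closer to satisfying the physics law; the paper's point is that this kind of constraint term can even replace labels entirely.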
GANs, of course, are all about learning incredibly specific loss functions.
Very much so, and indeed they followed up the above with https://arxiv.org/abs/1805.10561, which uses something like a GAN to learn these kinds of constraints, e.g. from a simulator that obeys the physics we want our NN, trained on real movies, to respect.
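A rough sketch of the adversarial version (an illustrative toy under my own assumptions, not the paper's method): train a discriminator to tell simulator trajectories, which obey physics, from corrupted ones, then use its score as a learned constraint penalty instead of a hand-coded one.

```python
import numpy as np

rng = np.random.default_rng(1)
g, dt, T = 9.8, 0.1, 20

def simulate(n):
    # Physics-obeying free-fall trajectories with random start/velocity.
    t = np.arange(T) * dt
    h0 = rng.uniform(50, 150, (n, 1))
    v0 = rng.uniform(-5, 5, (n, 1))
    return h0 + v0 * t - 0.5 * g * t**2

def feature(traj):
    # Mean squared residual of the constant-acceleration law
    # (a hand-picked 1-D feature to keep the discriminator tiny).
    r = traj[:, :-2] - 2 * traj[:, 1:-1] + traj[:, 2:] + g * dt**2
    return np.mean(r**2, axis=1)

real = simulate(500)                                  # from the simulator
fake = simulate(500) + rng.normal(0, 2.0, (500, T))   # physics-violating

X = np.concatenate([feature(real), feature(fake)])
y = np.concatenate([np.ones(500), np.zeros(500)])     # 1 = obeys physics

# Tiny logistic-regression discriminator D(x) = sigmoid(w*x + b).
w, b, lr = 0.0, 0.0, 0.1
for _ in range(3000):
    z = np.clip(w * X + b, -50, 50)
    p = 1.0 / (1.0 + np.exp(-z))
    w -= lr * np.mean((p - y) * X)
    b -= lr * np.mean(p - y)

def learned_penalty(traj):
    # -log D(traj): large when the discriminator detects a violation.
    z = np.clip(w * feature(traj) + b, -50, 50)
    p = 1.0 / (1.0 + np.exp(-z))
    return -np.log(p + 1e-9)
```

`learned_penalty` can then be added to the main network's training loss, playing the role the hand-written physics term played above, with the discriminator updated adversarially as training proceeds.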
New conversation
+ a specific modular architecture?! Or not?
Maybe yeah.
End of conversation
New conversation
Yes, exactly this!