This is a cool way to build highly specific prior knowledge into neural nets: via highly specific loss functions. https://arxiv.org/abs/1609.05566 Don’t just maximize the likelihood of the data or predict the next frame; build every constraint you can know into the loss and make the NN satisfy them.
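A minimal sketch of that idea in PyTorch, patterned on the falling-object experiment in the linked paper (Stewart & Ermon 2016); the toy HeightNet, the random stand-in frames, and the exact finite-difference penalty are illustrative assumptions, not the paper’s code:

```python
# Label-free training signal: instead of height labels, penalize predicted
# trajectories that violate free-fall kinematics (constant acceleration -g).
import torch
import torch.nn as nn

class HeightNet(nn.Module):
    """Toy stand-in: maps a flattened video frame to a scalar height."""
    def __init__(self, frame_dim=64 * 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(frame_dim, 128), nn.ReLU(),
                                 nn.Linear(128, 1))

    def forward(self, frames):               # frames: (T, frame_dim)
        return self.net(frames).squeeze(-1)  # heights: (T,)

def freefall_constraint_loss(heights, dt=1.0 / 30, g=9.8):
    # Under gravity, the second finite difference of height is -g * dt**2;
    # squared deviation from that constraint is the whole loss. No labels.
    accel = heights[2:] - 2 * heights[1:-1] + heights[:-2]
    return ((accel + g * dt ** 2) ** 2).mean()

model = HeightNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.randn(30, 64 * 64)  # placeholder for one 30-frame clip
for _ in range(100):
    loss = freefall_constraint_loss(model(frames))
    opt.zero_grad()
    loss.backward()
    opt.step()
```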
-
Adam Marblestone Retweeted jovo (@neuro_data), adding:
Related to this conjecture: https://mobile.twitter.com/neuro_data/status/1204038229273518083
-
This is also what @KordingLab and Greg and I meant by highly specific, evolutionarily programmed “bootstrap cost functions”: the objectives the brain would optimize if it does some powerful online optimization during the organism’s lifetime, a la deep learning.
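A hypothetical gloss of that picture (a sketch, not code from the thread or the paper): evolution fixes a small set of specific cost functions and their relative weights, and a generic within-lifetime optimizer simply minimizes the weighted mixture. The MSE placeholders and the particular weights below are assumptions for illustration:

```python
# Innate "bootstrap" costs: the mixture and its weights are fixed (by
# evolution); only the parameters producing `outputs` are learned online.
import torch

def combined_bootstrap_cost(outputs, targets_by_cost, weights):
    """Weighted sum of specific auxiliary costs (MSE placeholders here)."""
    return sum(w * torch.mean((outputs - t) ** 2)
               for w, t in zip(weights, targets_by_cost))

outputs = torch.randn(8, 4, requires_grad=True)   # stand-in for model output
targets = [torch.randn(8, 4) for _ in range(3)]   # e.g. prediction, motion, faces
weights = [1.0, 0.5, 0.1]                         # innate, not learned
loss = combined_bootstrap_cost(outputs, targets, weights)
loss.backward()  # a generic optimizer minimizes the fixed mixture
```
-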
Replying to @AdamMarblestone @KordingLab
You're saying that the brain (or evolution) is also optimising for the "usefulness" of auxiliary objectives?