This is a cool way to build highly specific prior knowledge into neural nets: via highly specific loss functions. https://arxiv.org/abs/1609.05566 Don't just maximize the likelihood of the data or predict the next frame: build every constraint you can know into the loss and make the NN satisfy them.
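The linked paper (Stewart & Ermon, arXiv:1609.05566) makes this concrete with "label-free" supervision: for example, a net tracking a thrown object is trained only on the constraint that its predicted trajectory obey gravity. Below is a minimal sketch of that idea in PyTorch; the second-difference penalty is a simplified variant of the paper's trajectory constraint, and `model` / `frames` are hypothetical stand-ins, not the paper's code.

```python
import torch

def freefall_constraint_loss(heights: torch.Tensor, dt: float, g: float = 9.8) -> torch.Tensor:
    """Physics-as-loss: no labeled heights, only the constraint h''(t) = -g.

    heights: (T,) predicted object heights (measured upward) for T consecutive
    frames spaced dt seconds apart. The discrete second difference approximates
    h''(t) * dt^2, which under free fall should equal -g * dt^2.
    """
    second_diff = heights[2:] - 2 * heights[1:-1] + heights[:-2]
    return ((second_diff + g * dt * dt) ** 2).mean()

# Hypothetical usage: `model` maps each video frame to a scalar height.
# heights = model(frames).squeeze(-1)              # frames: (T, C, H, W) -> (T,)
# loss = freefall_constraint_loss(heights, dt=1.0 / 30)
# loss.backward()                                  # the constraint is the only training signal
```

One caveat: a degenerate predictor that ignores the input can satisfy a constraint like this, so in practice constraint losses are paired with terms that rule out trivial solutions.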
Related to this conjecture, quoting @neuro_data (jovo): https://mobile.twitter.com/neuro_data/status/1204038229273518083
This is also what @KordingLab and Greg and I meant by highly specific, evolutionarily programmed “bootstrap cost functions”: the things the brain would optimize if it does some powerful online optimization during the organism’s lifetime, a la DL.
I think it is also obvious to @tyrell_turing that this is the level of specificity of loss function that evolution could build in for the brain to optimize, if it “does DL”: things like this, but specific to the brain area and developmental stage, and to the specific animal’s ethological needs.
When one speculates about the brain as some kind of DL-type system, it does not mean the architecture or loss function is similar to current systems. To me, it means (a) a powerful optimization algorithm running during the organism’s lifetime, plus (b) many highly specific evolved loss-function recipes like this one.
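To make the shape of this conjecture concrete, here is a toy sketch: an illustration of the claim’s structure only, not of any published model, and every name in it is hypothetical. One generic optimizer runs over the whole “lifetime” while a developmental schedule switches among several highly specific bootstrap cost functions.

```python
import torch

# Hypothetical, highly specific "bootstrap" cost functions (conjecture (b)).
bootstrap_costs = {
    "predict_next_frame": lambda out: ((out["pred_frame"] - out["next_frame"]) ** 2).mean(),
    "orient_to_faces":    lambda out: (1.0 - out["gaze_on_face"]).mean(),
}

# A developmental schedule: which costs are active, and how strongly, per stage.
schedule = {
    "early": {"predict_next_frame": 1.0, "orient_to_faces": 2.0},
    "late":  {"predict_next_frame": 0.5},
}

def lifetime_loss(outputs: dict, stage: str) -> torch.Tensor:
    """The single quantity the generic lifetime optimizer (conjecture (a))
    would descend: a stage-weighted sum of the active bootstrap costs."""
    total = torch.zeros(())
    for name, weight in schedule[stage].items():
        total = total + weight * bootstrap_costs[name](outputs)
    return total

# Dummy usage: at the "early" stage, both costs contribute.
outputs = {
    "pred_frame": torch.randn(3, 8, 8),
    "next_frame": torch.randn(3, 8, 8),
    "gaze_on_face": torch.rand(10),   # fraction of gaze on a face per trial
}
print(lifetime_loss(outputs, "early"))
```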
Replying to @AdamMarblestone @tyrell_turing: + a specific modular architecture?! Or not?
Maybe yeah.