My understanding is that DNNs trained end-to-end with backprop are not robust to removing most or all of a layer, even when there are parallel streams that converge and you dial one of them down.
-
Well, dropout (randomly disabling 15–50% of the neurons in a layer during training) plays an important role in neural net robustness.
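A minimal sketch of what's being described, using "inverted" dropout (the common variant): each unit is zeroed with probability `p` at training time and the survivors are rescaled by `1/(1-p)`, so the expected activation is unchanged. The function name and shapes here are illustrative, not from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit with prob p, rescale survivors by 1/(1-p)."""
    if not training or p == 0.0:
        return x  # at inference time the layer is an identity
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

x = np.ones(1000)
y = dropout(x, p=0.3)
# roughly 30% of entries are zeroed; survivors are scaled to 1/0.7,
# so the mean activation stays near 1.0
```

Note that this perturbs units *within* a layer; as the reply below points out, it does nothing to make the network survive the loss of the entire layer.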
-
True! But dropout doesn't make the architecture robust to losing an entire layer/module.
-
That just tells us there are many distributed nets for specific tasks, plus some association nets to manage their integration. It doesn't say much about how the networks are trained, imho.
-
Agreed: like this https://arxiv.org/abs/1701.06538
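The linked paper is the sparsely-gated mixture-of-experts architecture. A toy sketch of its top-k gating, with single linear layers standing in for the experts (all names and shapes here are illustrative): a gate scores the experts, only the k highest-scoring ones run, and their outputs are mixed by a softmax over those k scores. Experts outside the top k contribute nothing, so removing an unselected expert leaves the output unchanged, which is the kind of modular robustness the thread is discussing.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, expert_weights, gate_weights, k=2):
    """Top-k sparsely gated mixture of experts (linear experts, toy version).

    x: (d,) input
    expert_weights: (n_experts, d, d_out) one weight matrix per expert
    gate_weights: (d, n_experts) gating network
    """
    scores = x @ gate_weights                 # (n_experts,) raw gate logits
    topk = np.argsort(scores)[-k:]            # indices of the k best experts
    gate = np.exp(scores[topk] - scores[topk].max())
    gate /= gate.sum()                        # softmax over the selected k only
    # only the selected experts are ever evaluated
    return sum(g * (x @ expert_weights[i]) for g, i in zip(gate, topk))

d, d_out, n_experts = 8, 4, 6
x = rng.normal(size=d)
experts = rng.normal(size=(n_experts, d, d_out))
gates = rng.normal(size=(d, n_experts))
y = moe_forward(x, experts, gates, k=2)
```

The real model adds noise to the gate logits and a load-balancing loss so that experts are used evenly; this sketch omits both.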
-
Maybe it's not so much the learning rules but rather that a DNN is an inappropriate model of the brain! The brain is a *dynamical system*, not just a feedforward hierarchy...
-
Is this why I get lost whenever it is below 50 degrees? My hippocampus freezes?