This is crazy if true, especially for interactive applications. Never expected this much potential progress this quickly.
-
Just getting close to FNN performance without the "loss of novelty" problem would be significant; this BEATS them tho?!
-
the fact that one of the authors originated LSTM certainly doesn't hurt its chances of being a breakthrough
-
That appendix though...
-
Plans to include it in Keras?
-
Add it to Keras!
-
Maybe a superhuman Relational Network could help us understand the Appendix
-
My SELU experiments in Keras: https://github.com/bigsnarfdude/SELU_Keras_Tutorial
-
So ReLU is still better? ReLU loss after 12 epochs is 0.0275088263036, while SELU never gets this low even after 200 epochs.
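For anyone comparing the two activations in that experiment, here is a minimal NumPy sketch of SELU next to ReLU. The two fixed constants come from the self-normalizing networks paper (Klambauer et al., 2017); everything else here is just an illustrative standalone definition, not the Keras implementation itself.

```python
import numpy as np

# SELU constants derived in the self-normalizing networks paper
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit, applied elementwise."""
    x = np.asarray(x, dtype=float)
    # Positive inputs: slightly scaled identity.
    # Negative inputs: saturate toward -SCALE * ALPHA instead of clipping to 0.
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

def relu(x):
    """Standard rectified linear unit, for comparison."""
    x = np.asarray(x, dtype=float)
    return np.maximum(x, 0.0)

print(selu([-2.0, 0.0, 2.0]))
print(relu([-2.0, 0.0, 2.0]))
```

The key difference visible here is that SELU keeps a nonzero (negative, bounded) response for negative inputs, which is what pushes activations toward zero mean and unit variance across layers; ReLU simply discards that half of the input range.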