Antirectifier -- a way to get rid of activation functions altogether. https://github.com/fchollet/keras/blob/master/examples/antirectifier.py
@_AntreasAntonio haven't compared it to prelu, etc. Just relu.
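For context, a minimal sketch of the idea behind the linked example: instead of an activation that discards negative values, the layer centers and L2-normalizes each sample, then concatenates the positive part with the flipped negative part, so the output is twice as wide as the input. This is written against tf.keras as a paraphrase of the example, not a copy of the file itself.

```python
import tensorflow as tf
from tensorflow import keras

class Antirectifier(keras.layers.Layer):
    """Drop-in replacement for an activation layer: keeps both the
    positive and the negated-negative parts of the signal, so the
    output has twice the feature dimension of the input."""

    def call(self, inputs):
        # Center each sample, then L2-normalize it.
        x = inputs - tf.reduce_mean(inputs, axis=1, keepdims=True)
        x = tf.math.l2_normalize(x, axis=1)
        # Concatenate the two rectified halves instead of zeroing one out.
        pos = tf.nn.relu(x)
        neg = tf.nn.relu(-x)
        return tf.concat([pos, neg], axis=1)
```

Roughly, a `Dense(256)` followed by `Activation('relu')` becomes `Dense(256)` followed by `Antirectifier()`, with the next layer receiving 512 features instead of 256.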
@fchollet interesting! Here are accuracies swapping different activations into the gist: antirectifier 0.9845, prelu 0.9842, relu 0.9838
@fchollet although if I read the train/test output correctly, it looks like these models may all be overfitting