@fchollet this is super cool, will you publish more research on the technique or just this code? i'd love to see a more thorough analysis.
@fchollet i'm also curious if normal relu layers are trying to learn something like this internally sometimes...
New conversation
@fchollet could you give a hint, please, as to what this means? https://github.com/fchollet/keras/blob/master/examples/antirectifier.py#L38
@fchollet my bad, I was thinking about smooth rectifier
End of conversation
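For readers landing on the question above: the antirectifier example replaces a plain ReLU with a layer that zero-centers each sample, L2-normalizes it, and then concatenates the positive and negative rectified parts, so the output is twice as wide as the input. The sketch below is a paraphrase of that idea in the current tf.keras API, not a verbatim copy of the linked file or its line 38; the class name and normalization details are chosen here for illustration.

```python
import tensorflow as tf
from tensorflow import keras


class Antirectifier(keras.layers.Layer):
    """Sketch of the antirectifier idea: keep both signs of the activation."""

    def call(self, inputs):
        # Zero-center each sample across its features.
        inputs = inputs - tf.reduce_mean(inputs, axis=1, keepdims=True)
        # L2-normalize each sample so the two halves are on a comparable scale.
        inputs = tf.math.l2_normalize(inputs, axis=1)
        pos = tf.nn.relu(inputs)    # positive part
        neg = tf.nn.relu(-inputs)   # negative part, which plain ReLU would discard
        # Concatenate both parts: the output has twice the feature dimension.
        return tf.concat([pos, neg], axis=1)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], input_shape[1] * 2)


# Usage: widen a Dense block before the next layer.
inputs = keras.Input(shape=(784,))
x = keras.layers.Dense(256)(inputs)   # 256 features in
x = Antirectifier()(x)                # 512 features out
```

The intuition is that negative activations carry information a plain ReLU throws away; whether that translates into PReLU-level accuracy is exactly the question left open in the replies below.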
New conversation
@phufaz you have the code, run the tests. It takes a few minutes.
@fchollet Interesting! Tried something similar with a 'sign' vector learned (tanh) + 'data' vector (rectify) in another context (recurrent).
@fchollet Wow, gotta try this out. Would you reckon it would produce levels of accuracy similar to PReLU?
@_AntreasAntonio haven't compared it to PReLU, etc. Just ReLU.
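Since the PReLU comparison is left open here, the snippet below is a rough sketch of how one could run it, in the spirit of "run the tests, it takes a few minutes". Everything in it is an assumption made for illustration, not something from the thread: it reuses the hypothetical Antirectifier layer sketched earlier, uses MNIST with a single 256-unit hidden layer, and trains each variant for five epochs.

```python
import tensorflow as tf
from tensorflow import keras

# Assumes the Antirectifier layer from the earlier sketch is already defined in scope.


def build_model(use_antirectifier: bool) -> keras.Model:
    inputs = keras.Input(shape=(784,))
    x = keras.layers.Dense(256)(inputs)
    if use_antirectifier:
        x = Antirectifier()(x)        # 256 -> 512 features
    else:
        x = keras.layers.PReLU()(x)   # learnable negative slope, keeps 256 features
    outputs = keras.layers.Dense(10, activation="softmax")(x)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="rmsprop",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

for use_anti in (False, True):
    model = build_model(use_anti)
    model.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print("antirectifier" if use_anti else "prelu", "test accuracy:", acc)
```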
New conversation
@fchollet Interesting theoretical justification. Seems similar to the idea behind PReLU. I'll be curious to see how their performance compares.
@fchollet I've done this as an input feature transformation before but it never occurred to me to stack them up in a NN. Cool idea!