This should be exactly the same as the setup you describe: https://gist.github.com/fchollet/ac238179b7752d24717368c45a8db2a7. It goes to 1.8-1.7, and down to 1.4 when adding dropout.
-
-
OK, basically this treats the ML hyperparameters as an optimization problem and searches them with genetic algorithms.
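For context, a minimal sketch of what that can look like: hyperparameters encoded as individuals and evolved by selection, crossover, and mutation. The parameter names and the toy fitness function are illustrative placeholders, not taken from the linked project.

```python
# Minimal genetic-algorithm hyperparameter search (illustrative sketch).
import random

# Hypothetical continuous hyperparameter ranges.
PARAM_SPACE = {
    "learning_rate": (1e-4, 1e-1),
    "dropout":       (0.0, 0.5),
    "momentum":      (0.0, 0.99),
}

def random_individual():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in PARAM_SPACE.items()}

def fitness(params):
    # Placeholder: in a real setup this would train a model with `params`
    # and return, e.g., validation accuracy.
    return -(params["learning_rate"] - 0.01) ** 2 - (params["dropout"] - 0.2) ** 2

def crossover(a, b):
    # Uniform crossover: each gene comes from one of the two parents.
    return {k: random.choice([a[k], b[k]]) for k in PARAM_SPACE}

def mutate(ind, rate=0.2):
    # With probability `rate`, resample a gene from its range.
    return {k: (random.uniform(*PARAM_SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(pop_size=20, generations=10, elite=4):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - elite)]
        population = parents + children
    return max(population, key=fitness)

print(evolve())
```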
-
Btw, arguably random forests or gradient boosting are a better fit for black-box ML than NNs.
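As an illustration of that point, a minimal sketch of a "black-box" baseline: scikit-learn's gradient boosting with stock defaults and no tuning. The dataset here is just a stand-in.

```python
# Gradient boosting with default parameters as a no-tuning, black-box baseline
# (illustrative sketch; the dataset is a stand-in).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier()   # stock defaults, no hyperparameter search
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```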
End of conversation
New conversation -
-
-
I want to go the extra mile by only addressing very easy problems for which default parameters can more or less work.
-
-
-
My feeling is that even a basic NN can often discover better patterns in live data than what 90% of users are doing now: guessing.
-
-
-
Yep, and indeed those other ML tools are the ones most stressed by the current ML “products” that I see around.
-
-
-
I just never expected that incremental learning was so hard with NNs. I’m trying the pseudo-patterns approach right now.
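For reference, a minimal sketch of the pseudo-patterns (pseudo-rehearsal) idea: sample random inputs, record the current network's outputs for them, and mix those pairs with the new data so the old mapping keeps being rehearsed while the new one is learned. The Keras model, shapes, and toy targets below are assumptions for illustration, not the actual setup discussed here.

```python
# Pseudo-rehearsal sketch for incremental learning with an NN (illustrative).
import numpy as np
from tensorflow import keras

def make_net(n_inputs, n_outputs):
    model = keras.Sequential([
        keras.Input(shape=(n_inputs,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(n_outputs),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

n_inputs, n_outputs = 4, 1
net = make_net(n_inputs, n_outputs)

# First batch of data (toy targets stand in for a real task).
old_x = np.random.rand(1000, n_inputs)
old_y = old_x.sum(axis=1, keepdims=True)
net.fit(old_x, old_y, epochs=20, verbose=0)

# Instead of keeping old_x around, sample random inputs and record the current
# network's outputs for them: these pairs are the "pseudo-patterns".
pseudo_x = np.random.rand(500, n_inputs)
pseudo_y = net.predict(pseudo_x, verbose=0)

# New data arrives: train on a mix of new data and pseudo-patterns so the old
# mapping is rehearsed while the new one is learned (limits catastrophic forgetting).
new_x = np.random.rand(200, n_inputs)
new_y = new_x.sum(axis=1, keepdims=True)
net.fit(np.vstack([new_x, pseudo_x]), np.vstack([new_y, pseudo_y]),
        epochs=20, verbose=0)
```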
-
-
-
Btw… just an example to make my point less blurry: with MNIST, an error rate of 5% is terrible compared to the state of the art.
-
-
-
Yet you can get a 5% error rate by throwing at it an NN optimized in a completely wrong direction. Now imagine having a tool like that:
-
-
-
people could incrementally feed it data to do regressions and classifications with big but acceptable error rates, in a trivial way.
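To make the "roughly 5% on MNIST with no real effort" point concrete, a minimal sketch: a tiny Keras MLP with essentially default settings and no tuning typically lands somewhere in the low single-digit error range. The exact figure will vary run to run; the model choice below is illustrative.

```python
# Untuned MLP on MNIST (illustrative sketch of "decent error with no real effort").
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)

_, acc = model.evaluate(x_test, y_test, verbose=0)
print(f"test error rate: {100 * (1 - acc):.1f}%")
```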
-
-
-
That’s why I believe there is value here.
-
there is. But making it work even just a little requires *some* prior understanding of the inputs and targets. And that's difficult