I'm going to make a big claim:
Libraries like sklearn (#python), caret (#rstats) and MLJ (#julialang)
are never going to have great APIs for working with models like deep neural networks, or any other kind of model where the hyper-parameters are nontrivial functions. And that is OK.
-
The kind of API you want for things like Random Forests and SVMs has a fixed number of parameters and a fixed set of ways they can be changed. Once your parameters are functions, it is much harder to offer full flexibility. But keeping it simple is good! Not everyone needs the full flexibility.
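A rough sketch of that contrast in Python (the RandomForestClassifier call is real sklearn; everything on the function-valued side is a made-up illustration):

from sklearn.ensemble import RandomForestClassifier

# Fixed, scalar hyper-parameters: easy to declare, clone, and grid-search over.
rf = RandomForestClassifier(n_estimators=200, max_depth=8)

# A function-valued "hyper-parameter": the value is arbitrary code, so the
# library can no longer enumerate, serialize, or sweep over it the same way.
def build_architecture(n_features):
    # hypothetical builder returning a toy layer description
    return [("dense", n_features, 64), ("relu",), ("dense", 64, 1)]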
-
I agree and disagree at the same time. SVMs and Random Forests are simpler models, and this made it easier to conflate different concepts (the model, the training algorithm, the configuration) in a single object. You can't get away with that once things get more complex (see neural networks).
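One way to picture that separation is a minimal Python sketch; all names here are hypothetical, not any library's API:

from dataclasses import dataclass
from typing import Callable

@dataclass
class Config:                      # the configuration: hyper-parameters only
    learning_rate: float
    epochs: int

class Model:                       # the model: learned state, nothing else
    def __init__(self, weights):
        self.weights = weights

def train(build_model: Callable[[], Model], config: Config, data) -> Model:
    # the training algorithm, kept separate from both the model and its config
    model = build_model()
    for _ in range(config.epochs):
        pass                       # update model.weights from data here
    return model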
-
The obvious solution is to allow passing in a function object (lambda) for parameters like the neural architecture. In Python and R this is a no-go, since they need to hit a C backend that doesn't want to do a slow callback into Python/R. Some hope for other languages here.
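A sketch of what that could look like in Python (the estimator and its build_fn parameter are hypothetical; only BaseEstimator is real sklearn, and the comment marks where the expensive callback across the C boundary would sit):

from sklearn.base import BaseEstimator

class CallableNet(BaseEstimator):                 # hypothetical estimator
    def __init__(self, build_fn=None, epochs=10):
        # build_fn is a plain Python callable; a C/C++ backend would have to
        # call back into the interpreter whenever it needs it, which is slow
        self.build_fn = build_fn
        self.epochs = epochs

    def fit(self, X, y):
        # crossing the language boundary: the backend asks Python for the architecture
        self.model_ = self.build_fn(X.shape[1])
        # ... train self.model_ for self.epochs passes ...
        return self

net = CallableNet(build_fn=lambda n_in: [("dense", n_in, 32), ("relu",), ("dense", 32, 1)])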
-
Here I stand with you: it's a technological problem. MLJ, given that it's written in #julialang, could bring some cool innovations to the table, API-wise. There is some movement in @rustlang land as well, but we are quite far from a working prototype.