JAX is a really fun low-level library for ML hacking: https://github.com/google/jax. It's basically NumPy with gradients, and it can compile to XLA for strong GPU/TPU acceleration. It's an ideal fit for researchers who want maximum flexibility when implementing new ideas from scratch.
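For readers skimming the thread, here's a minimal sketch of the "NumPy with gradients" idea, using only the core jax.grad and jax.jit transforms; the loss function is a made-up example, not something from the tweet:

import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Ordinary NumPy-style array code: a tiny squared-error loss.
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# grad returns a new function that computes d(loss)/d(w).
grad_loss = jax.grad(loss)

# jit compiles that function through XLA for GPU/TPU acceleration.
fast_grad = jax.jit(grad_loss)

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([1.0, 2.0])
print(fast_grad(w, x, y))  # gradient of the loss w.r.t. w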
-
Absolutely. In the early days of TF, your first sentence is exactly how we used to describe it :) "TF is a really fun low-level library for ML, but it's not as good as Theano (yet)"
-
I'm excited to see where JAX goes - it's looking great already! What's the status of this at Google? Is it a personal side-project, or something that's likely to stick around for a long time?
-
Yes please! Would be happy to contribute.
-
It’s a great concept, and I’m following it closely. But there are a number of issues that make it lag behind TF, the biggest being the lack of a good vmap function: it’s planned, but many of the operations it needs are yet to be implemented. Maybe the Keras or TF team will help with dev?
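(For context: vmap vectorizes a single-example function over a batch axis automatically. A minimal sketch of what that transform provides, using the jax.vmap API as it exists today; the predict function is a made-up example:)

import jax
import jax.numpy as jnp

def predict(w, x):
    # Written for a single example x of shape (3,).
    return jnp.tanh(jnp.dot(x, w))

w = jnp.ones(3)
batch_x = jnp.zeros((8, 3))

# vmap lifts the per-example function to a batched one,
# mapping over axis 0 of x while keeping w un-batched.
batched_predict = jax.vmap(predict, in_axes=(None, 0))
print(batched_predict(w, batch_x).shape)  # (8,)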
-
