Dynamically built neural networks are more idiomatic in PyTorch than in TensorFlow (even with eager execution), but building them is still too often trial and error... In 5 years, I hope we'll have the same level of maturity in DL as what we have in backends with FP/typed languages...
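(To make "dynamically built" concrete: in define-by-run frameworks the graph is just Python control flow, re-traced on every forward pass. Here's a minimal NumPy sketch of the idea — illustrative names only, not any framework's API.)

```python
import numpy as np

def dynamic_forward(x, weights):
    # The "graph" is ordinary Python control flow: depth and early
    # exits are decided at runtime from the data itself, which is
    # what eager/define-by-run execution makes idiomatic.
    h = x
    for w in weights:
        h = np.tanh(h @ w)
        if np.linalg.norm(h) < 1e-3:  # data-dependent early exit
            break
    return h

rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 4)) * 0.5 for _ in range(5)]
out = dynamic_forward(rng.standard_normal(4), weights)
print(out.shape)  # (4,)
```

A static-graph framework has to express that loop and branch with special graph ops; here it's plain Python, which is also why type-level guarantees are so hard to get.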
-
It clearly is, even though data scientists don't really want to hear about it :D But it's not for tomorrow, and Google chose to push Swift for TensorFlow... let's hope we'll be able to reuse their work on compile-time autodiff in other contexts...