As a result, I will no longer publish my deep learning frameworks overview on a monthly basis. I will switch to a quarterly basis.
This month, I will simply share aggregated GitHub activity over the last month. It is similar to what it was in August-September. pic.twitter.com/NqVMCzQo6m
New conversation
What do you think caused this?
Framework fatigue, and the fact that TensorFlow / Keras are good enough for most people.
End of conversation
New conversation
Agreed. But reproducing results across varying CUDA, cuDNN, OpenCL, etc. versions has me more concerned.
In *theory* that can all be documented on an experiment-by-experiment basis though.
Was it the release of TensorFlow that kicked off the changes?
My experience is that while the framework landscape has indeed settled, the APIs of said frameworks are still changing rapidly.
At least 10 great options to choose from, all doing "linear algebra". Now it's a function of what language and "production options" you need.
It is relatively stable compared to earlier this year, but Paddle is new, so that may change more quickly than the rest.