Why do you expect everyone to transition to TPUs?
They are markedly faster and can handle larger workloads, which lets you iterate faster on deep-learning-based products. You could compare it to the shift from CPU clusters to GPUs 5 years ago.
End of conversation
New conversation
That is bad news for NVIDIA.
Any support for multi-TPU?
When I tried on Colab just a bit ago it said I had 8 TPU cores available.
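For context, a single Cloud TPU in the Colab TPU runtime exposes 8 cores. A minimal sketch of how to see them, assuming the framework in question is TensorFlow 2.x on the standard Colab TPU runtime (the resolver auto-detects the Colab TPU address):

```python
import tensorflow as tf

# Assumes a Colab notebook with the TPU runtime selected;
# the resolver picks up the TPU address from the environment.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Each of the 8 TPU cores appears as a separate logical device.
print(tf.config.list_logical_devices("TPU"))
```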
New conversation
Ah, so it needs additional code to run on the TPU? No wonder I got no improvement in speed when I tried earlier (I just changed the runtime type).
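Right: changing the runtime type only attaches a TPU to the Colab VM; the model keeps running on the CPU unless it is explicitly placed on the TPU. A minimal sketch of the extra code needed, assuming TensorFlow 2.x with tf.keras and tf.distribute.TPUStrategy (the toy model is purely illustrative):

```python
import tensorflow as tf

# Connect to and initialize the Colab TPU.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# The strategy replicates training across all 8 TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)
print("Replicas:", strategy.num_replicas_in_sync)

# Build and compile the model inside the strategy scope so it runs on the TPU;
# without this scope, training silently stays on the VM's CPU.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
```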
Shouldn't it just be transparent to the end user? In the end, all we care about is the tradeoff between wall time to a useful product and cost. Who cares if it's on CPU/GPU/TPU; we just want it fast and cheap.
I think the code should be less verbose.
End of conversation
New conversation
@Mitra_Abhish