If you have implemented a model with a custom `train_step` method, this feature is automatically available for your custom training loop. This is one of a few advantages of implementing custom training loops via `train_step`...
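To make the `train_step` pattern concrete, here is a minimal sketch of a model with a custom `train_step` that still benefits from `steps_per_execution`. The model architecture and loss are illustrative choices, not from the thread; `steps_per_execution` is assumed to be available (it landed in `compile()` in TF 2.4, after being `experimental_steps_per_execution` in 2.3).

```python
import tensorflow as tf
from tensorflow import keras


class CustomModel(keras.Model):
    """A model with a hand-written train_step (illustrative example)."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.loss_fn = keras.losses.MeanSquaredError()
        self.loss_tracker = keras.metrics.Mean(name="loss")

    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)
            loss = self.loss_fn(y, y_pred)
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        self.loss_tracker.update_state(loss)
        return {"loss": self.loss_tracker.result()}

    @property
    def metrics(self):
        # Metrics listed here are reset automatically at each epoch start.
        return [self.loss_tracker]


inputs = keras.Input(shape=(4,))
outputs = keras.layers.Dense(1)(inputs)
model = CustomModel(inputs, outputs)

# Because train_step is a plain function traced into the training graph,
# Keras can pack several invocations of it into one graph execution:
model.compile(optimizer="adam", steps_per_execution=2)
```

Nothing in `train_step` itself has to change; the batching of steps into a single execution happens around it.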
Note that batch-level callbacks will be run in between graph executions (e.g. model checkpointing, progress bar updates...)
To be clear -- you don't need to use `steps_per_execution` every time you use a TPU (or GPU). It depends on your model. For example, a model like ResNet50 with batch size 256 would already be near full utilization and wouldn't benefit from it.
But a small dense model would have very low utilization, and in this case, you can use this argument to get to 100% utilization without changing your batch size. The speedup you get is inversely proportional to the original utilization -- it could be 1.5x or 50x.
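As a sketch of the low-utilization case described above: a small dense model whose per-step compute is cheap, compiled with `steps_per_execution` so that many steps run in one graph execution. The layer sizes and the value of 8 are arbitrary illustrations, not numbers from the thread.

```python
from tensorflow import keras

# A small dense model: each training step is so cheap that per-step
# host/device round-trips dominate, leaving the accelerator mostly idle.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])

# Run 8 training steps per graph execution. Batch size and the math per
# step are unchanged; only the number of host/device round-trips drops.
# Batch-level callbacks then fire in between executions, not every batch.
model.compile(optimizer="adam", loss="mse", steps_per_execution=8)
```

On a TPU or GPU, the larger `steps_per_execution` is, the fewer launches the host issues, which is where the 1.5x-to-50x range mentioned above comes from; the code runs identically (just without the speedup) on CPU.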
End of conversation
New conversation
This Tweet is unavailable.
I believe Google doesn't allow training on distributed TPUs at the moment. You essentially train on a slice of cores (e.g. a v3-32 slice, i.e. 32 cores) from a single TPU rack (which may have 2048 cores).
End of conversation
I don't understand. I've never used Keras on TPUs, but surely people would have complained that Keras was slow unless it was already doing this, right? By "single graph execution", do you mean a single call to `session.run`? (I know TF2 doesn't use sessions. But, conceptually.)
The performance difference is massive. *Of course* multiple steps need to be done in a single `session.run` call. (In fairness, I underestimated just how massive the performance difference was; it's something like 40x.) Surprised Keras might've made this same mistake.
End of conversation
New conversation
Wouldn't it make sense to just make it a default of `compile` to auto-detect if it's running on a TPU machine?
I greatly appreciate your tweets with useful knowledge about Keras and AI in general. I like to learn while I procrastinate on Twitter, and your tweets are among the best. We all need more tweets about the things we can do, and fewer from angry people. https://twitter.com/trylks/status/1286676305744998401