Or just jump directly to the Colab notebook: https://colab.research.google.com/drive/17u-pRZJnKN0gO5XZmq8n5A2bKGrfKEUg
In today's Bengaluru Roadshow, we saw tf_text in action. Oh man, loved it! And this is the reaction of a computer vision guy.


Lol. You must be 'Eager' to execute it.


Thanks for the crash-course! Is there any equivalent to PyTorch's `register_forward_pre_hook` and `register_backward_hook` for layers? For each layer in a model, I want to record layer output gradients and inputs during training.
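(tf.keras has no direct equivalent of those hooks; one common workaround, sketched below with invented toy names like `hidden`, is to expose the intermediate activation as an extra model output and ask a persistent GradientTape for its gradient. This is an illustrative sketch, not an answer from the thread.)

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(32,))
hidden = tf.keras.layers.Dense(64, activation="relu", name="hidden")(inputs)
logits = tf.keras.layers.Dense(10, name="logits")(hidden)
# Expose the intermediate activation as a second model output.
model = tf.keras.Model(inputs, [logits, hidden])

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

x = tf.random.normal((8, 32))          # toy batch
y = tf.zeros((8,), dtype=tf.int32)     # toy labels

# persistent=True lets us call tape.gradient() more than once.
with tf.GradientTape(persistent=True) as tape:
    preds, hidden_act = model(x, training=True)
    loss = loss_fn(y, preds)

weight_grads = tape.gradient(loss, model.trainable_variables)
# Gradient of the loss w.r.t. the hidden layer's output: roughly what
# a PyTorch backward hook registered on that layer would receive.
act_grad = tape.gradient(loss, hidden_act)
del tape  # a persistent tape must be released manually
```

A layer's input is just the previous layer's output, so it can be recorded the same way by exposing it as another model output.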
Why not make a Medium article based on that thread? I'd read it!
Hey Francois, do I need to use tf.function with tf.keras models, since TF 2 is eager by default?
If you're using `fit` or `predict`, no, models are compiled by default. If you're just `__call__`ing your model (manual training loop), then put a tf.function decorator on your training step function.
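A minimal sketch of both cases (toy model and data invented here for illustration):

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.SGD()

x = tf.random.normal((256, 8))
y = tf.random.normal((256, 1))

# Case 1: fit()/predict() -- no tf.function needed, the model is
# compiled under the hood.
model.compile(optimizer="sgd", loss="mse")
model.fit(x, y, epochs=1, verbose=0)

# Case 2: manual training loop -- decorate the step so it runs as a
# compiled graph instead of op-by-op eager execution.
@tf.function
def train_step(xb, yb):
    with tf.GradientTape() as tape:
        loss = loss_fn(yb, model(xb, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

for _ in range(10):
    train_step(x, y)
```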
Hi Francois, any chance you'll update your deep learning book to TF 2.0?
Yes, the response is great. Especially if one would like a functioning Bayesian neural network example in the new TF Probability (based on TF 2.0.0), Google is very welcoming of someone "else" sending in a PR: https://github.com/tensorflow/probability/issues/438
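For context, a rough sketch of the kind of example being asked for, built on `tfp.layers.DenseFlipout`; whether code like this runs cleanly on TF 2.0.0-era TF Probability is exactly what the linked issue is about, and the dataset size below is illustrative:

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Flipout layers learn a posterior distribution over their weights;
# each one adds its KL-divergence term to `model.losses`.
model = tf.keras.Sequential([
    tfp.layers.DenseFlipout(64, activation="relu"),
    tfp.layers.DenseFlipout(10),
])

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()
num_examples = 60000.0  # assumed dataset size, for standard ELBO weighting

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        nll = loss_fn(y, logits)
        kl = tf.add_n(model.losses) / num_examples
        loss = nll + kl  # negative ELBO
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```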