Lots of great answers in this thread -- thanks a lot to everyone who replied!
-
Checkpoint hacking: we had trouble trying to do the inflation trick (converting 2D to 3D), and that ended up being the main factor pushing the project to PyTorch. I think internally at Google there is something called graphter?
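For context, a minimal sketch of the inflation trick in the I3D sense (an assumption about what's meant here): repeat a 2D conv kernel along a new temporal axis and rescale, so a temporally constant input produces the same activations as the 2D network.

```python
# Hedged sketch: "inflating" a 2D conv kernel to 3D, I3D-style.
# Keras Conv2D kernels are (kh, kw, in_ch, out_ch); Conv3D kernels
# are (kt, kh, kw, in_ch, out_ch).
import numpy as np

def inflate_kernel(kernel_2d, time_dim):
    # Repeat the 2D kernel time_dim times along a new leading axis,
    # then divide by time_dim so activations on a temporally constant
    # input match the original 2D checkpoint.
    kernel_3d = np.repeat(kernel_2d[np.newaxis, ...], time_dim, axis=0)
    return kernel_3d / time_dim
```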
-
Can you be more specific?
-
I find it really hard to write graph convolution layers in Keras, so much so that we've developed our own in-house library for it.
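For reference, a minimal sketch of what such a layer can look like as a custom Keras layer (Kipf & Welling-style propagation; normalizing the adjacency matrix is assumed to be done by the caller):

```python
# Hedged sketch of a basic graph convolution as a custom Keras layer:
# H' = activation(A_hat @ H @ W), where A_hat is a pre-normalized
# adjacency matrix supplied alongside the node features.
import tensorflow as tf

class GraphConv(tf.keras.layers.Layer):
    def __init__(self, units, activation="relu", **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.activation = tf.keras.activations.get(activation)

    def build(self, input_shape):
        feat_shape, _ = input_shape  # [(nodes, features), (nodes, nodes)]
        self.w = self.add_weight(
            name="w",
            shape=(feat_shape[-1], self.units),
            initializer="glorot_uniform",
        )

    def call(self, inputs):
        h, a_hat = inputs  # node features, normalized adjacency
        return self.activation(tf.matmul(a_hat, tf.matmul(h, self.w)))

# usage sketch: h2 = GraphConv(16)([node_features, a_hat])
```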
-
Well, super nitpicky as I actually use tf Datasets at work, but I feel like the Torch DataLoader paradigm (as seen at https://pytorch.org/docs/stable/data.html) is the one area where I prefer using Torch.
-
Have you tried using the Sequence class? https://keras.io/api/utils/python_utils/#sequence-class
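For anyone landing here, a minimal Sequence sketch (closely following the keras.io docs linked above); it plays roughly the role of a PyTorch DataLoader, with fit() handling shuffling and parallel loading:

```python
import math
import tensorflow as tf

class ArraySequence(tf.keras.utils.Sequence):
    def __init__(self, x, y, batch_size=32):
        self.x, self.y, self.batch_size = x, y, batch_size

    def __len__(self):
        # Number of batches per epoch.
        return math.ceil(len(self.x) / self.batch_size)

    def __getitem__(self, idx):
        # Return batch number `idx`; fit() can call this from workers.
        lo = idx * self.batch_size
        return self.x[lo:lo + self.batch_size], self.y[lo:lo + self.batch_size]

# e.g. model.fit(ArraySequence(x_train, y_train), epochs=3)
```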
-
Being able to use different learning rates for different layers w/o gradient tape. (I like PyTorch’s implementation of this feature)
-
You should be able to do this straightforwardly in a custom train_step: use one optimizer per layer, iterate over the layers, and do e.g. `opt[i].apply_gradients(zip(grads[slice_i], layers[i].trainable_weights))`
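A minimal sketch of that suggestion, assuming TF 2.x-era tf.keras (where `compiled_loss`/`compiled_metrics` are available); the class and method names are made up for illustration:

```python
import numpy as np
import tensorflow as tf

class PerLayerLRModel(tf.keras.Sequential):
    def set_layer_optimizers(self, learning_rates):
        # One optimizer per layer; order is assumed to match self.layers.
        self.opts = [tf.keras.optimizers.Adam(lr) for lr in learning_rates]

    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)
            loss = self.compiled_loss(y, y_pred)
        grads = tape.gradient(loss, self.trainable_weights)
        # Walk the flat gradient list layer by layer and let each
        # layer's optimizer apply its own slice.
        i = 0
        for opt, layer in zip(self.opts, self.layers):
            n = len(layer.trainable_weights)
            opt.apply_gradients(zip(grads[i:i + n], layer.trainable_weights))
            i += n
        self.compiled_metrics.update_state(y, y_pred)
        return {m.name: m.result() for m in self.metrics}

model = PerLayerLRModel([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1),
])
model.set_layer_optimizers([1e-3, 1e-4])
model.compile(loss="mse")  # compile's optimizer is unused; train_step is overridden
model.fit(np.random.rand(64, 8), np.random.rand(64, 1), epochs=1, verbose=0)
```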
-
Simple quantisation flow AFTER training, in order to run inference on the latest Tensor Core GPUs, but without moving to TFLite.
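As an aside, one existing route in this direction is TF-TRT, which converts a SavedModel to reduced-precision TensorRT segments for Tensor Core inference without going through TFLite. A hedged sketch, assuming a TF build with TensorRT support (the paths are hypothetical):

```python
import tensorflow as tf

# Convert an existing SavedModel to run FP16 TensorRT segments after
# training (INT8 additionally needs a calibration input function).
params = tf.experimental.tensorrt.ConversionParams(precision_mode="FP16")
converter = tf.experimental.tensorrt.Converter(
    input_saved_model_dir="my_saved_model",
    conversion_params=params,
)
converter.convert()
converter.save("my_saved_model_trt")
```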
-
It would be great to see use-case examples for the functional API. It's hard to find examples and understand which kinds of problems the functional API addresses best. Thanks!
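A small illustrative sketch of the kind of problem the functional API handles well and a plain Sequential model can't express: multiple inputs merged into one graph (the names here are made up):

```python
import tensorflow as tf

# Two inputs feeding one shared graph with a single output head.
title = tf.keras.Input(shape=(64,), name="title")
body = tf.keras.Input(shape=(128,), name="body")
x = tf.keras.layers.concatenate([
    tf.keras.layers.Dense(32, activation="relu")(title),
    tf.keras.layers.Dense(32, activation="relu")(body),
])
priority = tf.keras.layers.Dense(1, name="priority")(x)
model = tf.keras.Model(inputs=[title, body], outputs=priority)
model.compile(optimizer="adam", loss="mse")
```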