In your opinion, what's a feature that Keras is missing, or otherwise something that Keras does poorly at this time? Be extremely specific (precise workflow, API signature). Thanks!
Replying to @fchollet
Being able to use different learning rates for different layers w/o gradient tape. (I like PyTorch’s implementation of this feature)
Replying to @riels89
You should be able to do this straightforwardly in a custom train_step: just use one optimizer per layer, iterate over the layers, and apply e.g. `opt[i].apply_gradients(zip(grads[slice_i], layers[i].trainable_weights))`
7:19 PM - 1 Jan 2021
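The suggestion above can be sketched as a small runnable example. Everything beyond the one-liner in the tweet is an assumption: the class name `PerLayerLRModel`, the two-layer architecture, the choice of SGD, and the learning rates are all illustrative, not from the thread.

```python
# Minimal sketch of per-layer learning rates via a custom train_step,
# following the per-layer-optimizer idea from the tweet.
# PerLayerLRModel, the architecture, and the learning rates are
# illustrative assumptions.
import tensorflow as tf
from tensorflow import keras


class PerLayerLRModel(keras.Model):
    def __init__(self, layer_lrs, **kwargs):
        super().__init__(**kwargs)
        self.dense1 = keras.layers.Dense(16, activation="relu")
        self.dense2 = keras.layers.Dense(1)
        # One optimizer per layer, each with its own learning rate.
        self.layer_opts = [keras.optimizers.SGD(lr) for lr in layer_lrs]
        self.loss_fn = keras.losses.MeanSquaredError()

    def call(self, x):
        return self.dense2(self.dense1(x))

    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)
            loss = self.loss_fn(y, y_pred)
        layers = [self.dense1, self.dense2]
        # Compute gradients for all trainable weights at once,
        # then slice them out layer by layer.
        all_vars = [v for layer in layers for v in layer.trainable_weights]
        grads = tape.gradient(loss, all_vars)
        start = 0
        for opt, layer in zip(self.layer_opts, layers):
            n = len(layer.trainable_weights)
            # Apply this layer's gradient slice with its own optimizer.
            opt.apply_gradients(
                zip(grads[start:start + n], layer.trainable_weights)
            )
            start += n
        return {"loss": loss}


model = PerLayerLRModel(layer_lrs=[1e-2, 1e-3])
x = tf.random.normal((8, 4))
y = tf.random.normal((8, 1))
logs = model.train_step((x, y))
```

Keeping the optimizers in a list and slicing one flat gradient list mirrors the `grads[slice_i]` indexing in the tweet; you could equally call `tape.gradient` once per layer, at the cost of extra bookkeeping on the tape.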