Symbolic gradients (tf1) I like it when my for-loop does one thing at a time.
Replying to @by_niyi
It's very concise, but it does introduce a bit of mystery as to what computation is being differentiated: you specify the end point of the computation (the loss), but not the starting point, which is implicit.
10:31 AM - 6 Aug 2021
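The reply's point, that you name only the loss and everything upstream of it is implicit, can be sketched with a toy reverse-mode pass in plain Python. This is a hedged illustration of the TF1-style `gradients(loss, wrt)` interface, not TensorFlow's actual implementation; the `Node`, `add`, and `mul` helpers are invented for the sketch:

```python
# Toy symbolic-gradient sketch: gradients(loss, wrt) walks backward
# from the loss node, so the extent of the differentiated computation
# is determined implicitly by whatever feeds into the loss.

class Node:
    def __init__(self, value, parents=()):
        self.value = value
        # (parent_node, local_gradient) pairs recorded at build time
        self.parents = list(parents)

def add(a, b):
    return Node(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def gradients(loss, wrt):
    # Reverse sweep starting from the loss; only the loss is named,
    # the starting points of the computation are discovered implicitly.
    # (Simplified: assumes the graph is a tree, no fan-out.)
    grads = {loss: 1.0}
    stack = [loss]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            grads[parent] = grads.get(parent, 0.0) + grads[node] * local
            stack.append(parent)
    return [grads.get(v, 0.0) for v in wrt]

# loss = w * x + b; we only point at the loss, never at the inputs.
w, x, b = Node(3.0), Node(2.0), Node(1.0)
loss = add(mul(w, x), b)
print(gradients(loss, [w, b]))  # → [2.0, 1.0]
```

The mystery the reply describes is visible here: `gradients` never receives the inputs of the computation, only its end point, and recovers the rest by traversing the recorded graph.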