ML people: in the nn-nlp primer I say that the loss function is always positive or zero, and a reviewer questions this assertion.
@yoavgo it makes more sense to negate the quantity being maximized than to flip the sign of the gradient updates
5:30 PM - 19 Mar 2016
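The reply's point can be sketched in a few lines: instead of running gradient *ascent* on a log-likelihood objective (by flipping the sign of the updates), negate the objective itself and minimize the resulting negative log-likelihood, which is always positive or zero for probabilities in (0, 1]. This is a minimal illustration, not code from the primer; the helper name `nll` is hypothetical.

```python
import math

def nll(prob_of_correct_class):
    # The model's objective is to maximize log p; we negate it and
    # minimize instead. Since 0 < p <= 1, log p <= 0, so -log p >= 0,
    # which is why such a loss is always positive or zero.
    return -math.log(prob_of_correct_class)

# Loss shrinks toward 0 as the model assigns more probability
# to the correct class, and grows without bound as p -> 0.
losses = [nll(p) for p in (0.1, 0.5, 0.9, 1.0)]
```

Minimizing `nll` with ordinary gradient descent is then exactly equivalent to maximizing the original likelihood, with no sign-flipped updates needed.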