Badly-formed LSTM Nietzsche: "preacious of self-mentifed to the god that it no love and so morality and souls" - a few more epochs necessary...
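For context, the tweet appears to refer to a character-level LSTM trained on Nietzsche's writings (in the spirit of the Keras lstm_text_generation example). Below is a minimal sketch of such a setup, assuming that kind of pipeline; the corpus path, window length, step, and layer sizes are illustrative, not the author's actual configuration.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Hypothetical corpus file; any plain-text corpus works the same way.
text = open("nietzsche.txt").read().lower()
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Cut the corpus into overlapping windows of maxlen characters.
maxlen, step = 40, 3
sentences = [text[i:i + maxlen] for i in range(0, len(text) - maxlen, step)]
next_chars = [text[i + maxlen] for i in range(0, len(text) - maxlen, step)]

# One-hot encode inputs (sequences, maxlen, vocab) and next-char targets (sequences, vocab).
x = np.zeros((len(sentences), maxlen, len(chars)), dtype=bool)
y = np.zeros((len(sentences), len(chars)), dtype=bool)
for i, sentence in enumerate(sentences):
    for t, c in enumerate(sentence):
        x[i, t, char_to_idx[c]] = 1
    y[i, char_to_idx[next_chars[i]]] = 1

# A single LSTM layer followed by a softmax over the character vocabulary.
model = Sequential([
    LSTM(128, input_shape=(maxlen, len(chars))),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="rmsprop")
```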
@quantombone 2-3 hours on an EC2 GPU. You can tell when the training has converged simply by monitoring the training loss against test data.
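A minimal sketch of that convergence check, continuing from the model, x, and y defined in the sketch above: hold out part of the data and watch the held-out loss alongside the training loss, stopping once it stops improving. The validation split, patience, and epoch count here are assumptions for illustration, not values from the conversation.

```python
from keras.callbacks import EarlyStopping

# Hold out 10% of the sequences; stop once held-out loss fails to improve
# for two consecutive epochs.
history = model.fit(
    x, y,
    batch_size=128,
    epochs=60,
    validation_split=0.1,
    callbacks=[EarlyStopping(monitor="val_loss", patience=2)],
)

# Compare the curves: training loss keeps falling, while the held-out loss
# flattens out around convergence.
print("final train loss:", history.history["loss"][-1])
print("best held-out loss:", min(history.history["val_loss"]))
```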
@fchollet just curious, but what do you think that estimate would translate to in CPU hours?
@quantombone Maybe 4-5x more in this case? It's hard to tell...