The similarity between writing and training a neural net is striking. Start with random weights (a.k.a. a bad first draft), then apply gradient descent until the loss plateaus, or until you can't bear it any more...
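The loop the tweet alludes to can be sketched minimally. This is an illustrative toy, not anyone's actual training code: a one-parameter quadratic loss, a hand-written gradient, and a plateau check on the change in loss (all three are assumptions for the sketch):

```python
import random

def train(lr=0.1, tol=1e-6, max_steps=10_000):
    """Gradient descent on a toy loss f(w) = (w - 3)^2 until it plateaus."""
    w = random.uniform(-10, 10)      # random weights: the "bad first draft"
    prev_loss = float("inf")
    for step in range(max_steps):
        loss = (w - 3) ** 2
        if prev_loss - loss < tol:   # plateaued (or we can't bear it any more)
            break
        grad = 2 * (w - 3)           # hand-coded derivative of the toy loss
        w -= lr * grad               # one gradient-descent "revision"
        prev_loss = loss
    return w, step

w, steps = train()
```

With a real model the gradient comes from backpropagation rather than a closed form, but the drafting rhythm (evaluate, nudge, repeat, stop when progress stalls) is the same.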
-
I see the similarity too. But also, the more I understand neural networks, the fewer iterations I need. The gradient descent seems to move more and more into the brain ;-)