To be learnable, a problem has to be stable. Therefore in a stacked ML architecture, the lower levels should evolve on a slower timescale.
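One way to read this tweet in practice is per-layer learning-rate scaling: lower layers get a smaller step size, so they present a stable target to the layers stacked on top. A minimal NumPy sketch of that idea (the layer names and the 100x scale factor are illustrative assumptions, not anything from the tweet):

```python
import numpy as np

# Sketch: in a stacked model, scale down the learning rate of the lower
# level so it evolves slowly relative to the upper level.
rng = np.random.default_rng(0)
layers = {"lower": rng.standard_normal((4, 4)),
          "upper": rng.standard_normal((4, 4))}
base_lr = 0.1
lr_scale = {"lower": 0.01, "upper": 1.0}  # lower level evolves ~100x slower

def sgd_step(layers, grads):
    # Plain SGD, with a per-layer multiplier on the step size.
    for name, w in layers.items():
        w -= base_lr * lr_scale[name] * grads[name]

grads = {name: np.ones_like(w) for name, w in layers.items()}
before = {name: w.copy() for name, w in layers.items()}
sgd_step(layers, grads)
# Update magnitude on "lower" is 1% of the update on "upper".
```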
@stephenroller both. I use Torch7 for CNNs, Theano for low-dim problems, and currently numpy+pypy to write my own stuff (large sparse space)
@fchollet Experiences with Theano? Sometimes I love it, but I find debugging to be a nightmare.
@stephenroller @fchollet Debugging takes time, but the automagic gradient is amazing, especially in strange architectures. Multilayer RNN...
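The "automagic gradient" refers to automatic differentiation: Theano derives gradients from a symbolic graph (via `theano.tensor.grad`), so even odd architectures get correct gradients for free. A toy illustration of the underlying idea using forward-mode autodiff with dual numbers; this `Dual` class is an assumption for illustration, not Theano's API:

```python
# Forward-mode automatic differentiation with dual numbers: carry a value
# and its derivative through every operation, applying the chain rule.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot  # value and derivative

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # f'(x) = 6x + 2

x = Dual(2.0, 1.0)  # seed the input's derivative with 1
y = f(x)
# y.val is f(2) = 17.0; y.dot is f'(2) = 14.0, with no hand-written gradient
```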