Is "learning from humans" the new "layer-wise pre-training"?
-
True — instead of hard-coding high-level representations, evolution "decided" it was better to pre-wire us with something more powerful 1/3
-
An optimizer that lets the model independently learn high-level representations. Or maybe the optimizer IS the model, or vice versa 2/3