AI doesn’t replicate. Having worked in the field, I can usually see why a paper’s result is nonsense, but the public can’t, and many researchers can’t. https://twitter.com/stephaniemlee/status/964612382650646529
This turned out not to be true in any interesting sense: you can compute XOR with a feedforward network, but backprop won’t learn it reliably, nor in a reasonable length of time. It has to get lucky. 2/
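The luck-dependence is easy to demonstrate. Below is a minimal sketch (not the historical experiments): a 2-hidden-unit sigmoid network trained on XOR with plain batch backprop, repeated across random seeds. Some initializations converge; others settle into a flat region or local minimum and never classify all four patterns correctly. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(seed, hidden=2, lr=0.5, epochs=5000):
    """Train a tiny sigmoid MLP on XOR with plain batch backprop.
    Returns True if all four patterns end up classified correctly.
    (Hyperparameters are illustrative, not from the original thread.)"""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0, 1, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)               # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)    # backprop: output deltas
        d_h = (d_out @ W2.T) * h * (1 - h)     # backprop: hidden deltas
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0)
    return bool(np.all((out > 0.5) == (y > 0.5)))

# Whether training succeeds depends on the random initialization:
# rerun with different seeds and typically only a fraction of runs
# learn the full XOR mapping.
results = [train_xor(seed) for seed in range(20)]
print(sum(results), "of", len(results), "runs learned XOR")
```

The point of the sketch is that nothing in the update rule guarantees escape from bad initializations; success is a property of where gradient descent happens to start.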
My recollection is that other people figured this out a few years later, and that mostly killed off backprop research until ~2012. My memory of the details is vague, however. There were a few other problems, but XOR and RL were the ones that seemed most significant. 3/3