Math is hard for neural networks, particularly multiplication. New benchmarks, problematic performance: the perils of trying to learn things that might more sensibly be built in. https://arxiv.org/abs/1910.01888
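A minimal sketch of the kind of failure being alluded to, assuming a plain MLP regressor trained on products of numbers in [0, 2] and then evaluated on [2, 4]; this is only an illustration of poor arithmetic extrapolation, not the benchmark from the linked paper:

```python
# Illustrative sketch only: fit a small MLP on x*y for inputs in [0, 2],
# then test on [2, 4]. It typically approximates well in-range but
# degrades badly outside the training range.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(),
                      nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def batch(lo, hi, n=256):
    # random pairs (x, y) in [lo, hi] and their product as the target
    xy = torch.rand(n, 2) * (hi - lo) + lo
    return xy, (xy[:, 0] * xy[:, 1]).unsqueeze(1)

for step in range(5000):
    xy, target = batch(0.0, 2.0)
    loss = nn.functional.mse_loss(model(xy), target)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    in_xy, in_t = batch(0.0, 2.0, 2048)
    out_xy, out_t = batch(2.0, 4.0, 2048)
    print("MSE in range:    ", nn.functional.mse_loss(model(in_xy), in_t).item())
    print("MSE out of range:", nn.functional.mse_loss(model(out_xy), out_t).item())
```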
Replying to @GaryMarcus
anecdata: I once trained char-rnn on hundreds of thousands of autogenerated parody postmodernist articles (http://www.elsewhere.org/journal/pomo/). It was pretty good and even numbered the footnotes, but it never got the idea that the footnotes needed to be ordered...
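For context, a minimal sketch of that kind of character-level RNN setup, assuming a PyTorch LSTM and a hypothetical corpus file pomo_corpus.txt; the file name and hyperparameters are illustrative assumptions, not the original setup:

```python
# Minimal char-rnn sketch: predict the next character of a text corpus.
import torch
import torch.nn as nn

text = open("pomo_corpus.txt").read()  # hypothetical dump of generated articles
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.rnn(self.embed(x), state)
        return self.head(h), state

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=2e-3)
seq_len, batch_size = 128, 32

for step in range(2000):
    # sample random subsequences; the target is each sequence shifted by one char
    starts = torch.randint(0, len(data) - seq_len - 1, (batch_size,)).tolist()
    x = torch.stack([data[s:s + seq_len] for s in starts])
    y = torch.stack([data[s + 1:s + seq_len + 1] for s in starts])
    logits, _ = model(x)
    loss = nn.functional.cross_entropy(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```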
Replying to @danbri
best phrase i ever accidentally invented: self-modern post-parody.
11:13 AM - 11 Oct 2019