For the record, RNNs are Turing-complete and can implement any function: https://stats.stackexchange.com/questions/220907/meaning-and-proof-of-rnn-can-approximate-any-algorithm https://twitter.com/fchollet/status/1010278944782708737
-
These claims, although technically true, are profoundly misleading. Brainfuck is a better Turing-complete substrate for automatic programming than a Conv-LSTM (note that a vanilla RNN is not Turing-complete, btw).
-
(Although, it would be pretty shocking if you could *not* come up with simple NN architectures that were Turing-complete -- Turing completeness is an extremely common property for any non-trivial data processing system.)
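As an illustration of the point above (that recurrence with fixed weights can encode computation in principle, which says nothing about learnability or practicality), here is a minimal sketch of a hand-wired recurrent cell with hard-threshold activations that computes the running parity of a bit string; the weights and names are my own illustrative choices, not anything from the thread:

```python
import numpy as np

def step(z):
    # hard threshold activation: 1 if z > 0, else 0
    return (z > 0).astype(float)

# hand-set weights (illustrative, not learned):
# hidden layer computes OR and AND of (state, input);
# output layer computes OR - AND, i.e. XOR -- the parity update
W1 = np.array([[1.0, 1.0],   # OR unit over (h, x)
               [1.0, 1.0]])  # AND unit over (h, x)
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -1.0])
b2 = -0.5

def parity_rnn(bits):
    h = 0.0  # hidden state: parity seen so far
    for x in bits:
        u = step(W1 @ np.array([h, x]) + b1)
        h = step(W2 @ u + b2)  # h <- XOR(h, x)
    return int(h)

print(parity_rnn([1, 0, 1, 1]))  # three 1s -> odd parity -> 1
```

The cell is a valid recurrent network in the formal sense, yet nothing here was "programmed automatically": the weights were derived by hand, which is exactly the gap between Turing-completeness-in-principle and usefulness as a programming substrate.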
-
Is it smart to approximate, with large-scale convolutional networks, temporal information that can inherently be taken into account via RNNs?