The generic form of a model is not a neural network, but a set of parameters (each ranging over a set of possible values) with arbitrary computational relationships between them. Much of machine learning may move from the current types of neural networks to general computational graphs.
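(Sketch, not part of the original thread: the "model" below is an arbitrary differentiable program over a few parameters, with no layer structure. The function `program`, its parameters, and the toy fitting target are all made up for illustration.)

```python
import jax
import jax.numpy as jnp

# An arbitrary differentiable program: parameters related by ad-hoc
# computations (products, trig, a branch via jnp.where), not layers.
def program(params, x):
    a, b, c = params
    y = jnp.sin(a * x) + b * x**2
    return jnp.where(y > 0, y * c, y / (1 + c**2))

def loss(params, xs, targets):
    preds = jax.vmap(lambda x: program(params, x))(xs)
    return jnp.mean((preds - targets) ** 2)

# Gradient descent works on any such computational graph,
# not just on a stack of neural-network layers.
params = jnp.array([0.5, -0.2, 1.0])
xs = jnp.linspace(-1.0, 1.0, 32)
targets = jnp.tanh(xs)            # toy fitting target
grad_fn = jax.grad(loss)
for _ in range(200):
    params = params - 0.1 * grad_fn(params, xs, targets)
```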
-
It's more like: forcing your program to be written in terms of matrix operations also forces you to write it in a well-parallelizable way. This is why general differentiable programs are far less common than programs that map well onto SIMD/SPMD hardware. Even RNNs don't parallelize as well, which is part of why alternatives have displaced them.
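(Illustration, not from the thread: the sequential recurrence below has to run one timestep at a time, while the matrix formulation handles the whole sequence in one operation that SIMD/SPMD hardware can parallelize. Shapes and values are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 128, 64                        # sequence length, hidden size
xs = rng.standard_normal((T, d))
W = rng.standard_normal((d, d)) * 0.1

# Sequential recurrence: each step depends on the previous one,
# so the T steps cannot run in parallel.
h = np.zeros(d)
for t in range(T):
    h = np.tanh(W @ h + xs[t])

# Matrix-style formulation (no recurrence): one big matmul over the
# whole sequence, which maps directly onto SIMD/SPMD hardware.
ys = np.tanh(xs @ W.T)
```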
-
Note that computation graphs are an incidental implementation detail, not something fundamental. "Graph networks", where the graph is a different kind of object from a computation graph, still prefer that the underlying representation be expressed as formulas with a natural matrix interpretation.
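(Sketch under my own assumptions, not the thread's code: one message-passing step of a graph network, where the adjacency matrix A describes the data graph rather than the computation graph, still reduces to matrix products.)

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, d_in, d_out = 5, 8, 4
A = (rng.random((n_nodes, n_nodes)) < 0.4).astype(float)  # data graph, not the computation graph
X = rng.standard_normal((n_nodes, d_in))                   # node features
W = rng.standard_normal((d_in, d_out)) * 0.1

# One message-passing step written as a formula with a natural
# matrix interpretation: aggregate neighbours, then transform.
deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
H = np.tanh((A @ X) / deg @ W)   # mean aggregation + linear map + nonlinearity
```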
-
Genetic algorithms can train any model, but they are slow.
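(Minimal illustrative sketch, not anything referenced in the tweet: a genetic algorithm only needs black-box evaluations of the model, so it can optimize non-differentiable parameters, but it spends many evaluations per improvement. The quadratic fitness function is a stand-in for any model evaluation.)

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(params):
    # Any black-box model evaluation works here; no gradients required.
    return -np.sum((params - 3.0) ** 2)

pop = rng.standard_normal((50, 10))          # population of parameter vectors
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]  # keep the fittest
    # Offspring = random parent + Gaussian mutation (no crossover, for brevity).
    idx = rng.integers(0, len(parents), size=50)
    pop = parents[idx] + 0.1 * rng.standard_normal((50, 10))

best = pop[np.argmax([fitness(p) for p in pop])]
```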