They were my first introduction to any machine learning techniques, picked up totally on a whim. They're mind-expanding in the way that LISP or calculus is! https://twitter.com/kareem_carr/status/1104422318019887105
-
Also, neural nets enforce sparseness: each neuron is a sub-solution (some combination of the neurons, i.e. sub-solutions, from the prior layer), and since the number of neurons in a layer is bounded, the number of sub-solutions is bounded too. Genetic algos impose less of this structure.
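A toy sketch of that point, assuming a plain dense layer with a ReLU nonlinearity (the layer shape and all names here are illustrative, not from the thread): each output neuron is one "sub-solution" built from every prior-layer output, and the layer's width caps how many such sub-solutions exist.

```python
import random

def dense_layer(inputs, weights, biases):
    # Each neuron is one "sub-solution": a weighted combination of
    # every output from the prior layer, passed through a ReLU.
    return [max(0.0, sum(w * x for w, x in zip(neuron_w, inputs)) + b)
            for neuron_w, b in zip(weights, biases)]

random.seed(0)
prior = [0.5, -1.2, 3.0]   # outputs of the prior layer
width = 4                  # bound on neurons = bound on sub-solutions
weights = [[random.gauss(0, 1) for _ in prior] for _ in range(width)]
biases = [0.0] * width

out = dense_layer(prior, weights, biases)
# The layer can never produce more than `width` sub-solutions.
```

A genetic algorithm's genome, by contrast, has no comparable per-layer cap on how many partial solutions it carries forward.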
-
But it's somehow less fun! (I want to replicate this for pleasure sometime:) https://arxiv.org/pdf/1712.06567.pdf
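The linked paper trains deep networks with a simple mutation-only genetic algorithm. A minimal toy sketch of that style of GA (truncation selection plus Gaussian mutation, no crossover) is below; the population sizes, sigma, and the quadratic toy fitness are all my own illustrative choices, not the paper's hyperparameters.

```python
import random

def evolve(fitness, dim=5, pop_size=20, parents=5, sigma=0.1,
           gens=200, seed=0):
    """Toy mutation-only GA: keep the top `parents` genomes each
    generation, refill the population with Gaussian-perturbed copies."""
    rng = random.Random(seed)
    pop = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:parents]  # truncation selection; elites survive intact
        pop = elite + [[g + rng.gauss(0, sigma) for g in rng.choice(elite)]
                       for _ in range(pop_size - parents)]
    return max(pop, key=fitness)

# Maximize a toy fitness whose optimum is the zero vector; in the paper's
# setting the genome would instead encode neural-network weights.
best = evolve(lambda genome: -sum(g * g for g in genome))
```

Because elites are copied forward unmutated, the best fitness is monotone non-decreasing across generations.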