The most remarkable thing about the brain is that it wasn't designed. As a result, how the brain works is largely an irrelevant question.
Would you consider inspecting the weight values of a convnet a good way to understand how to train new models on new problems?
What matters in this case is the training data, hypothesis space, objective, and optimization algorithm. Weight values are a distraction.
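A minimal sketch of those four ingredients, assuming a TensorFlow/Keras setup; the toy dataset, layer sizes, and hyperparameters below are placeholders, not anything taken from the conversation:

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    # Training data: a toy binary-classification set (placeholder values).
    x_train = np.random.rand(256, 32).astype("float32")
    y_train = (x_train.mean(axis=1) > 0.5).astype("float32").reshape(-1, 1)

    # Hypothesis space: the architecture fixes which functions are reachable.
    model = keras.Sequential([
        keras.Input(shape=(32,)),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])

    # Objective and optimization algorithm: loss plus optimizer complete the setup.
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=1e-3),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )

    # The weight values fall out of these four choices; they are the output of
    # the process, not a recipe for training on a new problem.
    model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)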
New conversation
Evolution didn't take full advantage of gradients, and it certainly didn't have access to 10,000 NVIDIA GPUs.
Evolution cares about energy! GPUs are antithetical to that. Humans tend to think imprecision just means less energy, but it's more than that: noise also relaxes free-energy demands.
End of conversation
New conversation
A few decades? Sure, if you ignore all those centuries of basic math and physics research, not to mention other fields.
Evolution didn't come up with modern CPUs because of energy constraints. It's not just the obvious use of imprecise components; it also exploits noise toward free-energy stores.
It would exploit noisy dynamics purposely, as a way to maintain a higher free-energy budget or to reduce free-energy requirements (the reverse of bit erasure).
End of conversation
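For scale on the bit-erasure point in the conversation above (assuming it refers to Landauer's bound, which is an interpretation rather than something stated in the thread), a back-of-the-envelope calculation; the temperature is an illustrative value near body temperature:

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)
    T = 310.0           # kelvin, roughly human body temperature (illustrative)

    # Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2).
    landauer_limit = K_B * T * math.log(2)
    print(f"Minimum energy to erase one bit at {T} K: {landauer_limit:.3e} J")
    # ~2.97e-21 J per erased bit; avoiding erasure (reversible operation) avoids
    # this particular dissipation floor, which is one reading of the tweet above.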
New conversation
Evolution (of the brain) wouldn't always favor optimization, or would it?