There's a major difference: DL was guided in its infancy by ideas from neuroscience, so there is a relatively direct link between them. In contrast, the application of quantum mechanics to the c-word is taking two distinct fields and tying them together on speculative grounds.
Replying to @tyrell_turing @neuro_data and others
The relation between deep learning - with its single neuron type and largely homogeneous architecture - and the actual complexity of the human brain, with > 1000 neuron types, hundreds of proteins at each synapse and > 100 distinct brain regions, is risible.
Replying to @GaryMarcus @neuro_data and others
Every model is an abstraction. Newtonian mechanics ignores air turbulence, molecular interactions, etc. Climate models capture coarse grained interactions, not the multitude of animals, plants, and wind-currents that truly shape the climate. Neural networks are no different.
Replying to @tyrell_turing @neuro_data and others
Let's be real. Current neural nets have been shown empirically to work on some problems (after tinkering to get details right) - but do we really *know* that they are an abstraction of the brain, in which their details map onto simplifications of actual brain processes? No.
Replying to @GaryMarcus @neuro_data and others
I'm sorry, but this is a bad take. Yes, we know they are simplifications of real brains. 1) Neurons do something very similar to linear integration with a non-linearity. 2) They process inputs in a distributed, parallel manner. ANNs capture this basic process, period.
Replying to @tyrell_turing @GaryMarcus and others
I am amazed at how many people resist this basic fact! Folks: you may believe ANNs miss critical biological details, cool, that's a legit position to take. But why pretend that ANNs are not an abstraction of neural processing? That is simply not a tenable position, frankly.
Replying to @tyrell_turing @neuro_data and others
Started a new thread because my jaw dropped at the hubris of this remark.
Replying to @GaryMarcus @neuro_data and others
It's not hubris, man. You may not think ANNs are a *good* model. But, neurons are well approximated in their firing rate by a linear-non-linear model. See here: https://www.sciencedirect.com/science/article/pii/S0896627318307372 Given this, it is ridiculous to claim that ANNs are not an abstraction of neural processing.
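The linear-non-linear (LN) model referenced in this tweet can be sketched in a few lines: the firing rate is a nonlinearity applied to a linear filter of the input, which is exactly the form of a standard ANN unit. The weights and the choice of rectifying nonlinearity below are illustrative, not fitted to any data.

```python
import numpy as np

def ln_neuron(stimulus, weights, bias=0.0):
    """Linear-nonlinear (LN) firing-rate model: rate = f(w . x + b).

    The linear stage integrates the stimulus through a filter (weights);
    the nonlinear stage here is rectification, so negative drive maps to
    a rate of zero. Parameters are illustrative placeholders.
    """
    drive = np.dot(weights, stimulus) + bias  # linear integration
    return np.maximum(0.0, drive)             # rectifying nonlinearity (ReLU-like)

# A standard ANN unit computes the same expression, which is the sense
# in which ANNs abstract this basic aspect of neural processing.
x = np.array([0.5, -1.0, 2.0])  # hypothetical stimulus
w = np.array([1.0, 0.2, 0.4])   # hypothetical linear filter
print(ln_neuron(x, w))          # 0.5 - 0.2 + 0.8 = 1.1
```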
Replying to @tyrell_turing @neuro_data and others
Mere approximation doesn’t mean you have really captured what that component is, let alone how the system as a whole works. NOBODY really understands how you get from neural nonlinearities to cognition; without a clear account of that, the rest is straw-grasping.
Replying to @GaryMarcus @neuro_data and others
You're trying to have a debate with me that I'm not interested in, bc I am not taking the position you think I am. I am not claiming that I can say with certainty that ANNs are a *good* model of the brain. I am claiming that they capture *some* aspects of neural processing.
It may turn out that we are fundamentally wrong in our early 21st-century thinking about the brain, and e.g. that most of the action is at the dendrites and that we have misunderstood what neurons (and other constituents of the brain) do. See comment re in vitro vs in vivo.
Replying to @GaryMarcus @tyrell_turing and others
But I agree that we may massively underestimate the computational power of dendrites.
@IlennaJ and @aha_momentum in my lab are going crazy about that idea ;)
Replying to @KordingLab @GaryMarcus and others
@GaryMarcus I think you're not understanding @tyrell_turing. I saw another post today that said "all neurology is inferior to ANN models," and THAT point of view is genuinely worth ridiculing.