Not a neuroscientist, but my own opinion is that weighted gating, compositionality, and learning are the foundations of bio neural nets, and all other phenomena that we care about are built on these blocks. Traditional computers only do the first, but DL can do all three.
say what? deep learning is very weak at compositionality; classical AI did that well.
- 3 more replies
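To make the opening claim concrete, here is a minimal sketch of the three ingredients it names, in plain NumPy. "Weighted gating" is read here as a weighted sum passed through a squashing nonlinearity, "compositionality" in the loose sense of stacking such units into layers (the reply above is about the stronger, symbolic sense), and "learning" as gradient descent. The XOR toy task and every parameter below are illustrative assumptions, not anything from the thread.

import numpy as np

rng = np.random.default_rng(0)

# XOR: a target that no single weighted sum can represent, so units must be composed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros(1)

lr = 1.0
for step in range(10000):
    h = sigmoid(X @ W1 + b1)            # "weighted gating": weighted sum + squashing nonlinearity
    out = sigmoid(h @ W2 + b2)          # "compositionality" (loose sense): gated units stacked into layers
    d_out = (out - y) * out * (1 - out) # "learning": gradient descent on squared error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))                 # typically close to [0, 1, 1, 0]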
New conversation
I'm gonna hijack this to give an example of a good ANN model of a brain learning. https://www.nature.com/articles/nature15741
from the abstract it actually sounds plausibly symbolic to me, somewhat like this related work of my own: https://www.nature.com/articles/nature12173
- 2 more replies
New conversation
Sounds like a discussion my neuroscience student @Reham_Badawy1 might be able to contribute to ...
Where do I begin?! Anatomically speaking: (1) Biological neural circuits (BNC) consist of a plethora of cell types, including pyramidal & Purkinje cells, each with a unique anatomical structure that enhances the electro-chemical processing at both the inputs and outputs
- 5 more replies
New conversation
Neural networks are not perfect, good, or even adequate models of the brain. However, under some circumstances, neurons and ANNs are able to approximate similar classes of functions, and we can also recreate neurons. The big unknowns are the functional principles of brain organization.
- 1 more reply
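One possible reading of "approximate similar classes of functions" in the tweet above, sketched in plain NumPy: a small ANN fit to the stimulus-response curve of a simulated neuron. The Gaussian tuning curve and all parameters are illustrative assumptions, not anything from the thread.

import numpy as np

rng = np.random.default_rng(1)

# Simulated neuron: mean firing rate as a Gaussian tuning curve over stimulus angle
# (peak 40 Hz, tuning width 20 degrees -- arbitrary illustrative numbers).
angles = np.linspace(-90.0, 90.0, 181)
rate = 40.0 * np.exp(-(angles ** 2) / (2 * 20.0 ** 2))

# Rescale inputs and targets so a tiny network trains easily.
x = (angles / 90.0).reshape(-1, 1)
t = (rate / 40.0).reshape(-1, 1)

# One hidden layer of tanh units with a linear output, trained by gradient descent.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

lr = 0.5
for step in range(20000):
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    d_pred = (pred - t) / len(x)         # gradient of mean squared error (up to a factor of 2)
    d_h = (d_pred @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ d_pred; b2 -= lr * d_pred.sum(axis=0)
    W1 -= lr * x.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print("mean abs error (Hz):", float(40.0 * np.abs(pred - t).mean()))  # typically a fraction of a Hz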
New conversation
"DNNs are important not because they capture many biological features, but because they provide a minimal functioning starting point for exploring what biological details matter to brain computation."http://oxfordre.com/neuroscience/view/10.1093/acrefore/9780190264086.001.0001/acrefore-9780190264086-e-46 …
that's an assumption, not a rigorously-argued-for empirical conclusion
- 2 more replies