This is at least somewhat like what I was looking for: https://www.sciencedirect.com/science/article/pii/S0896627317305093
-
Look into convolutional neural nets. They are "inspired" by the layered organization of the visual cortex.
-
I'm aware of that. But a design inspiration is not the same thing as a set of design lessons.
-
At the very least we could say it's a single design lesson: that hierarchical layers of processing allow for the learning of abstractions at multiple levels.
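To make that one lesson concrete, here is a minimal sketch of a hierarchical convolutional stack (assuming PyTorch; the layer widths, kernel sizes, and 10-class readout are arbitrary illustrative choices, not anything from this thread). Each convolution-plus-pooling stage sees a larger effective receptive field than the one below it, so lower layers can only represent local features while higher layers can represent larger-scale abstractions.

# Minimal sketch (assumed PyTorch): stacked conv layers as "hierarchical layers
# of processing", loosely echoing the layered organization of visual cortex.
import torch
import torch.nn as nn

hierarchical_net = nn.Sequential(
    # Layer 1: local filters over raw pixels (edge-like features)
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),              # halves resolution, grows the effective receptive field
    # Layer 2: combinations of edges (textures, simple shapes)
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    # Layer 3: larger-scale structure (object parts)
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),      # global summary of the whole image
    nn.Flatten(),
    nn.Linear(64, 10),            # task-specific readout (e.g. 10 classes, hypothetical)
)

x = torch.randn(1, 3, 64, 64)     # one 64x64 RGB image
print(hierarchical_net(x).shape)  # -> torch.Size([1, 10])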
End of conversation
New conversation -
-
Investigation of the architecture of the visual cortex has played a large role in the development of deep learning and convolutional neural networks. So far, though, most AI research has analogies only to the functionality of local cortical circuits.
-
Many people claim their designs are inspired in part by brains. That's different from general lessons we can draw.
-
The visual cortex inspiration case is fairly well documented. David Marr wrote some very influential theoretical neuroscience papers in which he proposed what computations the anatomy implied, and he later moved to computer vision, where he became influential as well.
-
It is quite easy to make the case that, e.g., LeCun's vision solutions were generalizations from audio processing to image processing. In practice, we can often use AI models to interpret neuroscience, but we can rarely use neuroscience to predict how to build an AI model.
-
I thought historically LeCun was inspired by Fukushima's neocognitron, but could be wrong. I agree in general that it is difficult to go from brain science to AI, but am very interested when this does happen.
-
I have in the past tried to count the cases of neuro/cogno-to-AI inspiration. There is certainly a decent list, yet it is not *that* long, and the delays are long. I suspect the problem is not that AI isn't listening, but that we in neuro are rather bad at coming up with good ideas.
End of conversation
New conversation -
-
FWIW, for talks see the two Tooby talks at the Singularity Summits: https://intelligence.org/singularitysummit/ (Modularity and the difficulty of general intelligence, and the evolvability of domain-specific intelligence, IIRC).
-
I see some people already mentioned Jeff Hawkins. Here are some links: https://spectrum.ieee.org/computing/software/what-intelligent-machines-need-to-learn-from-the-neocortex and https://numenta.com/blog/2018/01/08/navigating-numenta-through-progression-of-papers/
-
https://www.amazon.com/Cambrian-Intelligence-Early-History-New/dp/0262522632?SubscriptionId=AKIAILSHYYTFIVPWUY6Q&tag=duckduckgo-iphone-20&linkCode=xm2&camp=2025&creative=165953&creativeASIN=0262522632
@rodneyabrooks
End of conversation
New conversation -
-
Jeff Hawkins, "On Intelligence".
New conversation -
-
You might find this hippocampus grid-cell-inspired DeepMind paper interesting: https://www.nature.com/articles/s41586-018-0102-6
-
This might be a better general introduction: https://deepmind.com/blog/grid-cells/
End of conversation
New conversation -
-
Not a general "lessons learned" article, but this one discusses the way that the human brain executes cost functions and how that might apply to machine learning: https://www.frontiersin.org/articles/10.3389/fncom.2016.00094/full