So: @Bill_P_ and @ejagnes finally finished: “Context-modular memory networks support high-capacity, flexible, and robust associative memories.” https://www.biorxiv.org/content/10.1101/2020.01.08.898528v1
It's a bit far from our 'usual' rate and spiking nets. How did we get here?
Well, based on earlier work from the lab, we were looking to include modulation, or ‘contextualisation’, into a framework of memory. We wondered if we could improve ‘memory performance’ by adding network state, like we did in Stroud et al. 2018: https://twitter.com/TPVogels/status/1067206517181595648?s=20
We started with Hopfield’s classic paper: https://www.pnas.org/content/79/8/2554. Spoiler alert: with just a few added bells and whistles, the model exhibits big improvements in memory capacity compared to the oldies (e.g., from
@HSompolinsky’s ~0.14 to ~6.0 in the best cases).
We can also enhance robustness to noise and continual learning. Perhaps most interestingly, the model allows for flexible gating of memories, which might just be a crucial aspect of memory systems in the brain.
We’ve known about the context-dependence of memory in the brain for a long time, both from behavioral experiments, and from studies on the hippocampus, PFC, and other areas. But most classic associative memory models do not incorporate context-dependence.
In the classic models (like the Hopfield network), memory patterns (sets of active neurons) are stored as attractors through Hebbian learning. The maximum number of stable patterns usually scales with the network size (capacity = max number of stable memories / number of neurons).
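To make the baseline concrete, here is a minimal Hopfield sketch (illustrative only, not the paper’s code; the sizes and noise level are our own toy choices): store random ±1 patterns with the Hebbian outer-product rule, then recall one from a noisy cue.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                        # capacity ~0.14*N, so P=10 is well below the limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product learning rule, with zero self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(x, steps=10):
    """Synchronous sign updates until convergence (or a step limit)."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1         # break ties deterministically
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# A noisy cue (10% flipped bits) should settle back into the stored attractor
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
print(np.mean(recall(cue) == patterns[0]))   # fraction of correctly recalled bits
```

Pushing P above ~0.14*N in this sketch is exactly where recall starts to fail, which is the capacity limit the thread refers to.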
So we 'evolved' Hopfield into a ‘context-dependent auto-associative memory’ network in which groups of memories are assigned to a context, with only one context active at any time. For now, 'context' is represented in an external network which can gate the memory network.
We use 2 types of modulation: neuron-specific gating, in which subsets of NEURONS are inactivated, and (targeted) synapse-specific gating, in which specific individual SYNAPSES can be gated off. Both of them create little ‘ad hoc’ networks, i.e., unique subsets of the whole net.
Well, using only neuron-specific gating, memory capacity rises to nearly 1.2 (for 200 contexts), almost an order of magnitude higher than the standard Hopfield network (green arrowhead). This depends on the number of subnetworks (s) and their relative size (a).
For TaSS gating, we have to jump through a few hoops to find the synapses that have a net harmful effect on recalling a given set of memories in the active context, and shut those synapses UP.
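One crude way to sketch the spirit of TaSS (our own simplification, not the paper’s actual criterion): after storing all memories, gate off every synapse whose total weight disagrees in sign with the Hebbian weight computed from the active context’s memories alone, since those synapses push recall in the wrong direction for this context.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
all_pats = rng.choice([-1, 1], size=(40, N))   # 40 memories in total: load 0.4,
ctx = all_pats[:4]                             # far over capacity; 4 are in-context

W_full = all_pats.T @ all_pats / N             # weights from ALL memories
W_ctx = ctx.T @ ctx / N                        # 'ideal' weights for this context only
gate = np.sign(W_full) == np.sign(W_ctx)       # keep only sign-consistent synapses
W = W_full * gate                              # gated-off synapses contribute nothing
np.fill_diagonal(W, 0)

def recall(x, steps=10):
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Without gating, 40 patterns in 100 neurons is hopeless; with the gate,
# an in-context pattern should again be (close to) a stable fixed point
print(np.mean(recall(ctx[0].copy()) == ctx[0]))
```

The point of the sketch: the context never re-learns anything, it only silences harmful synapses, yet in-context memories become recallable again.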
TaSS protects memories from interference from irrelevant memories, allowing for a capacity up to two orders of magnitude higher than the standard Hopfield model (green arrowhead). Dang!
Both types of contextual control require extra neurons, of course! So we calculate how many extra neurons are needed to encode context for each scheme, & determine if the memory gains we see are “real” and feasible. Check the paper for details, but we still come out on top. Phew!
It should be said, TaSS is a real pain in the (t)ASS to get ‘right’, so how would neurons do it?
That’s the gist of it. But wait, there is more: Context also controls the memory stability of patterns, enabling dynamic control of memory accessibility. We show this, and suggest it may explain some memory disorders in which memories are not forgotten, but become inaccessible.
And then we also looked at how our model can aid in continual learning, can implement arbitrary and overlapping contexts, and can encode distributions over memory strengths, to enable flexible memory storage and recall. In short, context makes everything better. Check it out!
So, in summary our model shows: (i) how gating and inhibition can contribute to memory, (ii) how memory access/availability dynamically change over time, and (iii) how context representations, such as those observed in hippocampus and PFC, may interact with and control memory.
Obvs there is other work out there, like context-dependent gating in deep learning by Masse and colleagues, https://www.pnas.org/content/115/44/E10467.short. We’re also anxiously waiting for
@BOlveczky’s new stuff, and @sjo09, @ratecoding, @Franklandlab (among others) have related work. Stay tuned!
In any case, we hope to provide a framework for the cognitive control of memory, efficient memory search, and its interaction with decision making. The end. Check out more work at http://vogelslab.org. Thanks for flying Neurovogels, we appreciate your business. Please come back soon.