This work is super important: the weight transport problem starts looking easy. Maybe backprop is not hard to evolve after all. Great work by @guerguiev and @tyrell_turing. https://twitter.com/tyrell_turing/status/1183149425360822272
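For readers who want the gist in code, here is a minimal sketch of what "weight transport" means and of feedback alignment as the simplest way around it. This is an illustration of the general problem, not the method in the linked paper; the network sizes and names are made up.

```python
import numpy as np

# Toy two-layer network: exact backprop sends the error back through W2.T
# (the forward weights must be "transported" into the feedback path), whereas
# feedback alignment sends it back through a separate fixed random matrix B2.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 20, 50, 5
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))   # forward weights, layer 1
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))  # forward weights, layer 2
B2 = rng.normal(0.0, 0.1, (n_hid, n_out))  # fixed random feedback weights

def updates(x, target, rule="backprop"):
    """Return (dW2, dW1) under either error-feedback rule."""
    h = np.tanh(W1 @ x)
    y = W2 @ h
    e = y - target                          # output error
    if rule == "backprop":
        d_h = (W2.T @ e) * (1 - h**2)       # needs a copy of W2 in the feedback path
    else:                                   # feedback alignment
        d_h = (B2 @ e) * (1 - h**2)         # only needs some feedback pathway B2
    return np.outer(e, h), np.outer(d_h, x)
```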
I have a question. Could you try to frame this for me really specifically? Just trying to understand how you think about it. Say you're a layer 5 pyramidal cell in motor cortex. You integrate your L2/3 (feedforward-ish) and L1 (feedback-ish) input, and spike, transmitting signals /1
Replying to @neurosutras @KordingLab
downstream. A behavioral consequence, after a delay, is either positive or negative (and likely graded). Which synapses, from where, carry the error signal? And do those same exact sources receive inputs from our cell with identical weights? You provide a method for how. But /2
Replying to @neurosutras @KordingLab
is there evidence of this configuration in any biological neural circuit? /3
These are great Qs! We haven't tried to answer them in this paper, and I suspect reality is messy. But I think there are a few different possibilities the data point to, including: 1) associative thalamus sends the error signals, 2) L2/3 computes the error signals and sends them to L5.
I celebrate these demonstrations that gradient descent can avoid weight transport by using local rules. However, I am skeptical of the requirement for fine reciprocal connectivity, let alone symmetric weights at those connections. In your example, most L5 cells are not /1
Replying to @neurosutras @tyrell_turing
reciprocally connected to their sources in thalamus or L2/3. Error may come back many synapses removed, so we should figure out how it can work without this nonphysiological stipulation. /2
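For concreteness, here is the kind of local rule the thread is about, roughly in the spirit of "weight mirror" schemes (e.g. Akrout et al. 2019) rather than the specific mechanism in the paper under discussion: drive a layer with noise and correlate each feedback synapse's own pre- and post-synaptic activity, which pushes the feedback weights toward W.T. Note that, as the tweet above points out, each feedback synapse still has to sit between the same pair of cells as a forward synapse, which is exactly the reciprocity assumption being questioned.

```python
import numpy as np

# A local rule that aligns feedback weights B with W.T without copying W.
rng = np.random.default_rng(1)
n_pre, n_post = 30, 40
W = rng.normal(0.0, 0.5, (n_post, n_pre))  # forward weights
B = np.zeros((n_pre, n_post))              # feedback weights, initially unaligned

eta, decay, sigma = 0.01, 0.01, 1.0
for _ in range(5000):
    x = rng.normal(0.0, sigma, n_pre)      # noise drive of the presynaptic layer
    y = W @ x                              # postsynaptic response
    # B[i, j]'s update uses only x[i] and y[j]; since E[x y^T] = sigma^2 * W.T,
    # B drifts toward a scaled copy of W.T (weight decay keeps it bounded).
    B += eta * (np.outer(x, y) - decay * B)

cos = np.sum(B * W.T) / (np.linalg.norm(B) * np.linalg.norm(W))
print(f"cosine alignment between B and W.T: {cos:.3f}")
```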
Replying to @AdamMarblestone @tyrell_turing
I need to read that more closely. I think its conclusions are about coarse projections, not fine reciprocal connectivity.
From a quick skim I was seeing it as fine reciprocal...
Replying to @AdamMarblestone @neurosutras
(OK, not exactly cell-by-cell reciprocal, but single cells tending to synapse onto those that are area-wise reciprocal.)
Replying to @AdamMarblestone @neurosutras
Which may suggest that we need meta-learning: cells need to learn how to interpret top-down signals. Which is exactly what the papers by @benlansdell, @guerguiev, and @tyrell_turing are about.
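A toy sketch of what "learning to interpret top-down signals" could look like: the feedback weights B are themselves trained, here by regressing the fed-back signal onto noisy perturbation-based gradient estimates. This is loosely in the spirit of that line of work, not a reimplementation of any of those papers; the task, architecture, and hyperparameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_hid, n_out = 10, 32, 1
W1 = rng.normal(0.0, 0.3, (n_hid, n_in))
W2 = rng.normal(0.0, 0.3, (n_out, n_hid))
B = rng.normal(0.0, 0.3, (n_hid, n_out))   # learned feedback weights
lr_w, lr_b, sigma = 0.01, 0.01, 0.1

def loss(y, t):
    return 0.5 * np.sum((y - t) ** 2)

for step in range(5000):
    x = rng.normal(0.0, 1.0, n_in)
    t = np.array([np.mean(x[:5]) - np.mean(x[5:])])  # arbitrary toy target

    # Clean forward pass.
    a = W1 @ x
    h = np.tanh(a)
    y = W2 @ h
    e = y - t                                        # output error

    # Perturbed pass: node perturbation gives a noisy but unbiased estimate
    # of dLoss/da without any weight transport.
    xi = rng.normal(0.0, sigma, n_hid)
    y_pert = W2 @ np.tanh(a + xi)
    g_est = (loss(y_pert, t) - loss(y, t)) / sigma**2 * xi

    # Meta-step: adapt the feedback weights so the top-down signal B @ e
    # predicts that gradient estimate, i.e. the cell learns how to interpret
    # its feedback.
    fb = B @ e
    B -= lr_b * np.outer(fb - g_est, e)

    # Ordinary updates, using the learned feedback signal for the hidden layer.
    W2 -= lr_w * np.outer(e, h)
    W1 -= lr_w * np.outer(fb, x)

    if step % 1000 == 0:
        print(f"step {step:4d}  loss {loss(y, t):.4f}")
```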
End of conversation