For a sufficiently large neural network with random weights, we technically don't have to change any of the weights, just the connections (i.e. the topology). This idea, at the intersection of functional analysis and random matrix theory, probably has consequences for neuroscience.
Replying to @bayesianbrain
There have been quite a few papers on this topic, about doing architecture search instead of learning the weights. E.g. https://arxiv.org/abs/1904.01569v2 And neuroscience has an even longer history of exploring this idea: see the famous "silent synapses" controversy of the 1990s to early 2000s.
Replying to @ampanmdagaba
Nice. The idea is actually quite simple: a large network with random weights will contain every subnetwork you can imagine. But if you want learning algorithms to take advantage of this, you need a good mathematical theory.
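The "don't train the weights, pick the connections" idea can be sketched in a few lines. This is a minimal toy, not any paper's actual method: weights are drawn once and frozen, and a crude random search over binary connectivity masks stands in for real architecture search (the cited papers use far smarter search). Network shape, seed, and the XOR task are all arbitrary choices here.

```python
import numpy as np

# Fix random weights once, then search only over which connections exist.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])  # XOR targets

H = 16                        # wider nets contain more candidate subnetworks
W1 = rng.normal(size=(2, H))  # weights are sampled here and never updated
b1 = rng.normal(size=H)
W2 = rng.normal(size=H)

def accuracy(mask1, mask2):
    """Evaluate the subnetwork selected by the binary masks."""
    h = np.tanh(X @ (W1 * mask1) + b1)
    out = h @ (W2 * mask2)
    return np.mean((out > 0) == y)

best_acc, best = 0.0, None
for _ in range(5000):         # random search over connectivity patterns
    m1 = rng.integers(0, 2, size=(2, H)).astype(float)
    m2 = rng.integers(0, 2, size=H).astype(float)
    acc = accuracy(m1, m2)
    if acc > best_acc:
        best_acc, best = acc, (m1, m2)

print(f"best XOR accuracy with untrained weights: {best_acc:.2f}")
```

The point of the toy is only that selecting a subnetwork of frozen random weights can already fit a nonlinear task; the open question in the thread is the theory for when and why this works at scale.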
Replying to @bayesianbrain @ampanmdagaba
I am ignorant of such a theory. Does it exist?
Replying to @bayesianbrain @ampanmdagaba
I can't find anything holistic, but these seem to be special cases of what you're referring to: https://weightagnostic.github.io/ https://openreview.net/pdf?id=rJl-b3RcF7
Replying to @colejhudson @bayesianbrain
Right, the lottery ticket is also related, but slightly different: the ticket is about a trainable subnetwork, while here we have a network so good that it works with any weights between 0 and 1.
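The "works with any weights between 0 and 1" property can be made concrete in a toy. This is a hand-rolled sketch, not the weight-agnostic paper's NEAT-style method: with sign activations, scaling every weight by the same positive shared value leaves every decision unchanged, so a topology found once at w = 1 remains correct for every w in (0, 1]. The ternary connection search and the XOR task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])  # XOR targets

def forward(mask1, bias, mask2, w):
    """All connections share the single weight value w; only topology varies."""
    h = np.sign((X * 2 - 1) @ (w * mask1) + w * bias)  # inputs recoded to +/-1
    return (h @ (w * mask2)) > 0

# Random search for a connection pattern (entries in {-1, 0, 1}) solving XOR at w=1.
found = False
for _ in range(20000):
    m1 = rng.integers(-1, 2, size=(2, 8)).astype(float)
    b = rng.integers(-1, 2, size=8).astype(float)
    m2 = rng.integers(-1, 2, size=8).astype(float)
    if np.all(forward(m1, b, m2, 1.0) == y):
        found = True
        break
assert found, "random search did not find an XOR topology"

# The same topology stays correct across the whole range of shared weights.
for w in np.linspace(0.1, 1.0, 10):
    assert np.all(forward(m1, b, m2, w) == y)
print("topology solves XOR for every shared weight in (0, 1]")
```

With sign units the weight-invariance is exact by construction (positive scaling never flips a sign); the weight-agnostic networks paper shows the softer, approximate version of this with smooth activations.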
But for a (former) neuroscientist, this all feels a bit silly, because silent synapses are so prevalent in the brain. From EM studies we know that for every functional synapse, a neuron has 10-100 times more almost-synapses that sit nearby but don't operate. Ready, waiting.
Replying to @ampanmdagaba @bayesianbrain
Looking for something to read on silent synapses, what's your fave?
Replying to @colejhudson @bayesianbrain
I think you could start with Wiki or Scholarpedia: http://www.scholarpedia.org/article/Silent_synapse The main controversy was about the mechanisms: the post-synaptic team (Malinow, Malenka) versus the older pre-synaptic one (Voronin, Cherubini, and to some degree Kullmann).
The story definitely belongs on the list of top 10 historical controversies in neuroscience: https://wiki.brown.edu/confluence/display/BN0193S04/Syllabus Right after the neuron doctrine, chemical synapses, and active dendrites :)
Replying to @ampanmdagaba @bayesianbrain
that course is sickkk, thanks for the rec!!