2. One reason to have so many neurons may be that they each have different jobs: Neuron A recognizes the pointedness of a fox’s ears, Neuron B recognizes the color of the fox’s fur, Neuron C recognizes a fox’s nose, etc. pic.twitter.com/1t5hOn1WKn
3. When enough of these neurons activate, the brain as a whole can recognize a fox. pic.twitter.com/ZVk3zPj0YS
4. What if some neurons “fall asleep” on the job and don’t respond to the image? This actually happens very often, and yet the brain is remarkably robust to these failures.
5. Even if 90% of the neurons don’t do their job, we can still recognize the fox. Even if we randomly change 90% of the pixels, we can still recognize the fox. The brain is robust to a lot of manipulations like that. pic.twitter.com/7oi9bOm9UL
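To make the redundancy idea concrete, here is a toy simulation (mine, not the authors’ analysis): a population code in which many neurons carry overlapping signal can still be decoded after 90% of its units are silenced at random.

```python
# Toy sketch: a redundant population code survives "neurons falling asleep".
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_classes = 10_000, 10

# Each class ("fox", "puffer fish", ...) has a template population response.
templates = rng.normal(size=(n_classes, n_neurons))

# One noisy single-trial response to a fox image.
true_class = 3
response = templates[true_class] + 0.5 * rng.normal(size=n_neurons)

# Put ~90% of neurons "to sleep": only the awake ones are read out.
awake = rng.random(n_neurons) > 0.9

# Nearest-template decoding on the surviving ~10% of neurons still works.
scores = templates[:, awake] @ response[awake]
print("decoded class:", scores.argmax(), "| true class:", true_class)
```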
6. Artificial neural networks also use millions of neurons to recognize images. pic.twitter.com/sOY5Emm3Qa
7. Unlike brains, machines are not so robust to small aberrations. Here is our fox and, next to it, the same fox very slightly modified: now the machine thinks it’s a puffer fish! pic.twitter.com/Bd9ZDyY7QJ
8. These are called “adversarial images”, because we devised them to fool the machine. How does the brain protect against these perturbations and others?
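For readers curious how such images can be constructed: one standard recipe is the fast gradient sign method (FGSM). The sketch below uses a pretrained torchvision classifier as a stand-in and is not necessarily how the fox/puffer-fish example above was made; input normalization is omitted for brevity.

```python
# FGSM sketch: nudge every pixel slightly in the direction that increases
# the classifier's loss, so the image looks unchanged but the label flips.
import torch
import torchvision.models as models

model = models.resnet18(weights="IMAGENET1K_V1").eval()  # any pretrained classifier

def fgsm(image, label, eps=0.005):
    """Return an adversarially perturbed copy of `image` (values kept in [0, 1])."""
    image = image.clone().requires_grad_(True)
    loss = torch.nn.functional.cross_entropy(model(image), label)
    loss.backward()
    return (image + eps * image.grad.sign()).clamp(0, 1).detach()

# Hypothetical usage: image is a (1, 3, 224, 224) tensor in [0, 1],
# label is a tensor([fox_class_index]); adv = fgsm(image, label)
# often looks identical to us yet changes the model's answer.
```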
9. One protection could be to make many slightly different copies of the neurons that represent foxes. Even if some neurons fall asleep on the job, their copies might still activate.
10. However, if the brain used so many neurons for every single image, we would quickly run out of neurons!
11. This results in an evolutionary pressure: it’s good to have many neurons do very different jobs so we can recognize lots of objects in images, but it’s also good if they share some responsibilities, so they can pick up the slack when necessary.
12. We found evidence for this by investigating the main dimensions of variation in the responses of 10,000 neurons. Below, each column is one neuron’s responses to several of our images. pic.twitter.com/UZqDjTaKv4
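In code, finding those main dimensions of variation comes down to a principal-component style decomposition of the images-by-neurons response matrix. The paper uses a cross-validated variant; the plain SVD below is a simplified stand-in, and the file name is hypothetical.

```python
# Simplified sketch: main dimensions of variation via SVD of the response matrix.
import numpy as np

# responses: (n_images, n_neurons) array, e.g. 2800 x 10000 from the linked dataset.
responses = np.load("responses.npy")            # hypothetical file name
responses = responses - responses.mean(axis=0)  # center each neuron

# Singular values give the size of each successive dimension of variation;
# rows of Vt say how much each neuron contributes to that dimension.
U, S, Vt = np.linalg.svd(responses, full_matrices=False)
variance_per_dimension = S**2 / responses.shape[0]
```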
13. The largest two dimensions were distributed broadly across all neurons, as you see below. Any neuron could contribute to these and pick up the slack if the other neurons did not respond. pic.twitter.com/AtES0KSNR1
14. The next 8 dimensions were each smaller and distributed more sparsely across neurons. If a neuron was asleep, a few others could still likely represent these dimensions in its place. pic.twitter.com/7Dqnes6lUQ
15. The next 30 dimensions revealed ever more intricate structure... pic.twitter.com/v8dZtUJgbp
17. And so on: it kept going, with the N-th dimension being roughly N times smaller than the biggest one.
18. This distribution of activity is called a “power-law”. pic.twitter.com/IiWTda0nwI
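A power law means the variance of the N-th dimension falls off roughly as N to the minus alpha. A quick way to estimate alpha, continuing from the SVD sketch above, is a straight-line fit in log-log coordinates; the fit range below is an illustrative choice, not the paper’s exact procedure.

```python
# Continues from the SVD sketch: `variance_per_dimension` is sorted
# from the largest to the smallest dimension of variation.
import numpy as np

n = np.arange(1, len(variance_per_dimension) + 1)
lo, hi = 10, 500   # fit over an intermediate range of dimensions (illustrative)
slope, _ = np.polyfit(np.log(n[lo:hi]),
                      np.log(variance_per_dimension[lo:hi]), 1)
print(f"estimated power-law exponent: {-slope:.2f}")  # ~1.04 in the recordings
```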
19. However, this was not just any power-law, it had a special exponent of approx 1. We did some math and showed that a power-law with this exponent must be borderline fractal.
20. A fractal is a mathematical object that has structure at many different spatial scales, like the Mandelbrot set below: pic.twitter.com/WTARMWiym4
22. The neural activity came very close to being a fractal, and just barely avoided it: its exponent was 1.04, not 1 or smaller.
23. An exponent of 1.04 is the sweet spot: as high-dimensional as possible without being a fractal.
24. Not being a fractal allows neural responses to be continuous and smooth, which are the minimal protections neurons need so that we don’t confuse a fox with a puffer fish!
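For reference, the “sweet spot” can be written compactly. The bound below is my paraphrase of the paper’s smoothness result, with d standing for the dimensionality of the stimulus set; treat the exact form as an assumption rather than a quotation.

```latex
% Variance along the n-th dimension follows a power law:
\[ \lambda_n \propto n^{-\alpha} \]
% Paraphrased smoothness (non-fractal) condition for a d-dimensional stimulus set:
\[ \alpha > 1 + \frac{2}{d} \]
% Natural images have large d, so the bound approaches 1;
% the measured exponent \( \alpha \approx 1.04 \) sits just on the smooth side.
```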
All the neural data is available here: https://figshare.com/articles/Recordings_of_ten_thousand_neurons_in_visual_cortex_in_response_to_2_800_natural_images/6845348 And the code is here: https://github.com/MouseLand/stringer-pachitariu-et-al-2018b