2/6 Learning dynamics are key to understanding neural computation, but they can be too complex to grasp directly. Can we view them at different levels of granularity? E.g., the Lorenz system approximated with 2 linear dynamical systems? pic.twitter.com/UmD3IhS5Gm
3/6 Or with 4 linear dynamical systems (LDS)? Both of these are from a single model. pic.twitter.com/kX4xFLZQye
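The idea of approximating the Lorenz system with a handful of linear dynamical systems can be illustrated with a rough sketch (this is not the TrSLDS inference described in the thread, just ordinary least squares on a hand-picked two-region partition; the partition rule and all constants are assumptions for illustration):

```python
import numpy as np

def lorenz_traj(T=20000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Simulate the Lorenz system with forward Euler."""
    x = np.empty((T, 3))
    x[0] = (1.0, 1.0, 25.0)
    for t in range(T - 1):
        X, Y, Z = x[t]
        dx = np.array([sigma * (Y - X), X * (rho - Z) - Y, X * Y - beta * Z])
        x[t + 1] = x[t] + dt * dx
    return x

x = lorenz_traj()
# Crude 2-region partition: split the attractor's two lobes by the sign
# of the x-coordinate (a hand-picked rule, not a learned one).
labels = (x[:, 0] > 0).astype(int)

# Fit one affine LDS  x_{t+1} ~ A x_t + b  per region by least squares.
systems = []
for k in (0, 1):
    idx = np.where(labels[:-1] == k)[0]
    X_t = np.hstack([x[idx], np.ones((len(idx), 1))])  # append bias column
    X_next = x[idx + 1]
    W, *_ = np.linalg.lstsq(X_t, X_next, rcond=None)
    A, b = W[:3].T, W[3]
    systems.append((A, b))

# One-step prediction error of the piecewise-linear surrogate.
pred = np.stack([systems[k][0] @ xi + systems[k][1]
                 for xi, k in zip(x[:-1], labels[:-1])])
mse = np.mean((pred - x[1:]) ** 2)
print(f"one-step MSE: {mse:.4f}")
```

Two linear systems cannot reproduce the Lorenz quadratic terms exactly, but each one captures the locally linear rotation around its lobe, which is the coarse-grained view the tweet alludes to.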
4/6 We propose a generative model, TrSLDS (tree-structured recurrent switching LDS), extending @scott_linderman's rSLDS (AISTATS 2017) with fully Bayesian sampling using Pólya-Gamma augmentation and tree-structured stick breaking. pic.twitter.com/QIfPrC0ac6
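Tree-structured stick breaking can be sketched as follows: a binary tree of logistic gates routes the continuous state to one of the leaf LDSs, with each gate splitting the remaining probability mass (logistic gates are what make Pólya-Gamma augmentation applicable). The gate parameters below are random toy values, not anything from the paper:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def tree_stick_breaking_probs(x, gates):
    """Leaf probabilities for a perfect binary tree of logistic gates.

    `gates` maps each internal node id (heap order, root = 0) to a
    hyperplane (w, b); sigmoid(w @ x + b) is the probability of taking
    the left branch. Returns probabilities over the leaves, left to right.
    """
    n_internal = len(gates)
    probs = {0: 1.0}  # probability of reaching each node
    for node in range(n_internal):
        w, b = gates[node]
        p_left = sigmoid(w @ x + b)
        probs[2 * node + 1] = probs[node] * p_left
        probs[2 * node + 2] = probs[node] * (1 - p_left)
    leaves = range(n_internal, 2 * n_internal + 1)
    return np.array([probs[i] for i in leaves])

# Depth-2 tree: 3 internal gates -> 4 leaf LDSs, as in tweet 3/6.
rng = np.random.default_rng(0)
gates = {i: (rng.normal(size=3), 0.0) for i in range(3)}
p = tree_stick_breaking_probs(np.array([1.0, -2.0, 0.5]), gates)
print(p, p.sum())  # leaf probabilities sum to 1
```

Because siblings split their parent's mass into complementary halves, the leaf probabilities always sum to one, and pruning the tree at an internal node recovers the coarser model (2 LDSs instead of 4) from the same fit.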
5/6 Just from the spike trains, we can recover decision-making dynamics from XJ Wang's 2002 spiking neural network model (in another paper with @yuanz271: http://catniplab.github.io/publications/Nassar2018a.pdf). pic.twitter.com/OAAcFhQDyl
6/6 It can also recover limit cycles from V1 population spike trains! Looking forward to applying it to more neural data! pic.twitter.com/LVGFOy63Nx
End of conversation
New conversation
Does your method beat LFADS?
Good Q. We will compare soon!
End of conversation
New conversation
Amazing work! May I know if code is going to be made available anytime soon?
We'll make it public when it's accepted (or maybe sooner!).
End of conversation
New conversation
Just started flipping through this. So far it looks amazing!
Thanks.
This looks awesome. Congrats!!
Thanks.