"Estimating the dimensionality of neural data brings its own unique challenges. [D]imensionality can be defined as the number of linear orthogonal components (singular- or eigenvalues) underlying a matrix that are larger than zero (Shlens, 2014)"https://doi.org/10.1101/232454
-
Yeah, no magic solutions, but one can find the best model, which can be illuminating. E.g., a dimensionality difference that depends on the task.
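One hedged sketch of what "finding the best model" could look like in practice: score candidate dimensionalities by cross-validated log-likelihood under probabilistic PCA and compare the winners across tasks. The use of scikit-learn, the synthetic per-task data, and the candidate range are my own assumptions, not something stated in the thread.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
noise = 0.5
# Synthetic "tasks" with different latent dimensionality (assumed for illustration).
task_a = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 40)) + noise * rng.standard_normal((200, 40))
task_b = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 40)) + noise * rng.standard_normal((200, 40))

def best_dimensionality(data, candidates=range(1, 16)):
    # Mean held-out log-likelihood under probabilistic PCA for each candidate k.
    scores = [cross_val_score(PCA(n_components=k), data, cv=5).mean()
              for k in candidates]
    return list(candidates)[int(np.argmax(scores))]

# Should recover roughly 3 vs. 10 for this synthetic data.
print(best_dimensionality(task_a), best_dimensionality(task_b))
```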
-
Completely agree! I think there's probably some interesting work on dimensionality collapse prior to critical transitions, as well.
