Tweets
Niru Maheswaranathan Retweeted
Peering into a deep network trained on retina data:
- Instantaneous RFs are context-dependent and state-based
- Network subspaces used by white noise and natural scenes are different
[Nice update from @niru_m @SuryaGanguli] https://www.biorxiv.org/content/10.1101/340943v5 pic.twitter.com/D6vIuvMP5X
Niru Maheswaranathan Retweeted
Co-presenting this work with @niru_m, @MattGolub_Neuro, @SuryaGanguli, @SussilloDavid now #NeurIPS at poster #156! https://twitter.com/niru_m/status/1144313901254758400
Awesome work, congrats @dnag09 https://twitter.com/NEJM/status/1194737955081871360
Niru Maheswaranathan Retweeted
New paper out on #NeurIPS2019: “From deep learning to mechanistic understanding in neuroscience: the structure of retinal prediction” with fantastic collaborators @aran_nayebi, @niru_m, @lmcintosh, Stephen Baccus, @SuryaGanguli. https://papers.nips.cc/paper/9060-from-deep-learning-to-mechanistic-understanding-in-neuroscience-the-structure-of-retinal-prediction pic.twitter.com/FvLyTmbZtY
Niru Maheswaranathan Retweeted
Here you can see two neurons sensing one another and connecting in a petri dish. There are 86 billion neurons in the #brain, and they use these webbed hand-like structures (“growth cones”) to search for and connect to other neurons or body parts as we develop @AcademicChatter pic.twitter.com/TOFLqUThho
Niru Maheswaranathan Retweeted
We approximated the implicit function theorem to tune millions of hyperparameters. Now we can train data augmentation networks from scratch using gradients from the validation loss. https://arxiv.org/pdf/1911.02590.pdf With @JonLorraine and @PaulVicol pic.twitter.com/BUVS4JSWPP
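The core trick in the linked paper is differentiating the validation loss through the (implicitly defined) optimum of the training loss. Below is a minimal sketch of that general idea, not the authors' code: a toy ridge-regression hyperparameter in JAX, with the inverse training Hessian approximated by a truncated Neumann series (the setup, step size, and truncation length are all illustrative assumptions).

```python
# Sketch of an implicit-function-theorem hypergradient:
#   dL_val/dlam = -(dL_val/dw) H^{-1} (d^2 L_train / dw dlam),
# where H is the training-loss Hessian at the trained weights w. H^{-1} is
# approximated with a truncated Neumann series using only Hessian-vector products.
import jax
import jax.numpy as jnp

def train_loss(w, lam, X, y):
    # toy ridge regression; lam is the log regularization strength (the hyperparameter)
    return 0.5 * jnp.sum((X @ w - y) ** 2) + 0.5 * jnp.exp(lam) * jnp.sum(w ** 2)

def val_loss(w, Xv, yv):
    return 0.5 * jnp.sum((Xv @ w - yv) ** 2)

def hypergradient(w, lam, X, y, Xv, yv, n_terms=50, alpha=1e-3):
    v = jax.grad(val_loss)(w, Xv, yv)                      # dL_val/dw at the trained weights
    g = lambda w_, l_: jax.grad(train_loss)(w_, l_, X, y)  # dL_train/dw

    # v H^{-1} ~= alpha * sum_k (I - alpha H)^k v, built from Hessian-vector products
    p, acc = v, v
    for _ in range(n_terms):
        _, Hp = jax.jvp(lambda w_: g(w_, lam), (w,), (p,))  # H @ p
        p = p - alpha * Hp
        acc = acc + p
    v_Hinv = alpha * acc

    # contract with the mixed partial d^2 L_train / dw dlam
    mixed = jax.grad(lambda l_: jnp.vdot(v_Hinv, g(w, l_)))(lam)
    return -mixed  # L_val has no direct dependence on lam in this toy setup
```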
Niru Maheswaranathan Retweeted
So grateful for @skornblith's colab on CKA. It seems obvious in retrospect but I hadn't considered the equivalence of calculating similarities based on examples and based on features. My experiments are so much faster now... https://twitter.com/skornblith/status/1138859179165093888 pic.twitter.com/SJXWjXiLUf
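For concreteness, here is a small numpy sketch of the equivalence mentioned above (illustrative code, not the linked colab): linear CKA computed from example-by-example Gram matrices matches the same quantity computed from feature-by-feature (cross-)covariances, which avoids forming any n x n matrix when the number of examples is large.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d1, d2 = 1000, 64, 128
X = rng.standard_normal((n, d1))   # activations of layer 1: n examples x d1 features
Y = rng.standard_normal((n, d2))   # activations of layer 2: n examples x d2 features
X -= X.mean(axis=0)                # linear CKA assumes column-centered activations
Y -= Y.mean(axis=0)

# Example-based: inner products of n x n Gram matrices (HSIC with a linear kernel)
K, L = X @ X.T, Y @ Y.T
cka_examples = np.sum(K * L) / np.sqrt(np.sum(K * K) * np.sum(L * L))

# Feature-based: Frobenius norms of (cross-)covariances; no n x n matrices needed
cka_features = (np.linalg.norm(Y.T @ X, "fro") ** 2
                / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")))

assert np.isclose(cka_examples, cka_features)
```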
Loved listening to this interview with Andrew Saxe. Great summaries of a lot of beautiful work! (side note, the previous interviews are just as good--kudos to @pgmid for running a great podcast!) https://twitter.com/pgmid/status/1192098259734384646
Niru Maheswaranathan Retweeted
No, this isn’t from @tyrell_turing et al.’s recent perspective in @NatureNeuro. This is David Robinson trying to make a similar point in 1992: http://www.dna.caltech.edu/courses/cns187/references/Robinson-92.pdf pic.twitter.com/bXJANFMtO9
#cosyne2020 submission deadline: 31 Oct 11:59 **Pacific Time**. #cosyne2020 registration deadline: 31 Jan 11:59 **Eastern Time**. (http://www.cosyne.org/c/index.php?title=Abstracts) pic.twitter.com/taTBBAiY3x
itermplot (https://github.com/daleroberts/itermplot) is a matplotlib backend that displays directly in your terminal (iTerm2). Really awesome resource if you (like me) enjoy working directly from the IPython REPL! pic.twitter.com/oG0znEelVl
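A minimal usage sketch, assuming the setup described in the repo's README (the backend is selected through matplotlib's MPLBACKEND mechanism; treat the exact backend string as an assumption):

```python
# Select itermplot as the matplotlib backend before pyplot is imported,
# e.g. `export MPLBACKEND="module://itermplot"` in your shell, or:
import os
os.environ.setdefault("MPLBACKEND", "module://itermplot")

import numpy as np
import matplotlib.pyplot as plt

# Plot as usual; the figure is rendered inline in the iTerm2 terminal on show().
x = np.linspace(0, 2 * np.pi, 200)
plt.plot(x, np.sin(x))
plt.title("rendered directly in the terminal")
plt.show()
```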
Niru Maheswaranathan Retweeted
We received ~650 abstracts for #cosyne2020, a number comparable to two years ago in Denver (700), and a big drop wrt Lisbon last year (1000).
Niru Maheswaranathan Retweeted
Stoked to share a milestone project for all of us! #NeurIPS2019 paper with @akshaykagrawal, @ShaneBarratt, S. Boyd, S. Diamond, @zicokolter: Differentiable Convex Optimization Layers
Paper: http://web.stanford.edu/~boyd/papers/pdf/diff_cvxpy.pdf
Blog Post: https://locuslab.github.io/2019-10-28-cvxpylayers/
Repo: https://github.com/cvxgrp/cvxpylayers
https://twitter.com/akshaykagrawal/status/1188845518962585600
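A minimal usage sketch of a differentiable convex optimization layer with the cvxpylayers PyTorch front end (written from memory of the repo's quick-start style; the specific problem below is illustrative):

```python
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

# A small non-negative least-squares problem, parametrized by (A, b)
n, m = 2, 3
x = cp.Variable(n)
A = cp.Parameter((m, n))
b = cp.Parameter(m)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
assert problem.is_dpp()  # must follow the disciplined parametrized programming rules

layer = CvxpyLayer(problem, parameters=[A, b], variables=[x])

A_t = torch.randn(m, n, requires_grad=True)
b_t = torch.randn(m, requires_grad=True)
(solution,) = layer(A_t, b_t)   # forward pass: solve the convex problem
solution.sum().backward()       # backward pass: gradients w.r.t. A_t and b_t
```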
s/algorithms/developments. Matlab is not an algorithm
The greatest numerical algorithms of the 20th century, according to Nick Trefethen circa 2005 (http://people.maths.ox.ac.uk/trefethen/inventorstalk.pdf) pic.twitter.com/1okDhTC2Zk
Niru Maheswaranathan Retweeted
"the geometry of the RNN representations can be .. sensitive to .. network architectures, yielding a cautionary tale for measures of similarity that rely representational geometry" YEEEEEEES replace RNN with any deep nets and still YES https://arxiv.org/abs/1907.08549
@SussilloDavidHvala. Twitter će to iskoristiti za poboljšanje vaše vremenske crte. PoništiPoništi -
Niru Maheswaranathan Retweeted
#tweeprint Universality and individuality in neural dynamics across large populations of recurrent networks https://arxiv.org/abs/1907.08549. With fantastic collaborators @niru_m, @ItsNeuronal, @MattGolub_Neuro, @SuryaGanguli.
Niru Maheswaranathan Retweeted
So excited to be there in person to cheer on the @USWNT in the World Cup finals! Inspired by all they do on the field, and even more so for their fight off the field for equal pay for female athletes. You go ladies! https://twitter.com/alexmorgan13/status/1147159880718880769
Finally, we can understand how the network processes individual words (tokens) by looking at projections of the embedding vectors onto the principal eigenvectors of the system. Overall, we think these tools will help us demystify and understand how recurrent networks work! (4/4) pic.twitter.com/BwVqB2n8SK
The network dynamics are organized around a roughly 1-D approximate line attractor, which we identify by studying the eigendecomposition of the recurrent Jacobian of the dynamics at approximate fixed points. (3/4) pic.twitter.com/mFnfUWomwC
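A minimal sketch of the kind of analysis described in these two tweets (illustrative code, not the authors' tooling): find an approximate fixed point of an RNN update by minimizing the state speed, then eigendecompose the recurrent Jacobian there. The vanilla tanh RNN and all hyperparameters below are assumptions.

```python
import jax
import jax.numpy as jnp
import numpy as np

def rnn_update(h, W, b):
    # hypothetical vanilla RNN step with zero input, used only for illustration
    return jnp.tanh(W @ h + b)

def find_fixed_point(h0, W, b, n_steps=2000, lr=1e-1):
    # approximate fixed point: minimize the "speed" q(h) = 0.5 * ||F(h) - h||^2
    speed = lambda h: 0.5 * jnp.sum((rnn_update(h, W, b) - h) ** 2)
    grad_speed = jax.jit(jax.grad(speed))
    h = h0
    for _ in range(n_steps):
        h = h - lr * grad_speed(h)
    return h

n = 32
W = 0.9 * jax.random.normal(jax.random.PRNGKey(0), (n, n)) / jnp.sqrt(n)
b = jnp.zeros(n)

h0 = 0.1 * jax.random.normal(jax.random.PRNGKey(1), (n,))
h_star = find_fixed_point(h0, W, b)
J = jax.jacobian(lambda h: rnn_update(h, W, b))(h_star)  # recurrent Jacobian at the fixed point
eigvals, eigvecs = np.linalg.eig(np.asarray(J))

# Eigenvalues with magnitude near 1 mark slow, nearly marginal directions
# (a line attractor shows up as one such mode); per the (4/4) tweet, input or
# embedding vectors can then be projected onto these leading eigenvectors.
print(np.sort(np.abs(eigvals))[-3:])
```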
(1/4)