Niru Maheswaranathan

@niru_m

Trying to separate signal from the noise; research engineer at Google Brain. Formerly , , . ’s +1. Opinions my own. ⚽️☕️👨🏾‍💻

Mountain View, CA
Joined March 2009

Media

  1. 1 Nov 2019

    Submission deadline: 31 Oct, 11:59 **Pacific Time**. Registration deadline: 31 Jan, 11:59 **Eastern Time**. ()

  2. 1 Nov 2019

    itermplot () is a matplotlib backend that displays directly in your terminal (iterm2). Really awesome resource if you (like me) enjoy working directly from the IPython repl!
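A minimal sketch of enabling it, following itermplot's documented `MPLBACKEND` mechanism (assumes `pip install itermplot` and iTerm2 as the terminal):

```python
import os

# itermplot is selected through matplotlib's MPLBACKEND environment
# variable, which must be set *before* matplotlib is first imported.
os.environ["MPLBACKEND"] = "module://itermplot"

# After that, ordinary matplotlib code renders inline in the terminal:
#   import matplotlib.pyplot as plt
#   plt.plot([0, 1, 2], [0, 1, 4])
#   plt.show()
print(os.environ["MPLBACKEND"])
```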

  3. 9 Sep 2019

    The greatest numerical algorithms of the 20th century, according to Nick Trefethen circa 2005 ()

  4. 27 Jun 2019

    Finally, we can understand how the network processes individual words (tokens) by looking at projections of the embedding vectors onto the principal eigenvectors of the system. Overall, we think these tools will help us demystify and understand how recurrent networks work! (4/4)

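A minimal numpy sketch of that projection step, using random stand-ins for the trained network's embeddings and eigenvectors (all values here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
vocab, n = 5, 8

# Hypothetical token embeddings (one row per word) and a unit-norm
# principal eigenvector of the recurrent dynamics.
E = rng.normal(size=(vocab, n))
v = rng.normal(size=n)
v /= np.linalg.norm(v)

# Projecting each embedding onto the eigenvector gives one scalar per
# token: how strongly that token pushes the hidden state along the
# slow (attractor) direction.
scores = E @ v
print(scores.round(2))
```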
  5. 27 Jun 2019

    The network dynamics are organized around a roughly 1-D approximate line attractor, which we identify by studying the eigendecomposition of the recurrent Jacobian of the dynamics at approximate fixed points. (3/4)

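A toy numerical sketch of this recipe, with a random contractive vanilla RNN standing in for the trained sentiment networks (the fixed point is found by simply iterating the map, which suffices for a contraction):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Toy vanilla RNN update h_{t+1} = F(h_t) at a fixed (zero) input.
W = rng.normal(size=(n, n)) * 0.4 / np.sqrt(n)   # contractive recurrence
b = rng.normal(size=n) * 0.1

def F(h):
    return np.tanh(W @ h + b)

# Approximate fixed point F(h*) ~ h*: iterate the contractive map.
h = rng.normal(size=n)
for _ in range(500):
    h = F(h)

# Recurrent Jacobian at the fixed point: J = diag(1 - tanh(a)^2) @ W.
a = W @ h + b
J = np.diag(1.0 - np.tanh(a) ** 2) @ W

# Eigendecomposition: eigenvalue magnitudes near 1 flag slow, nearly
# marginal directions -- the signature of an approximate line attractor.
eigvals, _ = np.linalg.eig(J)
print(np.sort(np.abs(eigvals))[::-1])
```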
  6. 27 Jun 2019

    We analyze recurrent networks trained to perform sentiment classification, a standard natural language processing (NLP) task. We find that RNNs trained on this task are surprisingly interpretable using tools from dynamical systems. (2/4)

  7. 13 Jun 2019

    Sneak preview of our poster tonight (#146) at ! (Read the full paper at )

  8. 2 Mar 2019

    What are useful metrics for comparing representations in RNNs with the brain? How sensitive are these to architecture choices? Find out at our poster (III-51) tonight at ! Awesome collaboration with

  9. 8 Nov 2018

    Great overview of recent work analyzing optimization trajectories of deep networks. In particular, for deep linear networks, you can get linear convergence to global minima if the initial weights are 'aligned' (see post for details)

  10. 24 Oct 2018

    New work: Learning optimizers with less mind-numbing pain! With improvements to stabilize meta-training, we find that we are able to train optimizers that beat well-tuned baselines on wall-clock time. (1/2)

  11. 20 Oct 2018

    Next up in great talks: Ben Recht on connections between control theory and reinforcement learning. (ICML 2018 tutorial)

  12. 23 Sep 2018

    If you’re into optimization, this talk from Mike Jordan is a really nice overview of a lot of interesting recent work:

  13. 3 Mar 2018

    Check out our poster tonight at (III-8). We show that deep networks trained to reproduce retinal responses to natural scenes also exhibit a whole host of previously published phenomena. The same models trained on white noise surprisingly don't have this property! 🤯

  14. 1 Jan 2018

    The last figure from this paper () suggests that ~80-90% of the variance in parameter updates can be captured by just two dimensions--this seems insane to me. Is NN optimization inherently this low-dimensional?
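One hypothetical way to compute such a number yourself: record the per-step parameter updates along a training run, then ask how much of their variance the top two principal components explain. A toy quadratic loss stands in for the deep networks in the paper:

```python
import numpy as np

# Gradient descent on 0.5 * sum(curv * w^2), an ill-conditioned
# quadratic, while recording every parameter update.
d = 50
curv = np.linspace(0.01, 1.0, d)   # eigenvalues of the loss Hessian
w = np.ones(d)

updates = []
lr = 0.9
for _ in range(200):
    step = -lr * curv * w          # gradient step
    updates.append(step)
    w = w + step

U = np.array(updates)
U -= U.mean(axis=0)

# PCA via SVD: explained-variance ratio of the leading components.
s = np.linalg.svd(U, compute_uv=False)
ratio = s ** 2 / np.sum(s ** 2)
print(ratio[:2].sum())
```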

  15. 5 Nov 2017

    I think people heavily underestimate the impact of international travel on carbon emissions—relevant for conferences

  16. 27 Oct 2017

    3D printed adversarial examples?! Seems to work from multiple angles. (Paper: )

  17. 9 Sep 2017

    "Definite optimism as human capital" by -- insightful read on the factors that drive human welfare and economic growth

  18. 31 Jul 2017

    Some evening visualization fun: traversing a sharp ridge in a toy error landscape (using noise perturbations to approximate gradients)
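A small sketch of that gradient-approximation idea, using an antithetic Gaussian-perturbation (zeroth-order) estimator on a hypothetical ridged loss; the original animation's landscape and estimator details are not given in the tweet:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Hypothetical toy landscape with a sharp ridge along x[1].
    return x[0] ** 2 + 50.0 * x[1] ** 2

def noise_grad(f, x, sigma=1e-3, samples=64):
    """Antithetic Gaussian-perturbation estimate of grad f at x."""
    g = np.zeros_like(x)
    for _ in range(samples):
        eps = rng.normal(size=x.shape)
        g += (f(x + sigma * eps) - f(x - sigma * eps)) / (2.0 * sigma) * eps
    return g / samples

# Descend the ridge using only noisy zeroth-order gradient estimates.
x = np.array([1.0, 0.5])
for _ in range(200):
    x = x - 5e-3 * noise_grad(f, x)
print(f(x))
```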

  19. 31 Mar 2017

    This section from “A neural networks approach … to ask Barry Cottonfield to the junior prom” is too good 😂

  20. 27 Mar 2017

    💡 New paper with using proximal algorithms to learn nonlinear cascade models in the retina is up!
