Tweets

You have blocked @SingularMattrix

Are you sure you want to view these Tweets? Viewing Tweets won't unblock @SingularMattrix

  1. Retweeted

    "I return to my meditations and the long-interrupted duty of my correspondence, and I beg your forgiveness for not having answered your last letter sooner. It is usually the case that I postpone longer than I ought the things with which I wish to take the most pains." —Leibniz

  2. Retweeted
    Jan 29

    “I’ve CC’d in my boss” - professional - corporate - mildly threatening “You wanna say that in front of Greg?” - confident - threat level 9000 - who is Greg and what is he capable of

  3. Retweeted
    Jan 22

    The whole thread on BNNs and blog post by and reminded me of the "First, you rob a bank..." characterization by Yasser Abu Mostafa. Apologies to my Bayesian friends who may find it unfair.

  4. Retweeted
    Jan 24

    We just released a large public dataset of >10k echocardiogram videos with clinical annotations. Great resource for the medical ML community!

  5. Retweeted
    Jan 24

    Research on the Neural Tangent Kernel (NTK) almost exclusively uses a non-standard neural network parameterization, where activations are divided by sqrt(width), and weights are initialized to have variance 1 rather than variance 1/width.

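The contrast can be sketched in JAX for a single layer (a minimal illustration with names of my own, not from the thread). With the same Gaussian draw, the two parameterizations compute the same function at initialization but place the 1/sqrt(width) factor in different spots, which changes how gradients scale with width:

```python
import jax
import jax.numpy as jnp

fan_in, fan_out = 512, 256
key = jax.random.PRNGKey(0)

# Standard parameterization: weights drawn with variance 1/fan_in,
# layer computes W @ x directly.
W_std = jax.random.normal(key, (fan_out, fan_in)) / jnp.sqrt(fan_in)

# NTK parameterization: weights drawn with variance 1,
# layer divides the pre-activation by sqrt(fan_in).
W_ntk = jax.random.normal(key, (fan_out, fan_in))

x = jax.random.normal(jax.random.PRNGKey(1), (fan_in,))

out_std = W_std @ x
out_ntk = (W_ntk @ x) / jnp.sqrt(fan_in)
```

At init the two outputs coincide (same draw, factor moved), but gradients with respect to the weights differ by the sqrt(width) factor, which is what makes the infinite-width NTK analysis cleaner.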
  6. Retweeted
    Jan 24

    Check out the new release of NumPyro. This includes an implementation of 's Block Neural Autoregressive Flow in JAX and its usage in MCMC. Also, let us know what you think about the new Sample Adaptive MCMC sampler.

  7. Retweeted
    Jan 23

    Hello, world! This is the official account for NumPy, the fundamental package for scientific computing with Python. Follow us for news, info and content related to NumPy!

  8. Retweeted
    Jan 22

    1/5 In one of my Residency projects we used CNNs to reparameterize structural optimization (w/ ). Our approach worked best on 99/116 structures. I just finished a blog post with GIFs, visualizations, and links to code + Colab.

  9. Retweeted
    Jan 22

    Have you ever wondered what the ML frameworks of the '20s will be? In this essay, I examine the directions AI research might take and the requirements they impose, concluding with an overview of what I believe to be two strong candidates: JAX and S4TF.

  10. Retweeted
    Jan 21

    Interesting discussion on JVPs (Jacobian-vector products) used for forward-mode differentiation vs. VJPs used for reverse-mode differentiation in JAX, and the pushforward / pullback views.

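The distinction can be sketched directly with `jax.jvp` and `jax.vjp` (toy function chosen for this illustration): a JVP pushes a tangent vector forward, computing J·v in one forward pass, while a VJP pulls a cotangent back, computing uᵀ·J as in backprop.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Toy R^2 -> R^2 function for the illustration.
    return jnp.array([x[0] * x[1], jnp.sin(x[0])])

x = jnp.array([1.0, 2.0])

# Forward mode: one pass computes J @ v alongside f(x).
v = jnp.array([1.0, 0.0])
y, Jv = jax.jvp(f, (x,), (v,))

# Reverse mode: jax.vjp returns f(x) and a pullback closure
# that maps a cotangent u to u @ J (here a row of J, since u = e1).
u = jnp.array([1.0, 0.0])
y2, pullback = jax.vjp(f, x)
(uJ,) = pullback(u)
```

JVPs cost one forward pass per input direction; VJPs cost one backward pass per output direction, which is why reverse mode wins for scalar losses with many parameters.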
  11. Retweeted
    Jan 20

    Wish more early-stage frameworks would adopt this philosophy. 'Understanding' is a far bigger bottleneck than the availability of XYZ feature. And Copy-Paste is a reality that should be embraced and leveraged.

  12. Retweeted

    We used JAX for competitive gradient descent (CGD) with Hessian-vector products. Mixed-mode differentiation in JAX makes this efficient (just twice the cost of backprop). We used CGD for training GANs and for constrained problems in RL. This library will be very useful.

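The mixed-mode trick the tweet mentions can be sketched in JAX (toy loss of my own, not the CGD code): forward-mode differentiation of a reverse-mode gradient yields a Hessian-vector product at roughly twice the cost of one gradient, without ever materializing the Hessian.

```python
import jax
import jax.numpy as jnp

def loss(w):
    # Toy scalar loss for the illustration.
    return jnp.sum(jnp.sin(w) ** 2)

def hvp(f, w, v):
    # Forward-over-reverse: differentiate grad(f) along direction v.
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

w = jnp.array([0.5, -1.0, 2.0])
v = jnp.array([1.0, 0.0, 0.0])
hv = hvp(loss, w, v)
```

For this separable loss the Hessian is diag(2·cos(2wᵢ)), so the result can be checked against `jax.hessian(loss)(w) @ v`.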
  13. Retweeted
    Jan 20

    Flax: A neural network library for JAX designed for flexibility (pre-release)

  14. Retweeted
    Jan 19
  15. Retweeted
    Jan 19

    Ok, I got a pretty good port of these Maxwell equation figures into Python. Heavy use of JAX and . I'll tidy up the code and post it here later.

  16. Retweeted
    Jan 18

    Awesome interactive demos of different MCMC algorithms:

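For reference, the simplest algorithm such demos typically animate, random-walk Metropolis, fits in a few lines (a sketch of my own, not the demos' code):

```python
import numpy as np

def metropolis(log_prob, x0, n_steps, step_size=0.5, seed=0):
    # Random-walk Metropolis: propose x' = x + noise, accept with
    # probability min(1, p(x') / p(x)).
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        proposal = x + step_size * rng.standard_normal()
        if np.log(rng.uniform()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Target: standard normal (log density up to a constant).
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
```

The fancier samplers in such demos (HMC, NUTS) differ mainly in how the proposal is generated, not in this accept/reject skeleton.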
  17. Retweeted
    Jan 17

    I JUST GOT REJECTED FOR SUBMITTING A PERFECT PAPER!

  18. Retweeted
    Jan 17

    This review on normalizing flows is excellent. It's full of clear writing, precise claims, and useful connections.

  19. Retweeted
    Jan 16

    Introducing Reformer, an efficiency-optimized architecture based on the Transformer model for language understanding that can handle context windows of up to 1 million words, all on a single accelerator with only 16GB of memory. Read all about it ↓

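Reformer's memory savings come from LSH attention: queries and keys are hashed so that similar vectors land in the same bucket, and attention is computed only within buckets. A simplified sketch of the angular-LSH bucketing idea (my own illustration, not the paper's exact scheme, which uses shared random rotations per round):

```python
import numpy as np

def lsh_buckets(vectors, n_buckets, seed=0):
    # Angular LSH: project onto random directions and take the argmax
    # over [proj, -proj]; nearby vectors tend to share a bucket.
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((vectors.shape[1], n_buckets // 2))
    proj = vectors @ R
    return np.argmax(np.concatenate([proj, -proj], axis=1), axis=1)

x = np.random.default_rng(1).standard_normal((8, 16))
buckets = lsh_buckets(x, n_buckets=4)
```

Restricting attention to within-bucket pairs is what cuts the quadratic O(L²) cost, at the price of approximate attention.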
  20. Retweeted

    My answer to "How important are techniques like probabilistic programming, automatic differentiation, and ODE solvers for the future of scientific computing? Will they outlive the deep learning hype?"


