Miles Cranmer

@MilesCranmer

Trying to automate astro/physics research with AI. I'm an astrophysics PhD student designing interpretable and probabilistic ML.

Princeton, NJ
Joined September 2011

Tweets


  1. Pinned Tweet
    Dec 14, 2019

    My talk video and slides are up here; starting at 43:35, I explain our work extracting learned physical laws from graph networks in . I try to make it digestible for a broad audience!

  2. Jan 31

    This is a nice package for making pyplot animations more intuitive: all you do is call "camera.snap()" every time you redraw the plot.

  3. Retweeted
    Jan 29

    This is the sharpest movie of the Sun ever made. Even at this fine resolution, the scale is enormous; each plasma cell here is about the size of Texas. via

  4. Retweeted
    Dec 6, 2019
  5. Jan 28

    Want to do distributed hyperopt with nothing but a shared folder between nodes/processes? Try this example I wrote: Quick and dirty but: no extra packages, no master nodes, and complete resilience to crashes and restarts.

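The example link in the tweet is elided; below is a minimal sketch of the idea under the same constraints (stdlib only, no master node). Each worker independently samples a random configuration, runs the trial, and atomically publishes the result into the shared folder, so any worker can crash or restart without coordination. All names and the objective here are hypothetical.

```python
import json
import os
import random
import tempfile

SHARED_DIR = "hyperopt_results"  # hypothetical folder visible to all nodes


def objective(lr, width):
    # Stand-in for a real training run returning a validation loss.
    return (lr - 0.01) ** 2 + (width - 64) ** 2 / 1e4


def run_one_trial(shared_dir=SHARED_DIR):
    os.makedirs(shared_dir, exist_ok=True)
    params = {"lr": 10 ** random.uniform(-4, -1),
              "width": random.choice([16, 32, 64, 128])}
    loss = objective(**params)
    # Write to a temp file, then rename: rename is atomic on POSIX, so a
    # crash mid-write never leaves a half-written result behind.
    fd, tmp = tempfile.mkstemp(dir=shared_dir, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump({"params": params, "loss": loss}, f)
    unique = "trial_%030x.json" % random.getrandbits(120)
    os.rename(tmp, os.path.join(shared_dir, unique))


def best_result(shared_dir=SHARED_DIR):
    # Any process can read the folder at any time to get the current best.
    results = []
    for name in os.listdir(shared_dir):
        if name.endswith(".json"):
            with open(os.path.join(shared_dir, name)) as f:
                results.append(json.load(f))
    return min(results, key=lambda r: r["loss"])
```

Random search over a shared directory like this has no single point of failure: the folder itself is the only shared state.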
  6. Jan 23

    If you use Stack Overflow for coding, try my vim plugin "Googling Stack Overflow" to do queries and paste code directly in the editor

  7. Jan 21

    BayesNet seems like a really nice LaTeX package for drawing clean probabilistic graphical models with minimal effort. Wish I'd heard about it earlier!

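A minimal sketch of what this looks like with the bayesnet TikZ library (assuming a standard TeX Live setup): a latent parameter, an observed node, and a plate over repeated observations.

```latex
\documentclass{standalone}
\usepackage{tikz}
\usetikzlibrary{bayesnet}

\begin{document}
\begin{tikzpicture}
  % Latent (unshaded) and observed (shaded) nodes
  \node[latent] (theta) {$\theta$};
  \node[obs, below=of theta] (x) {$x_n$};
  % Edge from the latent variable to the observation
  \edge {theta} {x};
  % Plate over the N repeated observations
  \plate {N} {(x)} {$N$};
\end{tikzpicture}
\end{document}
```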
  8. Jan 18

    Awesome interactive demos of different MCMC algorithms:

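The demo link is elided; for reference alongside it, here is a minimal random-walk Metropolis sampler (the simplest of the algorithms such demos typically cover). The standard-normal target and all tuning choices are illustrative.

```python
import numpy as np


def metropolis(log_prob, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + eps,
    accept with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    samples = np.empty(n_steps)
    n_accept = 0
    for i in range(n_steps):
        proposal = x + step_size * rng.normal()
        # Accept/reject in log space for numerical stability.
        if np.log(rng.uniform()) < log_prob(proposal) - log_prob(x):
            x = proposal
            n_accept += 1
        samples[i] = x
    return samples, n_accept / n_steps


# Standard normal target: log p(x) = -x^2/2 + const.
samples, accept_rate = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=5000)
```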
  9. Retweeted
    Jan 10
    Replying to:

    and here's a million parameters. 10 chains/100 samples. About 30% of proposals are accepted.

  10. Jan 10

    3/ Answering 's question about the weight distribution (+ 's comment on modality?), here's what the joint distribution looks like for 4 weights (16,000 samples) from the HMC. It's not converged, and I doubt it's meaningful, but it's interesting. It's not uncorrelated...

  11. Retweeted
    Jan 10
    Replying to

    Thanks . Do you know of good references for statistical implications of these properties? Any thoughts on SWAG?

  12. Jan 10

    2/2 Just wondering why Monte Carlo dropout/Bayes by Backprop are the most common (?) methods, despite their assumption of the weight posterior being uncorrelated. Directly MCMC'ing the weights seems like the simpler thing to do... and also seems to work well

  13. Jan 10

    1/2 Why isn't it more common to do explicit Hamiltonian MCMC on a Bayesian neural network's weights, with, e.g., the initial condition set to the loss minimum found via SGD? I'm playing around with one in JAX and it seems to be working reasonably well even with 5 chains:

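The JAX code from the tweet isn't linked; below is a NumPy sketch of the underlying HMC transition (leapfrog integration plus a Metropolis correction on the energy error), run on a toy Gaussian "posterior" with analytic gradients standing in for autodiff. Initializing the chain from an SGD solution, as the tweet suggests, just means choosing `x` accordingly.

```python
import numpy as np


def hmc_step(x, log_prob, grad_log_prob, rng, step_size=0.1, n_leapfrog=20):
    """One HMC transition: sample momentum, integrate Hamiltonian
    dynamics with leapfrog, then accept/reject on the energy error."""
    p = rng.normal(size=x.shape)
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog: half momentum step, alternating full steps, half momentum step.
    p_new += 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_prob(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(x_new)
    # Metropolis correction on the Hamiltonian H = -log p(x) + |p|^2 / 2.
    h_old = -log_prob(x) + 0.5 * p @ p
    h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return x_new, True
    return x, False


# Toy 10-dimensional standard normal "posterior" (analytic gradient).
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x

rng = np.random.default_rng(0)
x = rng.normal(size=10)  # stand-in for SGD-found weights as the initial condition
chain, accepts = [], 0
for _ in range(500):
    x, accepted = hmc_step(x, log_prob, grad_log_prob, rng)
    accepts += accepted
    chain.append(x.copy())
chain = np.array(chain)
```

For an actual BNN posterior, `log_prob` would be the (unnormalized) log joint of the weights given the data, with its gradient supplied by autodiff (e.g., `jax.grad`) rather than by hand.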
  14. Retweeted
    Jan 9

    Training Neural SDEs: We worked out how to do scalable reverse-mode autodiff for stochastic differential equations. This lets us fit SDEs defined by neural nets with black-box adaptive higher-order solvers. With , and .

  15. Jan 8

    7/7 Finally, thanks et al. for FFJORD; it's a really nice algorithm and code!

  16. Jan 8

    6/7 This approach is very applicable to iteratively learning distributions in latent spaces in the presence of noise or structured bias (dust) as we've shown in the paper.

  17. Jan 8

    5/7 More generally, I think normalizing flows are very underused in astro. One often encounters highly non-Gaussian distributions in high-dimensional spaces, and GMMs/grid-based methods will hurt accuracy (and GPs are hard to scale and not as architecturally flexible).

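As a toy illustration of the change-of-variables machinery behind flows like FFJORD (this is a single invertible affine map fit in closed form, not the continuous-time flow from the thread's paper):

```python
import numpy as np


class AffineFlow:
    """z = (x - mu) / sigma maps data to a standard-normal base density;
    log p(x) = log N(z; 0, 1) - log sigma  (change of variables)."""

    def __init__(self, mu, log_sigma):
        self.mu, self.log_sigma = mu, log_sigma

    def log_prob(self, x):
        z = (x - self.mu) * np.exp(-self.log_sigma)
        base = -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)
        return base - self.log_sigma  # |dz/dx| = 1 / sigma

    def sample(self, n, rng):
        # Invert the map: draw z from the base and push it through x = mu + sigma * z.
        return self.mu + np.exp(self.log_sigma) * rng.normal(size=n)


# "Fit" by maximum likelihood; for this flow the optimum is just the
# empirical mean and log standard deviation of the data.
rng = np.random.default_rng(0)
data = 3.0 + 0.5 * rng.normal(size=2000)
flow = AffineFlow(mu=data.mean(), log_sigma=np.log(data.std()))
```

Real flows stack many such invertible maps (or integrate an ODE, as in FFJORD) so the composed transform can represent the highly non-Gaussian densities the thread describes, while the log-determinant correction keeps the density exact.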
  18. Jan 8

    4/7 I chose n=128 for Extreme Deconvolution based on its previous uses, e.g., arxiv:1706.05055. Note also that this example observed CMD does not have any dust or iterative estimates for simplicity. The example in our paper does, however.

  19. Jan 8

    3/7 To recreate this figure (you can optionally re-train the normalizing flow if you wish, or load it; it isn't actually converged yet), here's some code: . The flow is vanilla except for treatment of noise.

  20. Jan 8

    2/7 The science case for work like this is building more accurate and scalable data-driven priors on photometry to improve distance estimates to stars (or other sources) in large scale surveys. Work with , Lauren Anderson, , .

  21. Jan 8

    1/7 A figure I included on my poster demonstrating normalizing flows (this one uses 's group's FFJORD) vs. extreme-deconvolution-generated GMMs as a reconstruction method for noisy color-magnitude diagrams in astronomy. (Our paper here: )

