Teddy Ampian

@TeddyAmpian

UWisc Data Science Master's student

Denver, CO
Joined June 2011

Tweets


  1. Retweeted
    Feb 3

    Oldies but goldies: G. H. Golub, Christian Reinsch, "Singular value decomposition and least squares solutions," 1970. The most popular algorithm for computing the SVD efficiently.
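The Golub–Reinsch algorithm underpins the SVD routines in modern linear-algebra libraries (LAPACK, and through it NumPy's `np.linalg.svd`). A minimal sketch of what the decomposition provides, including the least-squares use the paper's title mentions; the matrix and right-hand side here are illustrative textbook values, not from the tweet:

```python
import numpy as np

# A small matrix to decompose; a classic textbook example whose
# singular values come out to exactly 5 and 3.
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# Thin SVD: A = U @ diag(s) @ Vt, singular values s sorted descending.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruct A from its factors to confirm the decomposition.
assert np.allclose(U @ np.diag(s) @ Vt, A)

# Least squares via the SVD: minimize ||A x - b|| with the pseudoinverse
# built from the factors (valid here since A has full row rank).
b = np.array([1.0, 0.0])
x = Vt.T @ np.diag(1.0 / s) @ U.T @ b
assert np.allclose(x, np.linalg.pinv(A) @ b)
```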
  2. Retweeted

    Reevaluating the Role of Persistent Neural Activity in Short-Term Memory, by Nicolas Masse, Matthew Rosen, & David Freedman
  3. Retweeted
    Feb 1

    Just came across ’s great blog and really enjoyed reading one of his posts on long short-term memory (LSTM): Made me interested in learning more about .
  4. Retweeted
    Feb 1

    "So I have decided to stop attempting to generate new mathematics, and concentrate instead on carefully checking “known” mathematics on a computer."
  5. Retweeted
    Feb 1

    A practical definition of opportunity cost: If you spend too much time working on good things, then you don’t have much time left to work on great things. Understanding opportunity cost means eliminating good uses of time. And that's what makes it hard.
  6. Retweeted
    Jan 28

    +, Σ, and ∫ are just different evolutions of the same Pokémon
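The quip points at a real progression: + combines two terms, Σ combines finitely many, and ∫ is the limit of finite sums. A tiny illustrative sketch (the integrand and interval are my choice, not from the tweet), approximating ∫₀¹ x² dx = 1/3 with Riemann sums:

```python
def riemann(f, a, b, n):
    """Left Riemann sum of f over [a, b] with n equal subintervals."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

f = lambda x: x * x

# As n grows, the finite sum Σ approaches the integral ∫ = 1/3.
for n in (10, 100, 1000):
    print(n, riemann(f, 0.0, 1.0, n))

assert abs(riemann(f, 0.0, 1.0, 100_000) - 1 / 3) < 1e-4
```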
  7. Retweeted
    Jan 27

    I went and wrote an introduction to Bayesian Neural Networks based on a lecture by David MacKay. Included are examples in JAX/Python. Give us a click so we can finally hit some of our OKRs for this quarter :P
  8. Retweeted
    Jan 25

    Looking for a post-doc. The research topics can be very broad, including graph representation learning, graph neural networks, drug discovery, knowledge graphs, deep generative models, and natural language understanding. Please email me if you are interested.
  9. Retweeted
    Jan 28

    How symmetric is too symmetric for large quantum speedups?
  10. Retweeted
    Jan 28

    Heya 👋 Want to get research experience at CMU? If you're interested in PL, distributed systems, software engineering, etc., we have a program that pays you to come learn to do research with us over the summer! Happy to answer Qs about it! RTs welcome🤗
  11. Retweeted

    You know how you sometimes grab a travel guide to a place you've lived in a while and know well, and you read a confident and savvy-sounding description of something that's utterly unrecognizable to you? That's how I feel about most tech journalism.
  12. Retweeted
    Jan 22

    My blog post on how & why we use convolutional neural networks as a model of the visual system is probably the most read thing I've ever written, and it's now been expanded & updated into a proper review article, complete with 136 references & 5 new figures!
  13. Retweeted
    Jan 15

    🎉 2019 🎉 was quite the year for Deep Reinforcement Learning. In today's blog post I list my top 10 papers 🦄💻🧠 What was your favourite paper? Let me know!
  14. Retweeted
    Jan 14

    Norway today adopted a national strategy for artificial intelligence. Congratulations! Very glad that the government will strengthen basic ICT research and invest in education! Thanks !
  15. Retweeted

    Underrated question for better planning and prioritization: Will this problem go away if I do nothing? More frequently than you suspect, it will.
  16. Retweeted
    Jan 2

    Seeing Twitter debate about the replicability of the new DeepMind study of ML for breast cancer screening vs. the NYU study from October: Did some light digging to compare/contrast… 1/n
  17. Retweeted
    Dec 26, 2019

    Bayesian methods are *especially* compelling for deep neural networks. The key distinguishing property of a Bayesian approach is marginalization instead of optimization, not the prior or Bayes' rule. This difference will be greatest for underspecified models like DNNs. 1/18
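The thread's distinction can be made concrete in one dimension. A toy sketch (the tiny tanh "network" and all numbers are illustrative, not from the thread): optimization commits to the single most probable weight (MAP), while marginalization averages the prediction over the whole posterior:

```python
import numpy as np

# Toy one-parameter "network": f(x; w) = tanh(w * x). Three data points
# leave the posterior over w wide, a miniature of the underspecification
# the thread attributes to DNNs.
x = np.array([0.3, 0.8, 1.2])
y = np.array([0.2, 0.7, 0.8])
sigma = 0.3  # assumed observation-noise std

# Unnormalized log posterior on a grid: N(0, 2^2) prior + Gaussian likelihood.
w_grid = np.linspace(-5.0, 5.0, 2001)
log_post = -0.5 * (w_grid / 2.0) ** 2
log_post += -0.5 * np.sum((np.tanh(np.outer(w_grid, x)) - y) ** 2, axis=1) / sigma**2
post = np.exp(log_post - log_post.max())
post /= post.sum()

x_new = 0.5

# Optimization: predict with the single most probable weight (MAP).
w_map = w_grid[np.argmax(post)]
pred_map = np.tanh(w_map * x_new)

# Marginalization: average the prediction over the whole posterior.
pred_bayes = np.sum(post * np.tanh(w_grid * x_new))

print(pred_map, pred_bayes)  # the two predictions need not coincide
```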
  18. Retweeted
    Dec 29, 2019

    Some people in ML share the illusion that models expressed symbolically will necessarily/magically generalise better compared to, for example, parametric model families fit on the same data. This belief seems to come from a naive understanding of mathematics 1/5
  19. Retweeted
    Dec 23, 2019

    Vis-à-vis the Marcus / Bengio debate, I just don't buy the implied strict separation between "System 1" and "System 2".
  20. Retweeted
    Dec 23, 2019

    Did Marcus quote a 90s paper just to flex on Bengio 🤔?
