Theodore

@taehobyo

What should I say

Seocho-gu, Republic of Korea
Joined December 2009

Tweets


  1. Retweeted
    Jan 27

    - 2007, The Road to Quantum Artificial Intelligence
    - 2015, Quantum algorithms: an overview
    - 2018, Machine learning & artificial intelligence in the quantum domain: a review of recent progress (download not pdf)
  2. Retweeted
    Dec 24, 2019

    Some folks still seem confused about what deep learning is. Here is a definition: DL is constructing networks of parameterized functional modules & training them from examples using gradient-based optimization....
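The definition in the tweet above can be illustrated in a few lines. This is a generic sketch (a single linear module trained on synthetic data with plain numpy), not code from the tweet's author:

```python
import numpy as np

# One "parameterized functional module" (a linear map), trained from
# examples by gradient-based optimization of a squared-error loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w

w = np.zeros(3)                           # the module's parameters
lr = 0.1
for _ in range(200):
    pred = X @ w                          # forward pass
    grad = 2 * X.T @ (pred - y) / len(y)  # gradient of the mean squared error
    w -= lr * grad                        # gradient-based update

print(np.round(w, 3))
```

Deep learning stacks many such modules and uses automatic differentiation instead of a hand-derived gradient, but the training loop has this same shape.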
  3. Retweeted
    Nov 13, 2019

    We've just open-sourced the code for Stacked Capsule Autoencoders (NeurIPS '19): joint work with , and
  4. Retweeted

    80 years of AI research represented as connectionist (neural nets) vs. symbolic (rule-based). v/
  5. Retweeted
    Oct 27, 2019

    DiffTaichi: Differentiable Programming for Physical Simulation “Using our differentiable programs, neural network controllers are typically optimized within only tens of iterations.” When we have good priors about the world, it makes sense to use them!
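The "tens of iterations" claim in the tweet comes from being able to differentiate through the simulator itself. A toy version of the idea, with a trivial one-parameter simulator and a hand-derived gradient (this is only an illustration, not DiffTaichi):

```python
# A differentiable "physics simulation": a particle moving at constant
# velocity v0, integrated with explicit Euler steps. Because the loss
# is differentiable w.r.t. v0, a controller parameter can be optimized
# directly with gradient descent in a handful of iterations.
dt, steps, target = 0.01, 100, 3.0

def simulate(v0):
    x = 0.0
    for _ in range(steps):           # explicit Euler integration
        x += v0 * dt
    return x

def grad_loss(v0):                   # d/dv0 of (simulate(v0) - target)**2
    return 2.0 * (simulate(v0) - target) * steps * dt

v0 = 0.0
for _ in range(30):                  # "tens of iterations"
    v0 -= 0.4 * grad_loss(v0)

print(round(simulate(v0), 4))        # → 3.0
```

Real differentiable simulators compute gradients automatically through thousands of simulation steps, but the optimization loop is exactly this.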
  6. Retweeted
    Sep 13, 2019

    The paper with its finding that the worst Stat forecasting method was more accurate than the best of the ML ones has passed the 100,000 mark of views/downloads. None of those who have read/downloaded it has challenged its finding. We are still waiting!
  7. Retweeted
    Sep 16, 2019

    "Are labels required for improving adversarial robustness?" TL;DR: no! With only 10% of CIFAR10 labels, UAT has almost no drop in robust accuracy. With additional unlabeled data, UAT obtains SOTA robust accuracy. Paper: Code:
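For context on the adversarial-robustness setting in the tweet: an adversarial example is an input perturbed in the direction that increases the model's loss. A minimal sketch using the fast gradient sign method on a fixed linear classifier (illustrative only; UAT itself is more involved and additionally exploits unlabeled data):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([1.0, -2.0, 0.5])        # a fixed linear classifier
x = np.array([0.3, 0.1, -0.2])        # a clean input
y = 1.0                               # its true label

# Gradient of the logistic loss w.r.t. the INPUT x (not the weights).
grad_x = (sigmoid(w @ x) - y) * w

# FGSM step: move each coordinate by epsilon in the loss-increasing direction.
x_adv = x + 0.1 * np.sign(grad_x)

print(sigmoid(w @ x), sigmoid(w @ x_adv))  # confidence in the true class drops
```

Adversarial training then augments the training set with such perturbed inputs so the model stays correct on them.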
  8. Retweeted
    Aug 31, 2019
  9. Retweeted
    Sep 6, 2019

    Thrilled to be able to share what I've been working on for the last year - solving the fundamental equations of quantum mechanics with deep learning!
  10. Retweeted
    Sep 6, 2019

    In our new blog post, we review how brains replay experiences to strengthen memories, and how researchers use the same principle to train better AI systems:
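The "replay" principle mentioned in the tweet is standard in reinforcement learning: store past transitions and learn from random minibatches of them rather than only from the most recent experience. A minimal sketch (generic, not from the blog post):

```python
import random
from collections import deque

class ReplayBuffer:
    """Stores transitions; old ones are evicted once capacity is reached."""

    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)

    def add(self, transition):
        self.buffer.append(transition)

    def sample(self, batch_size):
        # Uniform random minibatch, sampled without replacement by index.
        idx = random.sample(range(len(self.buffer)), batch_size)
        return [self.buffer[i] for i in idx]

buf = ReplayBuffer(capacity=1000)
for t in range(50):
    buf.add((t, "action", 1.0, t + 1))  # (state, action, reward, next_state)

batch = buf.sample(8)
print(len(batch))                       # → 8
```

Sampling uniformly from the buffer decorrelates consecutive updates, which is the property linked to hippocampal replay in the post.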
  11. Retweeted
    Aug 31, 2019

    Can we scale gradient-based meta-learning? In Warped Gradient Descent, we meta-learn a geometry over the joint task parameter distribution. We can learn optimisers for RNNs and against catastrophic forgetting. W/ A. Rusu, , H. Yin, . 👉
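For readers unfamiliar with gradient-based meta-learning: the generic pattern is an inner loop that adapts to a sampled task with a few gradient steps, and an outer loop that improves across tasks. The sketch below is a first-order variant on a toy 1-D regression family; Warped Gradient Descent differs in that the outer loop meta-learns a preconditioner ("geometry") for the inner update rather than just the initialization:

```python
import numpy as np

def inner_adapt(w, X, y, lr=0.1, steps=3):
    """Inner loop: a few gradient steps on one task's squared-error loss."""
    for _ in range(steps):
        grad = 2 * X * (w * X - y)   # per-example dL/dw for the model w*x
        w = w - lr * grad.mean()
    return w

rng = np.random.default_rng(0)
w_meta = 0.0                          # meta-learned initialization
for _ in range(200):                  # outer loop over sampled tasks
    a = rng.uniform(-2, 2)            # a task: learn the slope a
    X = rng.normal(size=8)
    y = a * X
    w_task = inner_adapt(w_meta, X, y)
    # First-order meta-update: nudge the initialization toward the
    # adapted weights (Reptile-style; schematic only).
    w_meta += 0.1 * (w_task - w_meta)

print(round(w_meta, 3))
```

The point of the sketch is the two nested loops; everything the tweet's paper contributes happens in how the inner update is parameterized.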
  12. Retweeted
    Aug 13, 2019

    Project Euphonia is a speech-to-text transcription model for those with atypical speech. In a new paper, learn how researchers are collaborating with the community to develop Euphonia for those with ALS or other speech impairments.
  13. Retweeted
    Jul 31, 2019

    GENESIS is the first fully probabilistic model for unsupervised image segmentation with amortized inference, developed by , , O. Parker-Jones and myself:
  14. Retweeted
    Aug 1, 2019

    We recently developed a unified service system that allows Population Based Training to be scaled to diverse machine learning applications within Alphabet. We'll be presenting this paper at , August 2019 in Anchorage, Alaska!
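The exploit/explore step at the heart of Population Based Training can be sketched schematically: underperforming population members copy from better ones, then perturb their hyperparameters. This is a toy illustration (a fake objective, hyperparameters only; real PBT also copies model weights), not the Alphabet service the tweet describes:

```python
import random

random.seed(0)
population = [{"lr": random.uniform(1e-4, 1e-1), "score": 0.0}
              for _ in range(8)]

def train_step(member):
    # Stand-in for a real training/eval step: pretend the best
    # learning rate is 1e-2 and score by closeness to it.
    member["score"] = -abs(member["lr"] - 1e-2)

for generation in range(20):
    for m in population:
        train_step(m)
    population.sort(key=lambda m: m["score"], reverse=True)
    # Exploit: bottom members copy a top member's hyperparameters.
    # Explore: perturb the copied value.
    for weak, strong in zip(population[-2:], population[:2]):
        weak["lr"] = strong["lr"] * random.choice([0.8, 1.2])

best = max(population, key=lambda m: m["score"])
print(best["lr"])
```

The appeal of PBT is that hyperparameters are adapted during a single training run, which is why a shared service can serve many unrelated workloads.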
  15. Retweeted
    Aug 5, 2019

    The conventional wisdom that Gaussian processes are slow was broadly true 10 years ago, but not today. In GPyTorch, for example, even an exact GP on >100k points takes only minutes. By contrast, a modern NN on 50k points takes ~8 hours with a good GPU.
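For reference, the exact GP regression computation the tweet refers to is short to write down. The naive numpy version below costs O(n³) in the number of training points; the speed claims in the tweet come from structured solvers (e.g. conjugate gradients on GPU) that avoid this dense solve, not from different math:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between row-vector inputs A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])

noise = 1e-4
K = rbf_kernel(X, X) + noise * np.eye(len(X))
alpha = np.linalg.solve(K, y)          # the O(n^3) step: K^{-1} y

X_test = np.array([[0.5]])
k_star = rbf_kernel(X_test, X)
mean = k_star @ alpha                  # exact GP posterior mean at X_test

print(round(float(mean[0]), 3))        # close to sin(0.5)
```

On 40 points this is instant; the dense solve is what becomes prohibitive at 100k points without the iterative methods the tweet alludes to.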
  16. Retweeted
    Jul 24, 2019

    Looking forward to giving a talk at UAI2019! Thanks to for inviting me, and my wonderful collaborators at . Talk will be about KL regularised RL, multitask learning, meta learning, and neural processes. Probabilistic learning FTW!
  17. Retweeted
    Jul 26, 2019

    Starting fall 2019, I will join the Department of Biomedical Engineering at McGill's Faculty of Medicine. Please mail me (dan??obz??k@gmail.com) for PhD, post-doc, or research engineer positions in my new group in Montreal.
  18. Retweeted
    Jul 23, 2019
    Replying to

    I don't see any Julian Jaynes references...
  19. Retweeted
    Jul 22, 2019
  20. Retweeted
    Jul 17, 2019

    Subspace Inference for Bayesian Deep Learning. In our new paper we construct low dimensional subspaces for scalable Bayesian inference on modern deep nets (with code)! Mode connectivity makes a guest appearance.
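The core trick in the tweet's paper is easy to sketch: instead of doing Bayesian inference over all d network parameters, restrict to an affine subspace theta = theta_hat + P z with z low-dimensional, and infer over z. The sketch below uses a random projection P for illustration only; the paper constructs the subspace from the SGD trajectory:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 10_000, 5                     # full parameter count vs. subspace dim

theta_hat = rng.normal(size=d)       # stands in for a trained net's weights
P = rng.normal(size=(d, k)) / np.sqrt(d)   # basis of the subspace (columns)

def to_full(z):
    """Map a k-dimensional subspace point to full network weights."""
    return theta_hat + P @ z

# Any sampler (MCMC, variational, ...) now only has to explore k=5
# dimensions; each sample is lifted back to the full d weights.
samples = [to_full(rng.normal(size=k)) for _ in range(3)]
print(samples[0].shape)              # → (10000,)
```

All the expensive machinery of posterior inference then runs in k dimensions, which is what makes the approach scale to modern networks.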

