Tweets


  1. Retweeted
    Nov 18, 2019

    RandAugment: Practical automated data augmentation with a reduced search space. Shrinking the search space in a clever way avoids a highly expensive computational search, i.e. NAS→EfficientNets, AutoAugment→RandAugment. Might be useful for domain randomization.

  2. Retweeted
    Dec 13, 2019

    Tomorrow I'll be talking about JAX MD, a hardware-accelerated, end-to-end differentiable molecular dynamics library, at the ML4PS workshop at 9:20am (along with tons of amazing speakers). Paper: Code: Colab:

  3. Dec 1, 2019

    Slides and video of my talk at the Neural Architects workshop at ICCV this year!

  4. Nov 22, 2019

    This is a great description of RandAugment! Thanks so much.

  5. Nov 18, 2019

    *New paper* RandAugment: a new data augmentation method. Better & simpler than AutoAugment. The main idea is to select transformations at random and tune their magnitude. It achieves 85.0% top-1 on ImageNet. Paper: Code:

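The idea described in the tweet above (apply N transformations chosen uniformly at random, all at a shared magnitude M, so only two hyperparameters need tuning) can be sketched in a few lines of NumPy. The transform pool and magnitude scaling below are illustrative stand-ins, not the paper's actual image operations:

```python
import random
import numpy as np

# Illustrative toy transform pool; the real RandAugment uses ~14 image
# ops (rotate, shear, color, posterize, ...). Each op takes (array, magnitude).
TRANSFORMS = {
    "identity":   lambda x, m: x,
    "brightness": lambda x, m: x + 0.1 * m,         # shift pixel values
    "contrast":   lambda x, m: x * (1.0 + 0.05 * m),  # scale pixel values
    "negate":     lambda x, m: -x,                  # magnitude-free op
}

def rand_augment(image, n=2, m=9):
    """Apply n transforms picked uniformly at random, all at magnitude m.

    Only (n, m) are tuned, which is what collapses AutoAugment's large
    policy search space down to a tiny grid search.
    """
    for name in random.choices(list(TRANSFORMS), k=n):
        image = TRANSFORMS[name](image, m)
    return image

img = np.zeros((4, 4))
out = rand_augment(img, n=2, m=9)
print(out.shape)  # augmentation preserves the image shape
```

In the paper's setting, a small grid search over (N, M) on a proxy-free setup replaces AutoAugment's learned per-dataset policies.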
  6. Retweeted
    Jun 29, 2019

    Automatically learned data augmentation policies can train more accurate models using fewer labeled examples, letting you stretch the amount you get from each labeled example. Work by 's , , Golnaz Ghiasi, Tsung-Yi Lin, Jonathon Shlens, &

  7. Retweeted
    Jun 26, 2019

    Data augmentation is even more crucial for detection. We present AutoAugment for object detection, achieving SOTA on COCO validation set (50.7 mAP). Policy transfers to different models & datasets. Paper: , Code: , details in thread.

  8. Retweeted
    Jun 18, 2019

    Nice article in Daily about AutoAugment. Thanks Ralph Anzarouth for the interview! Come see our talk [1-1A] and poster [1-1P-12] if you want to learn more.

  9. Retweeted
    Apr 22, 2019

    Exciting new work on replacing convolutions with self-attention for vision. Our paper shows that full attention is good but loses a few percent in accuracy, and that a middle ground combining convolutions and self-attention is better. Link:

  10. Retweeted
    Apr 22, 2019

    Automatic Speech Recognition (ASR) struggles in the absence of an extensive volume of training data. We present SpecAugment, a new approach to augmenting audio data that treats it as a visual problem rather than an audio one. Learn more at →

  11. Retweeted
    Apr 22, 2019

    Wanted to apply AutoAugment to speech, but a handcrafted augmentation policy already improves SOTA. Idea: randomly drop out certain time & frequency blocks, and warp the input spectrogram. Results: state-of-the-art on LibriSpeech 960h & Switchboard 300h. Link:

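The masking idea in the SpecAugment tweets above (zero out random time and frequency blocks of the input spectrogram) can be sketched with NumPy. The mask counts and widths below are illustrative defaults, not the paper's per-dataset settings, and the time-warping step is omitted:

```python
import numpy as np

def spec_augment(spec, num_masks=1, max_f=8, max_t=16, rng=None):
    """Zero out random frequency bands and time blocks of a spectrogram.

    `spec` has shape (freq_bins, time_steps). Masking forces the model
    not to rely on any single band or time region.
    """
    if rng is None:
        rng = np.random.default_rng()
    spec = spec.copy()
    n_freq, n_time = spec.shape
    for _ in range(num_masks):
        f = rng.integers(0, max_f + 1)         # frequency mask width (may be 0)
        f0 = rng.integers(0, n_freq - f + 1)   # mask start bin
        spec[f0:f0 + f, :] = 0.0
        t = rng.integers(0, max_t + 1)         # time mask width (may be 0)
        t0 = rng.integers(0, n_time - t + 1)   # mask start frame
        spec[:, t0:t0 + t] = 0.0
    return spec

mel = np.ones((80, 100))        # stand-in for an 80-bin mel spectrogram
aug = spec_augment(mel, num_masks=2)
print(aug.shape)                # masking preserves the spectrogram shape
```

Because the policy acts directly on the spectrogram rather than the waveform, it treats augmentation as a visual problem, which is the framing the SpecAugment tweet above describes.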
  12. Retweeted
    Apr 16, 2019

    Latest version of the AutoAugment paper is up: . Stop by our oral presentation at CVPR to learn more! Joint work with Vijay Vasudevan and .

  13. Retweeted
    Nov 2, 2017

    AutoML for large scale image classification and object detection

