Utku

@utkuevci

...keep the change

Montréal, Québec
Joined April 2011

Tweets


  1. Pinned Tweet
    Nov 26, 2019

    End-to-end training of sparse deep neural networks with little-to-no performance loss. Check out our new paper: “Rigging the Lottery: Making All Tickets Winners” (RigL 👇)! 📃 📁 with and

    80% sparse ResNet-50
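    A minimal sketch of the RigL update rule the pinned tweet announces, assuming NumPy; the paper also schedules the drop fraction and the update interval, and the function and variable names here are illustrative, not from the paper's code:

        import numpy as np

        def rigl_update(w, mask, grad, drop_frac=0.3):
            # Number of connections to drop and regrow this step;
            # total sparsity stays constant.
            k = int(drop_frac * mask.sum())

            # Growth candidates: connections inactive *before* the drop,
            # so just-dropped weights are not immediately regrown.
            inactive = np.flatnonzero(mask == 0)

            # Drop: deactivate the k active weights with smallest magnitude.
            active = np.flatnonzero(mask)
            drop = active[np.argsort(np.abs(w.flat[active]))[:k]]
            mask.flat[drop] = 0

            # Grow: activate the k inactive connections whose dense-gradient
            # magnitude is largest; newly grown weights start at zero.
            grow = inactive[np.argsort(-np.abs(grad.flat[inactive]))[:k]]
            mask.flat[grow] = 1
            w.flat[grow] = 0.0

            return w * mask, mask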
  2. Retweeted
    Feb 3

    MIT's new CS class teaches you things that all the other classes don't teach you, like... 🖥️Shell tools and scripting 🖥️Vim 🖥️Data wrangling 🖥️Command-line environment 🖥️Version control Watch all 11 lectures for free here:

  3. Feb 1

    A general overview of compression methods. How can we find such models efficiently? I agree that deciding the right size/architecture for a given dataset is an important problem that we don’t have a generic answer for.

  4. Jan 31

    Check out our checkpoints at various sparsity levels in our GitHub repo!

  5. Jan 31

    How accurate can a ResNet-50 on ImageNet with only 250k of its original 25M weights be🧐? With RigL, accuracy of this 99% sparse network seems to increase with training time almost indefinitely📈! When does it stop?! 1x->53% 🤓 5x->61.8% 😎 10x->63.9% 🤔 40x->66.9% 😵 100x->?

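    For context on the numbers in the tweet above: 250k / 25M = 0.01, so 1% of the weights are kept, i.e. 1 - 0.01 = 99% sparsity; 1x, 5x, 10x, and 40x read as multiples of the standard training length paired with the resulting accuracy.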
  6. Jan 8
  7. Retweeted
    Jan 6

    We distill key components for pre-training representations at scale: BigTransfer ("BiT") achieves SOTA on many benchmarks with ResNet, e.g. 87.8% top-1 on ImageNet (86.4% with only 25 images/class) and 99.3% on CIFAR-10 (97.6% with only 10 images/class).

  8. Jan 4
  9. Retweeted
    Dec 17, 2019

    PSA: the Residency Program call for applications will close on December 19! Check out for more information. Not a PhD in ML? That's fine! We're looking for talent from a wide range of backgrounds.

  10. Dec 17, 2019

    Passing the Atlas Mountains

  11. Retweeted
    Dec 11, 2019

    In case you missed our poster on MixMatch () today because you aren't in Vancouver or didn't survive the poster session stampede, here's the PDF: and here's a transcript of what I said to everyone who came by: ⬇️ 1/11

  12. Retweeted
    Dec 9, 2019

    I'm happy to share my comments on the climate for men from my talk:

  13. Dec 6, 2019
  14. Nov 29, 2019

    That’s how fast codes. – at Café Résonance!

  15. Nov 28, 2019
  16. Retweeted
    Nov 26, 2019
  17. Retweeted
    Nov 25, 2019

    Understanding the generalization of ‘lottery tickets’ in neural networks. An overview by and that discusses papers on the lottery ticket hypothesis. How does “luck” play a role in a network's generalization, and can we maximize “luck”?

  18. Retweeted
    Nov 26, 2019

    "Yüksek bir insan cemiyeti olan Türk milletinin tarihi bir vasfı da, sanatı sevmek ve onda yükselmektir." Mustafa Kemal Atatürk 🏆

  19. Nov 26, 2019

    Sparse neural nets *can* be accelerated. Exciting work!

  20. Retweeted
    Nov 26, 2019

    “Fast Sparse ConvNets”, a collaboration w/ [], implements fast Sparse Matrix-Matrix Multiplication to replace dense 1x1 convolutions in MobileNet architectures. The sparse networks are 66% the size and 1.5-2x faster than their dense equivalents.

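    A minimal sketch of the idea in the tweet above, assuming SciPy: a 1x1 convolution is just a matrix multiply across channels, so sparse weights turn it into a sparse-times-dense multiplication (SpMM). Shapes and the ~90%-sparsity mask here are illustrative, not from the paper:

        import numpy as np
        from scipy.sparse import csr_matrix

        C_in, C_out, H, W = 64, 128, 16, 16
        rng = np.random.default_rng(0)

        # Input feature map flattened to (C_in, H*W); a 1x1 convolution is
        # then simply weights @ X, with weights of shape (C_out, C_in).
        X = rng.standard_normal((C_in, H * W)).astype(np.float32)
        weights = rng.standard_normal((C_out, C_in)).astype(np.float32)

        # Impose ~90% unstructured sparsity on the weights.
        weights *= rng.random((C_out, C_in)) < 0.10

        dense_out = weights @ X               # dense 1x1 convolution
        sparse_out = csr_matrix(weights) @ X  # the same op as an SpMM

        print(np.allclose(dense_out, sparse_out, atol=1e-4))  # True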
  21. Nov 26, 2019

    7) Finally, we investigate the energy landscape of sparse networks. Our results suggest that training with static connectivity converges to bad local minima, while RigL allows us to escape such bad critical points. For more results and discussions: check out the paper!

