Tweets from @ptrblck_de

  1. Pinned Tweet
    Jan 26

    I just posted my 10,000th reply in the discuss forum! Thanks everyone for creating such a great community, for the guidance and mentorship I received, and for starting this journey.

  2. Retweeted
    Feb 3

    Added ImageNet validation results for 164 pretrained models on several datasets, incl. ImageNet-A, ImageNetV2, and ImageNet-Sketch. No surprise, models with exposure to more data do quite well. Without extra data, EfficientNets are holding their own.

  3. Retweeted
    Jan 23

    Following the tradition, I am going to share all the course material for "STAT 453: Introduction to Deep Learning and Generative Models" I am teaching this semester :)

  4. Retweeted
    Jan 15

    v1.4: customizable mobile builds, Distributed Model Parallelism via experimental RPC API, Java Bindings, chaining LR schedulers. Summary: Release Notes: Last release for Python 2 (bye bye!)

  5. Retweeted
    Dec 11, 2019
  6. Retweeted
    Dec 6, 2019

    I have some great Friday news for all and users. Thanks to this merged PR: , you can now (REALLY, THIS TIME!) go from to via zero-copy on . This is using a source build from master with PyTorch and CuPy 7.0.0.

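The zero-copy hand-off mentioned in this tweet goes through the DLPack protocol (in that era's APIs, roughly `torch.utils.dlpack.to_dlpack(...)` consumed by `cupy.fromDlpack(...)`). As a GPU-free sketch of the same mechanism, NumPy also implements DLPack, which lets us show the defining property: the consumer aliases the producer's buffer instead of copying it.

```python
import numpy as np

# DLPack lets two array libraries exchange a tensor by sharing the
# underlying buffer instead of copying it. NumPy speaks the same
# protocol (np.from_dlpack), so the zero-copy behaviour can be
# demonstrated without a GPU.
a = np.arange(4, dtype=np.float32)
b = np.from_dlpack(a)   # consumes a's __dlpack__ capsule; no copy made

a[0] = 42.0
print(b[0])  # 42.0 -- b aliases a's memory, so the write is visible
```

The same aliasing is what makes the PyTorch-to-CuPy path "free": both libraries end up pointing at the same device allocation.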
  7. Retweeted

    Super excited to welcome the PFN team to the community. With Chainer, CuPy, Optuna, MNCore, their innovations need no introduction. The community is going to get even more fun! :)

  8. Retweeted
    Dec 4, 2019

    [News] Preferred Networks (PFN) migrates its DL platform from Chainer to PyTorch. Chainer moves to maintenance support. PFN jointly works with Facebook and the OSS community to develop PyTorch. For more information, please look at the news release:

  9. Retweeted
    Nov 26, 2019

    Introducing the SHA-RNN :) - Read alternative history as a research genre - Learn of the terrifying tokenization attack that leaves language models perplexed - Get near SotA results on enwik8 in hours on a lone GPU No Sesame Street or Transformers allowed.

    The SHA-RNN is composed of an RNN, pointer based attention, and a “Boom” feed-forward with a sprinkling of layer normalization. The persistent state is the RNN’s hidden state h as well as the memory M concatenated from previous memories. Bake at 200°F for 16 to 20 hours in a desktop sized oven.
    The attention mechanism within the SHA-RNN is highly computationally efficient. The only matrix multiplication acts on the query. The A block represents scaled dot product attention, a vector-vector operation. The operators {qs, ks, vs} are vector-vector multiplications and thus have minimal overhead. We use a sigmoid to produce {qs, ks}. For vs see Section 6.4.
    Bits Per Character (BPC) on enwik8. The single attention SHA-LSTM has an attention head on the second last layer and had batch size 16 due to lower memory use. Directly comparing the head count for LSTM models and Transformer models obviously doesn’t make sense but neither does comparing zero-headed LSTMs against bajillion headed models and then declaring an entire species dead.
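The attention description above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's code: the function shape and the choice to use the raw memory as values (omitting the `vs` gate, which the caption defers to Section 6.4) are assumptions. The point it demonstrates is the stated cost profile: the only matrix multiply acts on the query, while `qs` and `ks` are sigmoid-gated vector-vector operations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sha_attention(h, memory, W_q, qs, ks):
    """Single-head attention in the SHA-RNN style (sketch).

    h:      (d,)   current hidden state
    memory: (T, d) concatenated past states M
    W_q:    (d, d) the ONLY weight matrix -- applied to the query
    qs, ks: (d,)   gate parameters, applied elementwise (cheap)
    """
    q = sigmoid(qs) * (h @ W_q)                # one matmul, then a vector gate
    k = sigmoid(ks) * memory                   # keys: elementwise gate only
    scores = softmax(k @ q / np.sqrt(h.size))  # scaled dot product, shape (T,)
    return scores @ memory                     # weighted sum over memory

rng = np.random.default_rng(0)
d, T = 8, 5
out = sha_attention(rng.standard_normal(d), rng.standard_normal((T, d)),
                    rng.standard_normal((d, d)),
                    rng.standard_normal(d), rng.standard_normal(d))
print(out.shape)  # (8,)
```

Everything except `h @ W_q` is O(T·d) elementwise work, which is why the head is cheap compared to a Transformer block with separate key/value projections.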
  10. Retweeted
    Nov 25, 2019

    We just released the paper and code for Mellotron: a multispeaker voice synthesis model that can make a voice emote and sing without emotive or singing training data.

  11. Retweeted
    Nov 25, 2019

    And if you're a manager looking to support some of the best deep learning engineers in the world, we should talk about opportunities with the PyTorch team at .

  12. Retweeted
    Nov 22, 2019

    Thomas has committed a number of amazing changes to PyTorch, and has helped us immensely over the years. If you’re using PyTorch please consider either supporting his work or (if you’d like something more in return) hire him for a workshop or a feature you’d like to speed up!

  13. Retweeted
    Nov 22, 2019

    With my specialist PyTorch and ML training and consulting business picking up (yay, 🚀), I'm wondering whether hacking on PyTorch should be a pastime or a part of the work week. Maybe crowdfunding can be part of the answer, so here is what I'm up to:

  14. Nov 21, 2019
  15. Retweeted
    Nov 20, 2019

    Example of Simpson's paradox. S: success
    P(S | A) = 78%, P(S | B) = 83%. Which treatment do you want?
    s: small, L: LARGE
    P(S | A, s) > P(S | B, s)
    P(S | A, L) > P(S | B, L)
    Now, which one do you want? Funny but not funny. This happened for real.

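The quoted rates (78% vs 83%) match the classic kidney-stone treatment study, the standard worked example of Simpson's paradox; assuming those counts (an assumption, since the tweet names no dataset), a few lines verify both of the tweet's claims:

```python
# Kidney-stone counts whose totals reproduce the tweet's 78% / 83%.
# (successes, trials) per treatment and stone size.
data = {
    ("A", "small"): (81, 87),
    ("A", "LARGE"): (192, 263),
    ("B", "small"): (234, 270),
    ("B", "LARGE"): (55, 80),
}

def rate(successes, trials):
    return successes / trials

# Aggregated over stone sizes, treatment B looks better ...
overall = {}
for t in ("A", "B"):
    s = sum(data[(t, g)][0] for g in ("small", "LARGE"))
    n = sum(data[(t, g)][1] for g in ("small", "LARGE"))
    overall[t] = rate(s, n)
print(f"P(S|A) = {overall['A']:.0%}, P(S|B) = {overall['B']:.0%}")  # 78%, 83%

# ... yet A wins within EVERY subgroup. That reversal is the paradox:
# stone size confounds the comparison (A was given the harder cases).
for g in ("small", "LARGE"):
    assert rate(*data[("A", g)]) > rate(*data[("B", g)])
assert overall["B"] > overall["A"]
```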
  16. Retweeted
    Nov 15, 2019

    When I published my PyTorch vs TensorFlow article, some people raised questions about whether it applied to non-NLP conferences. With NeurIPS posting all their papers, the answer is clear! PyTorch: 68 -> 166 papers. TensorFlow: 91 -> 74 papers.

  17. Retweeted
    Nov 15, 2019

    Very impressive results: Momentum Contrast for Unsupervised Visual Representation Learning by Kaiming He et al. Better than supervised ImageNet pre-training for transfer learning on object detection (e.g. COCO).

  18. Retweeted
    Nov 13, 2019

    Research efforts in computer vision and are on the rise. To accelerate 3D research, NVIDIA releases Kaolin as a PyTorch library. See how researchers use Kaolin to move 3D models into the realm of neural networks.

  19. Retweeted
    Nov 7, 2019

    My graduation is approaching rapidly, and that means that I’ll be able to work full-time soon. If you’re looking for someone excited about building next generation tools and infra for scientific computing (and ML) then let's talk! Note: remote and Europe only 🚀

  20. Oct 26, 2019

    Sadly, I cannot join this meetup, but if you are in Munich or in the area, you should definitely join! The talks by and sound really interesting!

