Tweets


  1. Jan 30

    Welcome OpenAI to the PyTorch community!

  2. Retweeted
    Jan 28

    I'm happy to announce our latest work on self-supervised learning for . PASE+ is based on a multi-task approach useful for recognition. It will be presented at . paper: code: @Mila

  3. Jan 28

    Check out a fresh take on frontends from the creators of

  4. Retweeted
    Jan 28

    We are proud to announce the release of Catalyst v20.01.3, a DL/RL framework:
    - core architecture redesign
    - improved registry
    - Albumentations and SMP support
    - MultiPhaseRunner and GanRunner
    Working hard on Catalyst.RL 2.0 🚀

  5. Retweeted
    Jan 27

    Kornia v0.2.0 is out! We have introduced a new data augmentation module with strong GPU support, extended the set of color conversion algorithms, added GPU CI tests against v1.4.0, and much more. Happy coding!

  6. Jan 25

    Thank you for posting 10,000 replies on the forums and helping many hundreds of us!

  7. Retweeted
    Jan 22

    code showing the generic learning setup and reproducing simple experiments is now available! Code: Project Page: Paper:

  8. Retweeted
    Jan 22

    OpenMined + collaborating to advance open source software development. Learn about these talented teams on our blog: Many opportunities ahead. Join our Slack community to find out more!

  9. Retweeted
    Jan 17

    Pyro 1.2 release adds poutine.reparam to rewrite models to improve geometry via:
    - neural transport
    - discrete cosine transform
    - auxiliary variable methods for Lévy stable distributions
    - conditional Gaussian HMMs
    - decentering
    - transform unwrapping

  10. Retweeted
    Jan 17

    To make my research more reproducible, extensible and comparable to that of others & out of need to homogenize the language we use to express nn pruning methods, I contributed `nn.utils.prune` to 1.4 (see highlights ) Try it out & build on it! 🔥

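The pruning utility mentioned above can be exercised in a few lines. This is a minimal sketch (the toy `Linear` layer and the 50% ratio are illustrative choices, not from the tweet): `l1_unstructured` zeroes the smallest-magnitude weights and stores the original tensor as `weight_orig` alongside a `weight_mask` buffer.

```python
import torch
import torch.nn.utils.prune as prune

# Toy layer; prune 50% of its weights by L1 magnitude.
layer = torch.nn.Linear(8, 4)
prune.l1_unstructured(layer, name="weight", amount=0.5)

# After pruning, layer.weight is recomputed as weight_orig * weight_mask,
# so half of its entries are exactly zero.
sparsity = float((layer.weight == 0).sum()) / layer.weight.numel()
```

Calling `prune.remove(layer, "weight")` afterwards makes the pruning permanent by folding the mask into the weight tensor.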
  11. Jan 15

    torchtext v0.5: improved data API, unsupervised text tokenization
    - bindings for SentencePiece
    - new: enwiki9; revisions to PennTreebank, WikiText103, WikiText2, IMDb (looking for feedback)
    Release Notes:

  12. Jan 15

    torchaudio v0.4: more transforms, datasets, backend support
    - LibriSpeech and Common Voice loaders
    - filters (biquad), batched / jittable transforms (MFCC, gain, dither), more augmentation
    - interactive speech recognition demo with voice detection

  13. Jan 15

    torchvision v0.5: quantization, production
    - ResNets, MobileNet, ShuffleNet, GoogLeNet and InceptionV3 now have quantized counterparts with pre-trained models and scripts for quantization-aware training
    - all models are TorchScript-ready and ONNX-ready

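To avoid downloading torchvision's pre-trained quantized weights, the same idea can be illustrated with core PyTorch's dynamic quantization API (this sketch is not torchvision code; the tiny `Sequential` model is a stand-in for a real network): eligible layers are swapped for int8 counterparts while the rest of the model runs in float.

```python
import torch

# A small float model; torchvision's quantized ResNets apply the same
# principle at scale with static, pre-calibrated quantization.
model = torch.nn.Sequential(torch.nn.Linear(16, 8), torch.nn.ReLU())
qmodel = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

# The quantized model keeps the same forward interface.
out = qmodel(torch.randn(1, 16))
```

Dynamic quantization quantizes weights ahead of time and activations on the fly, which shrinks the model and speeds up inference on CPU.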
  14. Jan 15

    First release for Python 3.8: binaries for the entire {pytorch, torchvision, torchaudio} × Python 3.8 matrix of configurations will be live by Jan 23rd. Some configurations are live already.

  15. Jan 15

    v1.4: customizable mobile builds, distributed model parallelism via the experimental RPC API, Java bindings, chaining of LR schedulers
    Summary: Release Notes:
    Last release for Python 2 (bye bye!)

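The LR-scheduler chaining mentioned in the v1.4 notes means two schedulers can drive the same optimizer and compose multiplicatively when stepped back to back. A minimal sketch (the toy model, `SGD`, and the gamma values are illustrative assumptions):

```python
import torch

model = torch.nn.Linear(2, 1)
opt = torch.optim.SGD(model.parameters(), lr=1.0)

# Two schedulers on one optimizer; since v1.4 each one applies its
# factor to the optimizer's current lr, so their effects multiply.
halver = torch.optim.lr_scheduler.StepLR(opt, step_size=1, gamma=0.5)
decay = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=0.9)

for _ in range(2):
    opt.step()
    halver.step()
    decay.step()

# After two epochs: lr = 1.0 * (0.5 * 0.9) ** 2 = 0.2025
```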
  16. Retweeted
    Jan 15

    PyTorch Metric Learning now available on Anaconda! Installation: "conda install pytorch-metric-learning -c metric-learning" View more details here:

  17. Retweeted
    Jan 13

    [News] Preferred Networks releases Optuna v1.0, the first major version of the open-source hyperparameter optimization framework for machine learning. Optimize your optimization.

  18. Jan 14

    Learn how to automate most of the infrastructure work required to deploy PyTorch models in production using Cortex, an open source tool for deploying models as APIs on AWS.

  19. Retweeted
    Jan 13

    I published a new article on the blog: Active Transfer Learning with PyTorch. Read about adapting machine learning models with the knowledge that some data points will later get correct human labels, even if the model doesn't yet know the labels:

  20. Retweeted
    Jan 10

    Norse exploits the advantages of bio-inspired neural components, which are sparse and event-driven, a fundamental difference from artificial - based on

