Tweets

  1. Retweeted
    Feb 3

    The 2.4.0 release of transformers is **𝐌𝐀𝐒𝐒𝐈𝐕𝐄** thanks to our amazing community of contributors. 🔥

  2. Retweeted
    Jan 16

    As you may know, we just raised our series A. It wouldn't have been possible without the help and contribution of the fantastic community that we serve. So join us next week to celebrate in NYC around craft beers & classic video games! 🤗🤗🤗

  3. Retweeted
    Jan 10

    Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings ➡️ model inputs. Welcome 🤗Tokenizers: ultra-fast & versatile tokenization: encode 1GB in 20 sec; BPE, byte-level BPE, WordPiece, SentencePiece; Python/JS/Rust bindings...

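    A minimal sketch of what training one of these tokenizers looks like with the Python bindings (the corpus path and hyperparameters below are hypothetical, not from the tweet):

    ```python
    from tokenizers import ByteLevelBPETokenizer

    # Train a byte-level BPE tokenizer from scratch on a plain-text corpus.
    tokenizer = ByteLevelBPETokenizer()
    tokenizer.train(files=["corpus.txt"], vocab_size=30_000, min_frequency=2)

    # strings ➡️ model inputs
    encoding = tokenizer.encode("Hello, world!")
    print(encoding.tokens)  # subword strings
    print(encoding.ids)     # integer ids to feed a model
    ```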
  4. Retweeted
    Jan 8

    I prepared a new notebook for my Deep Learning class: Joint Intent Classification and Slot Filling with BERT: This is a step-by-step tutorial to build a simple Natural Language Understanding system using the voice assistant dataset (English only).

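    A rough sketch of the joint architecture such a tutorial typically builds, not the notebook's actual code (class and head names here are made up): one BERT encoder feeding a sentence-level intent head and a token-level slot head.

    ```python
    from torch import nn
    from transformers import BertModel

    class JointIntentSlotModel(nn.Module):
        """One BERT encoder, two heads: sentence-level intent, token-level slots."""

        def __init__(self, num_intents, num_slots, model_name="bert-base-cased"):
            super().__init__()
            self.bert = BertModel.from_pretrained(model_name)
            hidden = self.bert.config.hidden_size
            self.intent_head = nn.Linear(hidden, num_intents)  # classify the utterance
            self.slot_head = nn.Linear(hidden, num_slots)      # tag each token

        def forward(self, input_ids, attention_mask):
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            intent_logits = self.intent_head(out.pooler_output)    # [batch, num_intents]
            slot_logits = self.slot_head(out.last_hidden_state)    # [batch, seq_len, num_slots]
            return intent_logits, slot_logits
    ```

    Both heads are trained jointly, typically with the sum of a cross-entropy loss per head.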
  5. Retweeted
    Dec 20, 2019

    We spend our time fine-tuning models on tasks like text classification, NER, or question answering. Yet 🤗Transformers had no simple way to let users try these fine-tuned models. Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model to ingest/output human-readable data.

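    The pipeline API in question is a one-liner; a minimal example (the default model behind each task varies by library version):

    ```python
    from transformers import pipeline

    # A thin wrapper: tokenizer + fine-tuned model behind one call.
    classifier = pipeline("sentiment-analysis")
    print(classifier("We are very happy to show you the 🤗 Transformers library."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
    ```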
  6. Dec 19, 2019

    And if you're interested in the how-to of building a text generation app on Android, you can check the associated article! ✍️

  7. Dec 19, 2019

    Expanding our TensorFlow Lite models with the release of multiple GPT-2 / DistilGPT2 versions, including FP16 and 8-bit quantized ones. They come with a little demo, check it out! 👇

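    For reference, the full-precision DistilGPT2 these Lite versions derive from can be tried in a few lines; a hedged sketch (prompt and sampling parameters are arbitrary):

    ```python
    from transformers import GPT2Tokenizer, TFGPT2LMHeadModel

    tokenizer = GPT2Tokenizer.from_pretrained("distilgpt2")
    model = TFGPT2LMHeadModel.from_pretrained("distilgpt2")

    inputs = tokenizer("The future of on-device ML is", return_tensors="tf")
    outputs = model.generate(inputs["input_ids"], max_length=40, do_sample=True, top_k=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```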
  8. Retweeted
    Dec 18, 2019
  9. Retweeted

    Hugging Face raises $15 million to build the definitive natural language processing library

  10. Retweeted
    Dec 16, 2019

    🔥 New in v2.2.2: you can now upload and share your models with the community directly from the library, using our CLI 🔥
    1. Join here:
    2. Use the CLI to upload:
    3. Model is accessible to anyone using the `username/model_name` id 🎉

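    Once uploaded, anyone can pull such a model by its id; a small sketch (the `username/model_name` id is a placeholder, and the shell commands in the comments reflect the v2.x-era transformers-cli):

    ```python
    from transformers import AutoModel, AutoTokenizer

    # Upload side (shell, v2.x-era CLI):
    #   transformers-cli login
    #   transformers-cli upload ./my_model_dir
    # Download side: anyone can now load the shared weights by id.
    tokenizer = AutoTokenizer.from_pretrained("username/model_name")  # placeholder id
    model = AutoModel.from_pretrained("username/model_name")
    ```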
  11. Dec 11, 2019

    Both quantized versions make use of the latest developments in TFLite, using the experimental MLIR-based converter and combining native TensorFlow operators with TFLite-optimized ones when supported. Thanks to the TF and TFLite teams for their amazing work! 👏 [4/4]

  12. Dec 11, 2019

    Let me know if you would be interested in a more detailed benchmark of these models on Android devices, including recent devices with GPU/neural-chip acceleration (cc Santa 🎅👋) [3/4]

  13. Dec 11, 2019

    The 131MB model is the result of FP16 quantization and the 64MB one of 8-bit quantization (both post-training). Our tests on Galaxy S8 and Nexus 5X show that, while the FP16 model is bigger, it is also faster than the 8-bit one (size vs. performance dilemma 🤯) [2/4]

  14. Dec 11, 2019

    Distillation ✕ Quantization = 🚀 We're releasing 2 quantized versions of DistilBERT fine-tuned on SQuAD using TensorFlow Lite, resulting in model sizes of 131MB and 64MB, respectively 2x and 4x smaller than the non-quantized version! 🗜️🗜️🗜️🗜️ [1/4]

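    A sketch of the two post-training quantization modes this thread describes, using the standard TFLiteConverter API (the SavedModel export path is hypothetical):

    ```python
    import tensorflow as tf

    SAVED_MODEL_DIR = "distilbert_squad_saved_model"  # hypothetical export path

    # FP16 post-training quantization (the ~131MB variant).
    converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_DIR)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.target_spec.supported_types = [tf.float16]
    open("distilbert_fp16.tflite", "wb").write(converter.convert())

    # 8-bit dynamic-range quantization (the ~64MB variant).
    converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_DIR)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    open("distilbert_8bit.tflite", "wb").write(converter.convert())
    ```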
  15. Retweeted
    Dec 9, 2019

    I heard some of you found the tokenizers too slow. I think you are going to love what we are cooking for you

  16. Retweeted

    Encoder 🦄🤝🦄 decoders are now part of the 🤗 transformers library! I wrote a tutorial to explain how we got there and how to use them 👉 Bonus: a sneak peek into upcoming features ✨

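    Instantiating such an encoder-decoder pair from two pretrained checkpoints looks roughly like this (checkpoint choice is illustrative):

    ```python
    from transformers import BertTokenizer, EncoderDecoderModel

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    # Warm-start a seq2seq model from two pretrained BERT checkpoints.
    model = EncoderDecoderModel.from_encoder_decoder_pretrained(
        "bert-base-uncased", "bert-base-uncased"
    )
    # The cross-attention weights are freshly initialized, so fine-tune before use.
    ```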
  17. Nov 19, 2019

    The demo app is forked from the repo, so hat tip to the TensorFlow team for their work! 👏

  18. Nov 19, 2019

    Modern NLP is ready for on-device! We just published a demo of question answering using SQuAD 1.1, powered by DistilBERT, our 66M-parameter model, thanks to TensorFlow Lite!

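    The same question-answering setup can be tried server-side in a couple of lines; the model id below is the distilled SQuAD checkpoint, a reasonable guess for what backs this demo:

    ```python
    from transformers import pipeline

    qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
    result = qa(
        question="How many parameters does DistilBERT have?",
        context="DistilBERT is a distilled version of BERT with 66 million parameters.",
    )
    print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': '66 million'}
    ```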
  19. Retweeted

    State-of-the-Art Natural Language Processing in TensorFlow 2.0 Discover how you can use the Transformers library with TensorFlow to fine-tune a Transformer model. Read the blog ↓

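    In outline, the fine-tuning recipe such a blog post walks through boils down to treating the model as a Keras model (toy data below; hyperparameters are arbitrary):

    ```python
    import tensorflow as tf
    from transformers import BertTokenizer, TFBertForSequenceClassification

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = TFBertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

    # Toy data; replace with a real dataset.
    texts = ["great movie!", "terrible plot."]
    labels = [1, 0]
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
    model.fit(dict(enc), tf.constant(labels), epochs=3, batch_size=2)
    ```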
  20. Retweeted
    Oct 10, 2019

    With 180+ papers mentioning 🤗 Transformers and its predecessors, it was high time to put out a real paper that people could cite. 🥳 🎉

