-
Pierric Cistac Retweeted
The 2.4.0 release of transformers is **MASSIVE** thanks to our amazing community of contributors.
https://github.com/huggingface/transformers/releases/tag/v2.4.0
Show this thread
-
Pierric Cistac Retweeted
As you may know, we just raised our series A. It wouldn't have been possible without the help and contribution of the fantastic
#NLProc community that we serve. So join us next week to celebrate in NYC around craft beers & classic video games!

https://www.eventbrite.com/e/hugging-face-nlp-community-party-registration-86625931493
pic.twitter.com/8RCnuJ3kyd
-
Pierric Cistac Retweeted
Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: turning strings into model inputs.
Welcome Tokenizers: ultra-fast & versatile tokenization led by @moi_anthony:
- encode 1GB in 20sec
- BPE / byte-level BPE / WordPiece / SentencePiece...
- python / js / rust...
pic.twitter.com/1TfJ1Hm1xx
Show this thread
-
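The library itself is written in Rust with Python/JS bindings, so its speed doesn't come from Python. But as a rough illustration of what BPE training does under the hood, here is a minimal pure-Python sketch (a toy, not the Tokenizers API): repeatedly count adjacent symbol pairs and merge the most frequent one.

```python
from collections import Counter

def train_bpe(words, num_merges):
    """Toy BPE trainer: learn merge rules from a list of words."""
    # Represent each word as a tuple of symbols, starting from characters.
    vocab = Counter(tuple(w) for w in words)
    merges = []
    for _ in range(num_merges):
        # Count how often each adjacent symbol pair occurs, weighted by word frequency.
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge: replace each occurrence of the pair with one fused symbol.
        new_vocab = Counter()
        for word, freq in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges

merges = train_bpe(["low", "lower", "lowest"] * 10, num_merges=2)
print(merges)  # [('l', 'o'), ('lo', 'w')]: frequent subwords get fused first
```

The real library does this (and WordPiece, SentencePiece, byte-level variants) over gigabytes of text in compiled Rust, which is where the "1GB in 20sec" figure comes from.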
Pierric Cistac Retweeted
I prepared a new notebook for my Deep Learning class: Joint Intent Classification and Slot Filling with BERT: https://nbviewer.jupyter.org/github/m2dsupsdlclass/lectures-labs/blob/master/labs/06_deep_nlp/Transformers_Joint_Intent_Classification_Slot_Filling_rendered.ipynb
This is a step-by-step tutorial to build a simple Natural Language Understanding system using the @snips voice assistant dataset (English only).
Show this thread
-
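For readers unfamiliar with the task: "joint" means the model predicts one intent label for the whole utterance plus one BIO slot tag per token. A minimal sketch of what those targets look like (the labels below are hypothetical, merely in the style of the Snips dataset):

```python
# Toy illustration of targets for joint intent classification + slot filling.
# Labels are hypothetical examples, not taken from the actual dataset.
utterance = ["play", "some", "jazz", "on", "spotify"]

# One label for the whole utterance...
intent = "PlayMusic"

# ...and one BIO tag per token: B- opens a slot, I- continues it, O is outside any slot.
slots = ["O", "O", "B-genre", "O", "B-service"]

def extract_slots(tokens, tags):
    """Group BIO tags back into (slot_name, text) spans."""
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            current = (tag[2:], [tok])
            spans.append(current)
        elif tag.startswith("I-") and current is not None:
            current[1].append(tok)
        else:
            current = None
    return [(name, " ".join(words)) for name, words in spans]

print(intent, extract_slots(utterance, slots))
# PlayMusic [('genre', 'jazz'), ('service', 'spotify')]
```

In the notebook, BERT produces a sequence of token representations; a classifier head on the [CLS] position predicts the intent while a per-token head predicts the slot tags, and both losses are trained jointly.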
Pierric Cistac Retweeted
We spend our time fine-tuning models on tasks like text classification, NER or question answering. Yet Transformers had no simple way to let users try these fine-tuned models.
Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model to ingest/output human-readable data.
pic.twitter.com/ZcPTXOJsuS
Show this thread
-
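The wrapper pattern the tweet describes is simple to sketch. This is not the actual transformers API (there, you would call `pipeline("sentiment-analysis")`); the components below are stand-in dummies, purely to show how a thin tokenizer + model wrapper turns raw text into a human-readable result:

```python
# Minimal sketch of the Pipeline idea: chain preprocessing (tokenizer),
# the model, and postprocessing so the user only deals with plain text
# in and a readable dict out. The components are dummies, not the real
# transformers objects.

class DummyTokenizer:
    def __call__(self, text):
        return text.lower().split()  # a real pipeline would produce tensors

class DummySentimentModel:
    def __call__(self, tokens):
        # Stand-in for a fine-tuned classifier: a tiny keyword score.
        score = sum(t in {"great", "love", "good"} for t in tokens)
        score -= sum(t in {"bad", "awful", "hate"} for t in tokens)
        return score

class Pipeline:
    def __init__(self, tokenizer, model):
        self.tokenizer = tokenizer
        self.model = model

    def __call__(self, text):
        tokens = self.tokenizer(text)          # ingest human-readable data
        raw = self.model(tokens)               # run the model
        label = "POSITIVE" if raw >= 0 else "NEGATIVE"
        return {"label": label, "score": raw}  # output human-readable data

nlp = Pipeline(DummyTokenizer(), DummySentimentModel())
print(nlp("I love this great library"))  # {'label': 'POSITIVE', 'score': 2}
```

The design point is that the user never touches token ids or logits; the pipeline hides both ends of the model.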
And if you're interested in the how-to of building a text generation app on Android, you can check the associated @TDataScience article!
https://towardsdatascience.com/on-device-machine-learning-text-generation-on-android-6ad940c00911
Show this thread
-
Expanding our @TensorFlow Lite models w/ the release of multiple GPT-2 (cc @OpenAI) / DistilGPT2 versions, including FP16 and 8-bit quantized ones. They come with a little @Android demo built w/ @kotlin, check it out!
https://github.com/huggingface/tflite-android-transformers/tree/master/gpt2
pic.twitter.com/oG1cgBnkDn
Show this thread
-
Pierric Cistac Retweeted
-
Pierric Cistac Retweeted
Hugging Face raises $15 million to build the definitive natural language processing library https://tcrn.ch/2rVocLZ by
@romaindillet
pic.twitter.com/SMjtiDino1
-
Pierric Cistac Retweeted
New in v2.2.2: you can now upload and share your models with the community directly from the library, using our CLI
1. Join here: https://huggingface.co/join
2. Use the CLI to upload: https://github.com/huggingface/transformers#Quick-tour-of-model-sharing
3. Model is accessible to anyone using the `username/model_name` id
pic.twitter.com/ZdVDeOMmQt
Show this thread
-
Both quantized versions make use of the latest developments in TFLite, using the experimental MLIR-based converter and combining native TensorFlow operators with TFLite-optimized ones when supported. Thanks to the TF and TFLite teams for their amazing work!
[4/4]
Show this thread
-
Let me know if you would be interested in a more detailed benchmark of these models' performance on Android devices, including recent devices with GPU/neural-chip acceleration (cc @huggingface Santa) [3/4]
Show this thread
-
The 131MB model is the result of FP16 quantization and the 64MB one of 8-bit precision quantization (both post-training). Our tests on Galaxy S8 and Nexus 5X show that, while the FP16 model is bigger, it is also faster than the 8-bit one (size vs. perf dilemma) [2/4]
Show this thread
-
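The sizes quoted in this thread follow almost directly from the arithmetic of post-training quantization: storage scales with bits per weight. A back-of-the-envelope check, assuming DistilBERT's roughly 66M parameters and ignoring the (small) overhead of graph structure in the TFLite file:

```python
# Rough size check: post-training quantization mostly changes the bits
# stored per weight, so model size scales with numeric precision.
# 66M is DistilBERT's approximate parameter count (an assumption here).
params = 66_000_000

def model_size_mb(num_params, bits_per_weight):
    return num_params * bits_per_weight / 8 / 1e6  # bits -> bytes -> MB

fp32 = model_size_mb(params, 32)  # non-quantized baseline
fp16 = model_size_mb(params, 16)  # FP16 post-training quantization
int8 = model_size_mb(params, 8)   # 8-bit post-training quantization

print(f"fp32 ~{fp32:.0f}MB, fp16 ~{fp16:.0f}MB, int8 ~{int8:.0f}MB")
print(f"reduction: {fp32 / fp16:.0f}x and {fp32 / int8:.0f}x")
# fp16 halves and int8 quarters the fp32 size, roughly matching
# the 131MB / 64MB figures in the thread.
```

The speed observation (FP16 faster than 8-bit despite being larger) is an empirical result of the TFLite kernels on those particular phones, not something this arithmetic predicts.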
Distillation ✕ Quantization =
We're releasing 2 quantized versions of DistilBERT fine-tuned on SQuAD using @TensorFlow Lite, resulting in model sizes of 131MB and 64MB. That's respectively 2x and 4x less than the non-quantized version!
[1/4]
https://github.com/huggingface/tflite-android-transformers/
pic.twitter.com/maaQxxPR87
Show this thread
-
Pierric Cistac Retweeted
I heard some of you found the tokenizers too slow. I think you are going to love what we are cooking for you
@huggingface
pic.twitter.com/NPvxZqlO6N
-
Pierric Cistac Retweeted
Encoder-decoders are now part of the transformers library!
I wrote a tutorial to explain how we got there and how to use them
https://link.medium.com/RTvKeSqo71
Bonus: a sneak peek into upcoming features
pic.twitter.com/8q1VtOMeIm
-
The demo app is forked from the https://github.com/tensorflow/examples repo, so hat tip to @Tian_Lin, @SarahSirajuddin, @rajatmonga and all the TensorFlow team for their work!
Show this thread
-
Modern #NLP is ready for on-device! We just published a demo of question answering on SQuAD 1.1 running on @Android and powered by DistilBERT, our 66M-parameter model, thanks to @tensorflow Lite!
https://github.com/huggingface/tflite-android-transformers
pic.twitter.com/GhTGS0crWn
Show this thread
-
Pierric Cistac Retweeted
State-of-the-Art Natural Language Processing in TensorFlow 2.0. Discover how you can use the @HuggingFace Transformers library with TensorFlow to fine-tune a Transformer model. Read the blog ↓
https://medium.com/tensorflow/using-tensorflow-2-for-state-of-the-art-natural-language-processing-102445cda54a
-
Pierric Cistac Retweeted
With 180+ papers mentioning Transformers and its predecessors, it was high time to put out a real paper that people could cite.
https://arxiv.org/abs/1910.03771
With @LysandreJik, @SanhEstPasMoi, @julien_c, @ClementDelangue, @moi_anthony, @pierrci, @remilouf, @MorganFunto, @jamieabrew
pic.twitter.com/oJT9lbLbyg