Tweets
Anthony MOI retweeted
Guest blog post from @huggingface angel investor @madlag on sparse tensors in neural nets. I am VERY excited by what's going to happen in sparsity-land this year.
https://medium.com/huggingface/is-the-future-of-neural-networks-sparse-an-introduction-1-n-d03923ecbd70
Anthony MOI retweeted
This is a game changer to me. Every NLP researcher or practitioner understands the pain of tokenizing and processing texts differently for different modern NLP architectures. Hugging Face now offers a streamlined process to integrate tokenizers and processors with their amazing models! https://twitter.com/julien_c/status/1216768092484907012
Anthony MOI retweeted
Introducing Tokenizers: ultra-fast, extensible tokenization for state-of-the-art NLP
https://github.com/huggingface/tokenizers
pic.twitter.com/M8eT59A3gg
Anthony MOI retweeted
Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings → model inputs.
Welcome Tokenizers: ultra-fast & versatile tokenization led by @moi_anthony:
- encode 1GB in 20sec
- BPE/byte-level-BPE/WordPiece/SentencePiece...
- python/js/rust...
pic.twitter.com/1TfJ1Hm1xx
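The tokenization being sped up here is subword merging in the BPE family. As a rough illustration of what one BPE merge step does (a toy stdlib sketch with an invented corpus, nothing like the library's actual Rust implementation):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a corpus of (symbols -> frequency)."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with the merged symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: each word pre-split into characters, mapped to its frequency.
corpus = {tuple("lower"): 5, tuple("lowest"): 2, tuple("newer"): 6}
pair = most_frequent_pair(corpus)   # ('w', 'e'): 5 + 2 + 6 = 13 occurrences
corpus = merge_pair(corpus, pair)   # "lower" becomes ('l', 'o', 'we', 'r')
```

Repeating this loop until a target vocabulary size is reached yields the learned merge table; the speed claims in the tweet come from doing exactly this kind of work in compiled Rust rather than Python.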
Anthony MOI retweeted
Working on a karaoke program that replaces lyrics with matching lines from any source text. Here's Journey + The X-Files:
pic.twitter.com/VHj1e5QKe8
Anthony MOI retweeted
If you're using Transformers from source, we've rolled out 2 nice beta features (TBR in January):
- Ultra-fast Bert/GPT2 tokenizers (up to 80x faster)
- Easy/versatile sequence generation for generative models: top-k/nucleus/temperature sampling, penalized/greedy, beam search...
pic.twitter.com/KNAmDbQPk3
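Two of the sampling strategies named above, top-k filtering and temperature, can be illustrated with a stdlib-only sketch (toy logits and function names of my own invention, not the Transformers generation API):

```python
import math
import random

def sample_top_k(logits, k, temperature=1.0, rng=random):
    """Keep the k highest logits, apply temperature, sample from the softmax."""
    top = sorted(logits.items(), key=lambda kv: kv[1], reverse=True)[:k]
    scaled = [(tok, logit / temperature) for tok, logit in top]
    m = max(s for _, s in scaled)                     # subtract max for stability
    weights = [math.exp(s - m) for _, s in scaled]
    tokens = [tok for tok, _ in scaled]
    return rng.choices(tokens, weights=weights, k=1)[0]

# Toy next-token logits.
logits = {"the": 5.2, "a": 4.9, "cat": 1.3, "zebra": -2.0}
token = sample_top_k(logits, k=2, temperature=0.7)
# Only "the" or "a" can ever be drawn; lowering the temperature
# sharpens the distribution toward the top-scoring token.
```

Nucleus (top-p) sampling is the same idea except the cutoff is the smallest set of tokens whose softmax mass exceeds p, rather than a fixed count k.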
Anthony MOI retweeted
We spend our time finetuning models on tasks like text classification, NER, or question answering. Yet Transformers had no simple way to let users try these fine-tuned models.
Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model to ingest/output human-readable data.
pic.twitter.com/ZcPTXOJsuS
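The "thin wrapper" idea (raw text in, human-readable label out) can be sketched in a few lines; the tokenizer, model, and labels below are toy stand-ins I made up, not the real Transformers objects:

```python
class TextClassificationPipeline:
    """Thin wrapper: raw string in, human-readable prediction out."""

    def __init__(self, tokenizer, model, labels):
        self.tokenizer = tokenizer
        self.model = model
        self.labels = labels

    def __call__(self, text):
        ids = self.tokenizer(text)        # strings -> model inputs
        scores = self.model(ids)          # model inputs -> per-class scores
        best = max(range(len(scores)), key=scores.__getitem__)
        return {"label": self.labels[best], "score": scores[best]}

# Toy stand-ins for demonstration only.
toy_tokenizer = lambda text: [ord(c) % 101 for c in text.lower()]
toy_model = lambda ids: [0.1, 0.9] if sum(ids) % 2 else [0.8, 0.2]

pipe = TextClassificationPipeline(toy_tokenizer, toy_model, ["NEGATIVE", "POSITIVE"])
result = pipe("hello")   # e.g. {"label": ..., "score": ...}
```

The design point is that the caller never touches token ids or logits; the wrapper owns both directions of the conversion.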
Anthony MOI retweeted
Series A!!
Solving natural language is going to be the biggest achievement of our lifetime, and is the best proxy for artificial intelligence.
Not one company, even the Tech Titans, will be able to do it by itself – the only way we'll achieve this is by working together.
pic.twitter.com/z2jzhQZkGE
Anthony MOI retweeted
Hugging Face raises $15 million to build the definitive natural language processing library https://tcrn.ch/2rVocLZ by @romaindillet
pic.twitter.com/SMjtiDino1
Anthony MOI retweeted
New in v2.2.2: you can now upload and share your models with the community directly from the library, using our CLI.
1. Join here: https://huggingface.co/join
2. Use the CLI to upload: https://github.com/huggingface/transformers#Quick-tour-of-model-sharing
3. The model is accessible to anyone using the `username/model_name` id.
pic.twitter.com/ZdVDeOMmQt
Anthony MOI retweeted
おはようございます、日本の友達! Hello, friends from Japan!
Thanks to @NlpTohoku, we now have a state-of-the-art Japanese language model in Transformers, `bert-base-japanese`. Can you guess what the model outputs in the masked LM task below?
pic.twitter.com/XIBUu7wrex
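The masked-LM task referred to here: the model scores every vocabulary token for the [MASK] slot and the top-scoring token fills it. A toy stdlib sketch (the candidate scores are invented, not what `bert-base-japanese` would produce):

```python
def fill_mask(template, scores):
    """Replace the [MASK] slot with the highest-scoring candidate token."""
    best = max(scores, key=scores.get)
    return template.replace("[MASK]", best), best

# Hypothetical scores a masked LM might assign to candidate fillers.
scores = {"Tokyo": 8.1, "sushi": 3.4, "model": 1.2}
sentence, token = fill_mask("I live in [MASK].", scores)
# sentence == "I live in Tokyo."
```

A real masked LM computes those scores with a forward pass over the full vocabulary; everything else about the fill-in step is this simple.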
Anthony MOI retweeted
So, we read ALBERT. @remilouf took some notes for you.
Paper: https://arxiv.org/abs/1909.11942
Also in transformers: https://github.com/huggingface/transformers
pic.twitter.com/LLaNema3mc
Anthony MOI retweeted
Distillation ✕ Quantization =
We're releasing 2 quantized versions of DistilBERT finetuned on SQuAD using @TensorFlow Lite, resulting in model sizes of 131MB and 64MB. That's respectively 2x and 4x smaller than the non-quantized version! [1/4]
https://github.com/huggingface/tflite-android-transformers/
pic.twitter.com/maaQxxPR87
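The 2x/4x figures follow directly from weight precision: going from 32-bit floats to 16-bit or 8-bit values shrinks each stored weight proportionally. A back-of-the-envelope check (DistilBERT has roughly 66M parameters; real file sizes also include non-weight overhead, which is why the tweet's numbers don't divide exactly):

```python
def quantized_size_mb(n_params, bits_per_weight):
    """Approximate model size in MB: parameter count x bytes per weight."""
    return n_params * bits_per_weight / 8 / 1e6

n = 66_000_000                        # ~66M parameters in DistilBERT
fp32 = quantized_size_mb(n, 32)       # ~264 MB unquantized
int8 = quantized_size_mb(n, 8)        # ~66 MB, i.e. 4x smaller
```

This is only the storage-side arithmetic; the accuracy cost of quantization has to be measured empirically, which is what the rest of the thread reports.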
Anthony MOI retweeted
(Abstractive) summarization of your documents has never been easier than with @huggingface's transformers: https://github.com/huggingface/transformers/tree/master/examples/summarization [credits go to Yang Liu and Mirella Lapata: https://github.com/nlpyang/PreSumm and the paper linked]
pic.twitter.com/jcLeErE3F3
I heard some of you found the tokenizers too slow. I think you are going to love what we are cooking for you @huggingface
pic.twitter.com/NPvxZqlO6N
Anthony MOI retweeted
How do you say "faster!" in 104 languages? Ask Transformers!
Please welcome **Distil-mBERT**: 104 languages, 92% of mBERT's performance on XNLI, 25% smaller, and twice as fast.
https://github.com/huggingface/transformers/tree/master/examples/distillation
Come talk to me about it at #NeurIPS2019!
pic.twitter.com/6AUJmA8Eoz
Anthony MOI retweeted
Encoder-decoders are now part of the transformers library!
I wrote a tutorial to explain how we got there and how to use them: https://link.medium.com/RTvKeSqo71
Bonus: a sneak peek into upcoming features.
pic.twitter.com/8q1VtOMeIm
Anthony MOI retweeted
Transformers v2.2 is out, with *4* new models and seq2seq capabilities! ALBERT is released alongside CamemBERT (implemented by the authors), DistilRoBERTa (twice as fast as RoBERTa-base!), and GPT-2 XL! Encoder-decoder with Model2Model.
Available at https://github.com/huggingface/transformers/releases/tag/v2.2.0
pic.twitter.com/r6M39jYPHf
Anthony MOI retweeted
Tired: Machines playing video games & Go. Wired: Surrealist Transformers (playing exquisite corpse).
pic.twitter.com/TZJBZaxCAP