Tweets
Wietse de Vries Retweeted
"How to do machine learning efficiently". There's so much to love about this wonderful article. https://medium.com/hackernoon/doing-machine-learning-efficiently-8ba9d9bc679d pic.twitter.com/n6otKKP3gJ
Wietse de Vries Retweeted
Meena has exactly the same core issue as ELIZA: it doesn't build a model of what it or the interlocutor has said, and it often contradicts what happened a few turns earlier. Topic without understanding in 1965, topic without understanding in 2020. Here's a sample: pic.twitter.com/xiCFd3Dv84
Wietse de Vries Retweeted
The 2.4.0 release of transformers is **𝐌𝐀𝐒𝐒𝐈𝐕𝐄** thanks to our amazing community of contributors.
https://github.com/huggingface/transformers/releases/tag/v2.4.0
Wietse de Vries Retweeted
Today we celebrate the first day of Malvina Nissim as a full professor! Well deserved!
Wietse de Vries Retweeted
NLP Newsletter #3: Flax, Thinc, Language-specific BERT models, Meena, Flyte, LaserTagger,…
featuring: @AnimaAnandkumar, @techno246, @hen_str, @jeremyakahn, @lexfridman, @iamtrask, @seb_ruder, @huggingface. GitHub: https://github.com/dair-ai/nlp_newsletter Medium: https://medium.com/dair-ai/nlp-newsletter-flax-thinc-language-specific-bert-models-meena-flyte-lasertagger-4f7da04a9060
Wietse de Vries Retweeted
Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to fewer errors and faster model execution. Read about it below. https://goo.gle/38XfRXU
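The edit-operation idea behind that announcement (LaserTagger) can be sketched in a few lines of Python. This is an illustrative simplification under assumed names: the KEEP/DELETE tag scheme with an optional inserted phrase mirrors the paper's description, but the `apply_edits` helper is not LaserTagger's actual interface — in the real system a model predicts the tags, and only the tag application is shown here.

```python
# Sketch of edit-operation text generation: instead of generating every
# output token, a tagger predicts a per-token edit (KEEP or DELETE),
# optionally with a phrase to insert before that token.

def apply_edits(tokens, tags):
    """Apply (op, insertion) edit tags to input tokens.

    op is "KEEP" or "DELETE"; insertion is a phrase added before the
    token (empty string for none).
    """
    out = []
    for token, (op, insertion) in zip(tokens, tags):
        if insertion:
            out.append(insertion)
        if op == "KEEP":
            out.append(token)
    return " ".join(out)

# Sentence fusion example: merge two sentences by deleting the second
# subject and inserting a connective.
tokens = ["Turing", "was", "born", "in", "1912", ".",
          "He", "died", "in", "1954", "."]
tags = [
    ("KEEP", ""), ("KEEP", ""), ("KEEP", ""), ("KEEP", ""), ("KEEP", ""),
    ("DELETE", ""),     # drop the first period
    ("DELETE", "and"),  # drop "He", insert "and" before its position
    ("KEEP", ""), ("KEEP", ""), ("KEEP", ""), ("KEEP", ""),
]
fused = apply_edits(tokens, tags)
print(fused)  # Turing was born in 1912 and died in 1954 .
```

Because the output reuses input tokens, the tag vocabulary stays tiny compared to a full generation vocabulary, which is where the speed and error-rate gains come from.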
Shamelessly advertised BERTje at @clin30 yesterday. pic.twitter.com/v1RZr7KGgs
Proud to say that our BERTje from @GroNLP is now available as the default Dutch BERT model in Transformers by @huggingface! Some comparisons of BERTje with mBERT, BERT-NL and RobBERT are available on https://github.com/wietsedv/bertje (more coming soon).
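Loading BERTje through Transformers might look like the following sketch. Assumptions are flagged: the hub model id `wietsedv/bert-base-dutch-cased` is the name used in the BERTje repository linked above, and `AutoTokenizer`/`AutoModel` are the standard Transformers entry points; weights are downloaded from the Hugging Face hub on first use.

```python
# Minimal sketch: encode a Dutch sentence with BERTje via Transformers.
# Requires `pip install transformers torch`.
from transformers import AutoTokenizer, AutoModel

model_name = "wietsedv/bert-base-dutch-cased"  # BERTje's hub id (assumed)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Het weer is vandaag mooi.", return_tensors="pt")
outputs = model(**inputs)
# One contextual vector per token; BERT-base hidden size is 768.
print(outputs.last_hidden_state.shape)
```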
Wietse de Vries Retweeted
Transformers 2.4.0 is out
- Training transformers from scratch is now supported
- New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
- Revamped documentation
- First multi-modal model, MMBT from @facebookai, text & images
Bye bye Python 2
https://github.com/huggingface/transformers/releases
Wietse de Vries Retweeted
Looking at the replies, there are a lot of BERT models that I missed:
- RuBERT https://arxiv.org/abs/1912.09582
- BETO https://github.com/dccuchile/beto
- BERTje https://arxiv.org/abs/1912.09582
- Portuguese BERT https://github.com/neuralmind-ai/portuguese-bert
- https://github.com/dbmdz/berts/blob/master/README.md
Thanks everyone!
Wietse de Vries Retweeted
Transfer learning is increasingly going multilingual with language-specific BERT models:
- German BERT https://deepset.ai/german-bert
- CamemBERT https://arxiv.org/abs/1911.03894 , FlauBERT https://arxiv.org/abs/1912.05372
- AlBERTo http://ceur-ws.org/Vol-2481/paper57.pdf
- RobBERT https://arxiv.org/abs/2001.06286
Wietse de Vries Retweeted
So awesome to see the community effort on multi-lingual NLP! In the past month, models in Dutch, Finnish, French, German, Italian, Japanese, Mandarin and Spanish have been added to transformers. Check them all here: https://huggingface.co/models pic.twitter.com/DooTvFx6Ow
Wietse de Vries Retweeted
Hot off the press: BERTje. We collected a large and diverse corpus of Dutch and trained a monolingual BERT model. The model is available for download. Paper: http://arxiv.org/abs/1912.09582
Joint work by @WietseDEV, me, @AriannaBisazza, @tommaso_caselli, Gertjan van Noord & Malvina Nissim. pic.twitter.com/c1aRvR8QOv