Tweets
-
Douwe Kiela Retweeted
Stoked to share our work on improving sample efficiency in language learning with limited supervision and self-play, accepted at #ICLR2020. Thanks to the awesome collaborators @ryan_t_lowe @j_foerst @douwekiela Joelle Pineau. Special thanks to @aggielaz for the motivation (1/4)
-
Supervised multimodal bitransformers now available in the awesome HuggingFace Transformers library! https://twitter.com/huggingface/status/1223267748542808064
-
Douwe Kiela Retweeted
I wrote a blog post explaining the basics of emergent communication and why it's a fascinating field. It's a great read if you're interested in the workshop on emergent communication at NeurIPS this Saturday
https://hackmd.io/L4aPLXiETuyQKoAQpudtug
-
To paraphrase Shakespeare, there is something rotten in the state of the art. Adversarial NLI was collected using HAMLET (human-and-model-in-the-loop entailment training) to create a "moving post" dynamic dataset, rather than a static benchmark that will saturate quickly.
-
Excited (in my 1st tweet ever!) to announce Adversarial NLI: a new large-scale benchmark dataset for NLU, and a challenge to the community. Great job by
@EasonNie, together with @adinamwilliams @em_dinan @mohitban47 and @jaseweston. https://arxiv.org/abs/1910.14599
- Training transformers from scratch is now supported
- New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
- Revamped documentation
- First multi-modal model, MMBT from
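The first item in the list — training transformers from scratch — amounts to instantiating a model from a bare config with random weights rather than calling `from_pretrained`. A minimal sketch using the HuggingFace Transformers library (the hyperparameters here are illustrative, not taken from the release notes):

```python
# Sketch: build an untrained BERT-style masked-LM from a config, as one would
# when training from scratch. No pretrained checkpoint is downloaded.
from transformers import BertConfig, BertForMaskedLM

config = BertConfig(
    vocab_size=5000,        # assumed small vocabulary for the sketch
    hidden_size=128,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=256,
)
model = BertForMaskedLM(config)  # weights are randomly initialized

# Count trainable parameters of the freshly initialized model.
n_params = sum(p.numel() for p in model.parameters())
print(n_params)
```

From here, the model would be trained with an ordinary optimizer loop (or the library's `Trainer`) on a masked-language-modeling objective.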