Tweets
Hang Le retweeted
Transformers 2.4.0 is out
- Training transformers from scratch is now supported
- New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
- Revamped documentation
- First multi-modal model, MMBT from @facebookai (text & images)
- Bye bye, Python 2
https://github.com/huggingface/transformers/releases
Our FlauBERT is now natively supported by @huggingface's transformers library. Many thanks to @julien_c, @LysandreJik and the Hugging Face team for the active technical support!
Paper (new version will be available soon): https://arxiv.org/abs/1912.05372
Code: https://github.com/getalp/Flaubert
https://twitter.com/laurent_besacie/status/1222928525499490306
Hang Le retweeted
Happy to share that my @facebookai internship work "Depth-adaptive Transformer" has been accepted to #ICLR2020. TL;DR: We dynamically adjust the computation per input and match the accuracy of a baseline Transformer with only 1/4 the decoder layers.
https://openreview.net/forum?id=SJg7KhVKPH
Hang Le retweeted
#FlauBERT is a new French language model trained on the Jean Zay supercomputer of @CNRS! #IA #deeplearning It enables contextualized search.
https://github.com/getalp/Flaubert
https://arxiv.org/abs/1912.05372
@laurent_besacie @didier_schwab and @AlexAllauze @Genci_fr pic.twitter.com/Ys47Nz9gVD
Hang Le retweeted
FlauBERT - Unsupervised Language Model Pre-training for French. The repo contains pre-trained large & small models, all the data used plus code for training & inference. It also contains FLUE, a GLUE-like benchmark for French NLProc. https://arxiv.org/abs/1912.05372 https://github.com/getalp/Flaubert
Hang Le retweeted
FlauBERT, another BERT-based language model for French.
Comes with FLUE, a GLUE-like task suite for French.
Lots of scores compared with CamemBERT and mBERT. Scores are, not surprisingly, similar to CamemBERT's.
@huggingface working on it? @zehavoc @ParisMLgroup
https://twitter.com/arxiv_cs_cl/status/1204943594051776514 pic.twitter.com/EvFmOvIa3m
Hang Le retweeted
FlauBERT: Unsupervised Language Model Pre-training for French
pdf: https://arxiv.org/pdf/1912.05020.pdf
abs: https://arxiv.org/abs/1912.05020
github: https://github.com/getalp/Flaubert pic.twitter.com/hpo3JqJ2Y5
Joint work with @_Loic_Vial, @jibrilfrej1, Vincent Segonne, @max_i_min, Benjamin Lecouteux, @AlexAllauzen, Benoît Crabbé, @laurent_besacie, @didier_schwab. We are grateful to @Genci_fr and the people behind the #JeanZAY supercomputer for letting us use these precious resources.
Our work on FlauBERT and FLUE (language models and an evaluation benchmark for French) has been released today (198th birthday of Gustave Flaubert).
#Flaubert
Paper: https://arxiv.org/abs/1912.05372
Code and models: https://github.com/getalp/Flaubert
https://twitter.com/laurent_besacie/status/1205062201687642112
Hang Le retweeted
Today is the 198th birthday of Gustave Flaubert. In tribute, we are releasing FlauBERT, a language model pre-trained for French thanks to @Genci_fr (joint work by @UGrenobleAlpes, @cnrs, @LIGLab -- @ParisDiderot -- @ESPCI_Paris @cnrs @psl_univ). https://arxiv.org/abs/1912.05372 pic.twitter.com/sx1mU4S2wB
Making Transformer networks simpler and more efficient
https://ai.facebook.com/blog/making-transformer-networks-simpler-and-more-efficient/