Tweets


  1. Retweeted
    Jan 31

    Transformers 2.4.0 is out 🤗 - Training transformers from scratch is now supported - New models, including *FlauBERT*, Dutch BERT, *UmBERTo* - Revamped documentation - First multi-modal model, MMBT from , text & images Bye bye Python 2 🙃

  2. Jan 30

    Our FlauBERT is now natively supported by 's transformers library. Many thanks to , and the Hugging Face team for the active technical support! Paper (new version will be available soon): Code:

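The tweet above says FlauBERT is natively supported by the transformers library (as of 2.4.0). A minimal sketch of loading it, assuming `transformers` is installed and that the hub checkpoint is named `flaubert/flaubert_base_cased` (an assumption; the exact identifier may differ):

```python
def load_flaubert(name="flaubert/flaubert_base_cased"):
    """Fetch the FlauBERT tokenizer and encoder via the transformers Auto classes.

    The import is deferred so the sketch can be read without transformers
    installed; the first call downloads the pretrained weights.
    """
    from transformers import AutoModel, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    return tokenizer, model

# Usage (downloads several hundred MB of weights, so commented out here):
# tokenizer, model = load_flaubert()
# inputs = tokenizer("Bonjour le monde !", return_tensors="pt")
# hidden = model(**inputs)[0]  # last hidden states, shape (batch, seq_len, hidden_size)
```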
  3. Retweeted
    Dec 20, 2019

    Happy to share that my internship work "Depth-adaptive Transformer" has been accepted to . TL;DR: We dynamically adjust the computation per input and match the accuracy of a baseline Transformer with only 1/4 the decoder layers.

  4. Retweeted

    is a new 🇲🇫 language model trained on the Jean Zay supercomputer of the ! It enables contextualized search. ➡️ ➡️ 🤝 and @AlexAllauze

  5. Retweeted
    Dec 13, 2019

    FlauBERT - Unsupervised Language Model Pre-training for French. The repo contains pre-trained large & small models, all the data used, plus code for training & inference. It also contains FLUE, a GLUE-like benchmark for French NLProc

  6. Retweeted
    Dec 11, 2019

    FlauBERT, another BERT-based language model for French 🇫🇷. Comes with FLUE, a French GLUE-like benchmark. 🥳 Lots of scores compared with CamemBERT and mBERT. Unsurprisingly, the scores are similar to CamemBERT's. working on it?

  7. Retweeted
    Dec 11, 2019
  8. Dec 12, 2019

    Joint work with , , Vincent Segonne, , Benjamin Lecouteux, , Benoît Crabbé, , . We are grateful to and the people behind the supercomputer for letting us use these precious resources.

  9. Dec 12, 2019

    Our work on FlauBERT and FLUE (language models and an evaluation benchmark for French) has been released today (the 198th birthday of Gustave Flaubert). Paper: Code and models:

  10. Retweeted
    Dec 12, 2019

    Today is the 198th birthday of Gustave Flaubert. As a tribute, we are releasing FlauBERT, a pre-trained language model for French, made possible thanks to (joint work by , , -- @ParisDiderot -- )

  11. Oct 20, 2019
