Hugging Face

@huggingface

Solving NLP one commit at a time!

NYC and Paris
Joined September 2016

Tweets

  1. Pinned Tweet
    Jan 10

    Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings ➡️ model inputs. Welcome 🤗Tokenizers: ultra-fast & versatile tokenization led by :
    - encode 1GB in 20sec
    - BPE/byte-level-BPE/WordPiece/SentencePiece...
    - python/js/rust...
    (A minimal usage sketch follows after the timeline.)

  2. Retweeted

    Now, Thomas Wolf is going to tell us a bit about transfer learning research from

  3. Retweeted
    3 hours ago

    is more than just public code. It's a mindset of sharing, being transparent and collaborating across organizations. It's about building on the shoulders of other projects and advancing together the state of technology. (1/N)

  4. Retweeted
    21 hours ago

    I'm impressed by the work Hugging Face is doing.

  5. Retweeted
    Feb 2
  6. Retweeted
    Jan 31

    Turkish: Anyone interested in a Turkish BERT who wants to evaluate it on downstream tasks? I did evaluation only for UD PoS tagging - any help is really appreciated! Would really like to have a proper evaluation before adding it to the Transformers hub 🤗

  7. Retweeted
    Jan 31

    Thanks to Hugging Face for all their cool work, both in industry and academia (+Kaggle). NLP practitioners know them mostly for Transformers. In a new release, Dutch BERT is shared, thanks to & Co. That's a big step for the whole Dutch NLP community.

  8. Jan 31
  9. Retweeted
    Jan 31

    CamemBERT and XLM-R TensorFlow implementations are available as of the new v2.4.0 of the Transformers library. Here are the models:

  10. Retweeted
    Jan 30

    Between (productizing NLP across Fortune 2000 & global governments) & (democratizing NLP with open-source transformers), we've long held that NLP will move markets faster & in greater magnitude than any prior tech ever has. The next decade will be defined by it.

  11. Retweeted
    Jan 31

    Proud to say that our BERTje is now available as the default Dutch BERT model in Transformers! Some comparisons of BERTje with mBERT, BERT-NL and RobBERT are available (more coming soon).

  12. Retweeted
    Jan 31

    Supervised multimodal bitransformers now available in the awesome HuggingFace Transformers library!

  13. Retweeted
    Jan 31

    Psyched both because HuggingFace Transformers is going beyond text and because this is happening with the multimodal bitransformer.

  14. Retweeted
    Jan 30

    Our FlauBERT is now natively supported by the Transformers library. Many thanks to the Hugging Face team for the active technical support! Paper (new version will be available soon): Code:

  15. Jan 31

    We now have 70+ contributed community models, which you can download and use right now! See the list here:
    (A minimal loading sketch follows after the timeline.)

  16. Jan 31

    Transformers 2.4.0 is out 🤗
    - Training transformers from scratch is now supported
    - New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
    - Revamped documentation
    - First multi-modal model, MMBT (text & images)
    Bye bye Python 2 🙃

  17. Retweeted
    Jan 31

    Learn to build an interactive Transformer attention visualization in under 30 minutes! We developed a minimal teaching example for our IAP class, publicly available here:

  18. Retweeted
    Jan 31

    We have created umBERTo, an Italian 🇮🇹 language model trained with Whole Word Masking. The model is released for CommonCrawl and Wikipedia through the awesome Transformers library by 🤗. Code:

  19. Retweeted
    Jan 30

    Our FlauBERT (French BERT) models have now been integrated into the official library, with the 4 configurations below!

  20. Retweeted

    Portfolio company CEO sits down to discuss why Natural Language Processing is the most important field of Machine Learning & how they’ve created the world’s most widely adopted open-source NLP library

  21. Retweeted

    Thanks for the invitation to talk about why I believe NLP is the most important field of Machine Learning. Who doesn't agree with me?

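The pinned tweet above describes the 🤗Tokenizers library: fast BPE, byte-level BPE, WordPiece and SentencePiece tokenization with Python, JS and Rust bindings. The following is a minimal sketch only, assuming the Python bindings and the ByteLevelBPETokenizer class from the early 0.x releases; "corpus.txt" and the hyperparameters are placeholder choices, not taken from the tweet.

    # Train a byte-level BPE tokenizer on a plain-text corpus, then encode a string.
    # "corpus.txt", the vocab size and the special tokens are placeholders.
    from tokenizers import ByteLevelBPETokenizer

    tokenizer = ByteLevelBPETokenizer()
    tokenizer.train(
        files=["corpus.txt"],
        vocab_size=30_000,
        min_frequency=2,
        special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
    )

    # Strings ➡️ model inputs: token strings and their integer ids.
    encoding = tokenizer.encode("Hello, Hugging Face!")
    print(encoding.tokens)
    print(encoding.ids)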
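Several tweets in the timeline (the 70+ community models, the CamemBERT/XLM-R TensorFlow support, the 2.4.0 release notes) are about pretrained checkpoints that can be downloaded through the Transformers library. Below is a minimal loading sketch, assuming the Python transformers package with PyTorch installed; "camembert-base" is used purely as an example checkpoint name, and any model from the community list could be substituted.

    # Download a pretrained checkpoint and run one sentence through it.
    # "camembert-base" is only an example identifier; TFAutoModel would give
    # the TensorFlow variant mentioned in the CamemBERT/XLM-R tweet.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("camembert-base")
    model = AutoModel.from_pretrained("camembert-base")

    input_ids = tokenizer.encode("J'aime le camembert !", return_tensors="pt")
    with torch.no_grad():
        outputs = model(input_ids)

    last_hidden_state = outputs[0]  # shape: (batch, sequence_length, hidden_size)
    print(last_hidden_state.shape)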
