Lysandre

@LysandreJik

ML engineer at Hugging Face. Co-maintainer of 🤗/Transformers

Chinatown, NYC
Joined March 2019

Tweets


  1. Pinned Tweet
    Oct 18, 2019

    We've run some benchmarks on our 🤗/Transformers library! We compare 10 architectures for inference with PyTorch and TensorFlow, and further benchmark them with TorchScript and XLA. The results are detailed in the first of a series of blog posts:

  2. Retweeted
    6 hours ago

    Hi world! I'm excited to start working in Paris as a research engineer this week :)

    Show this thread
  3. Retweeted

    Facebook AI's latest research on multimodal bitransformers yields the first multimodal model to be part of the library! Released here:

  4. Retweeted
    8 hours ago

    Hello Twitterverse 👋🏻 Started my second day as Chief of Staff at 🤗 & I'm so excited to delve into the world of NLP. My favorite use case of Transformers 🤖:

  5. Retweeted
    9 hours ago

    A few people asked us to talk more about our model sharing feature, so here goes:

    Show this thread
  6. Retweeted
    Feb 4

    Guest blog post from angel investor on sparse tensors in neural nets. I am VERY excited by what's going to happen in sparsity-land this year 🔥

  7. Retweeted
    Feb 3

    The 2.4.0 release of transformers is **MASSIVE** thanks to our amazing community of contributors. 🔥

    Show this thread
  8. Retweeted
    Feb 2

    I'm impressed by the work Hugging Face is doing.

  9. Retweeted
    Jan 31

    Transformers 2.4.0 is out 🤗
    - Training transformers from scratch is now supported
    - New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
    - Revamped documentation
    - First multi-modal model, MMBT from , text & images
    Bye bye Python 2 🙃

    Show this thread
  10. Retweeted

    Thanks for the invitation to talk about why I believe NLP is the most important field of Machine Learning. Who doesn't agree with me?

    Show this thread
  11. Retweeted
    Jan 28

    Do you speak French? 🥖 If you do, you should watch the interesting discussion we had with the awesome on "Podcast IA" – sharing this, even though it's always weird to listen to one's voice on video 😅😅

  12. Retweeted
    Jan 21

    Let me highlight this amazing work I've read recently on in NLP, in which you'll find both:
    - a deep discussion of what it means for a neural model to be compositional
    - a deep and insightful comparison of LSTM, ConvNet & Transformers! 👉

  13. Retweeted
    Jan 16

    As you may know, we just raised our series A. It wouldn't have been possible without the help and contribution of the fantastic community that we serve. So join us next week to celebrate in NYC around craft beers & classic video games! 🤗🤗🤗

  14. Retweeted
    Jan 10

    Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings ➡️ model inputs. Welcome 🤗Tokenizers: ultra-fast & versatile tokenization led by :
    - encode 1GB in 20sec
    - BPE/byte-level-BPE/WordPiece/SentencePiece...
    - python/js/rust...

    Show this thread
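    The Tokenizers announcement above mentions WordPiece among the supported algorithms. As a purely illustrative aside, here is a minimal pure-Python sketch of the greedy longest-match-first idea behind WordPiece-style subword splitting; the toy vocabulary is made up, and this is nothing like the actual 🤗Tokenizers implementation (which is written in Rust).

    ```python
    # Toy vocabulary: "##" marks pieces that continue a word.
    # This is an invented example, not a real model vocabulary.
    VOCAB = {"hug", "##ging", "##s", "face", "##d", "un"}

    def wordpiece_tokenize(word, vocab=VOCAB, unk="[UNK]"):
        """Split one word into the longest matching subword pieces."""
        tokens, start = [], 0
        while start < len(word):
            end, piece = len(word), None
            # Try the longest remaining substring first, shrinking until a match.
            while start < end:
                candidate = word[start:end]
                if start > 0:
                    candidate = "##" + candidate
                if candidate in vocab:
                    piece = candidate
                    break
                end -= 1
            if piece is None:
                return [unk]  # no piece matches: treat the whole word as unknown
            tokens.append(piece)
            start = end
        return tokens

    print(wordpiece_tokenize("hugging"))  # ['hug', '##ging']
    print(wordpiece_tokenize("hugs"))     # ['hug', '##s']
    ```

    The real library also handles pre-tokenization, byte-level alphabets, and training; this only shows the matching loop.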
  15. Retweeted
    Jan 6

    At the Paris office. Who’s there? – location: Station F

  16. Retweeted
    Jan 3
  17. Retweeted
    Jan 1

    It's January 1st, which means... 🎊🎉 we can FINALLY leave Python 2 behind!! 🎉🎊

    Show this thread
  18. Retweeted

    So awesome to see the community effort on multi-lingual NLP! In the past month, models in Dutch, Finnish, French, German, Italian, Japanese, Mandarin and Spanish have been added to transformers. Check them all here:

  19. Retweeted
    Dec 29, 2019

    ┏━━┓┏━━┓┏━━┓┏━━┓
    ┗━┓┃┃┏┓┃┗━┓┃┃┏┓┃
    ┏━┛┃┃┃┃┃┏━┛┃┃┃┃┃  Solving Natural Language Processing!
    ┃┏━┛┃┃┃┃┃┏━┛┃┃┃┃
    ┃┗━┓┃┗┛┃┃┗━┓┃┗┛┃
    ┗━━┛┗━━┛┗━━┛┗━━┛

  20. Retweeted
    Dec 27, 2019

    If you're using Transformers from source, we've rolled out 2 nice beta features (TBR in January):
    💥 Ultra-fast Bert/GPT2 tokenizers (up to 80x faster)
    🦄 Easy/versatile sequence generation for generative models: top-k/nucleus/temperature sampling, penalized/greedy, beam search...

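    For readers unfamiliar with the sampling strategies named above, here is a minimal pure-Python sketch of top-k sampling with a temperature over a toy logit vector. It illustrates the idea only; the function name and values are invented and this is not the Transformers generation API.

    ```python
    import math
    import random

    def top_k_sample(logits, k, temperature=1.0, rng=random):
        """Sample an index from the k highest-scoring logits.

        Logits are divided by the temperature, restricted to the top k,
        and renormalized with a softmax before sampling.
        """
        scaled = [x / temperature for x in logits]
        # Keep only the k largest logits; everything else gets probability 0.
        top = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:k]
        exps = [math.exp(scaled[i]) for i in top]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw one index according to the renormalized distribution.
        r, acc = rng.random(), 0.0
        for idx, p in zip(top, probs):
            acc += p
            if r <= acc:
                return idx
        return top[-1]

    logits = [2.0, 1.0, 0.5, -1.0]
    idx = top_k_sample(logits, k=2)
    assert idx in (0, 1)  # with k=2, only the two highest logits can be drawn
    ```

    Nucleus (top-p) sampling works the same way, except the kept set is the smallest prefix of the sorted distribution whose cumulative probability exceeds p.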
  21. Retweeted
    Dec 20, 2019

    We spend our time fine-tuning models on tasks like text classification, NER, or question answering. Yet 🤗Transformers had no simple way to let users try these fine-tuned models. Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model that ingest and output human-readable data.

    Show this thread
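The "thin wrapper around tokenizer + model" idea from the Pipelines tweet can be sketched in a few lines. Everything below is a toy stand-in (a fake keyword-counting "model"), not the actual 🤗Transformers pipeline classes; it only shows the preprocess → model → postprocess composition.

```python
class SentimentPipeline:
    """Toy pipeline: chain a tokenizer, a model, and label postprocessing."""

    def __init__(self, tokenizer, model, labels):
        self.tokenizer = tokenizer  # text -> model inputs
        self.model = model          # model inputs -> raw per-class scores
        self.labels = labels        # class index -> human-readable label

    def __call__(self, text):
        inputs = self.tokenizer(text)
        scores = self.model(inputs)
        best = max(range(len(scores)), key=scores.__getitem__)
        return {"label": self.labels[best], "score": scores[best]}

# A fake "model" that just counts positive vs. negative cue words.
POSITIVE = {"great", "love", "awesome"}
NEGATIVE = {"bad", "hate", "awful"}
tokenizer = lambda text: text.lower().split()
model = lambda tokens: [
    sum(t in POSITIVE for t in tokens),  # positive score
    sum(t in NEGATIVE for t in tokens),  # negative score
]

nlp = SentimentPipeline(tokenizer, model, ["POSITIVE", "NEGATIVE"])
print(nlp("I love this awesome library"))  # {'label': 'POSITIVE', 'score': 2}
```

In the real library the tokenizer produces tensors, the model is a fine-tuned transformer, and the postprocessing converts logits to calibrated scores, but the call shape is the same.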

