Victor Sanh

@SanhEstPasMoi

Dog sitter by day, Scientist at 🤗 by night | Into , started with | Pun Enthusiast |

Parisian in New York, USA
Joined May 2012

Tweets


  1. Pinned Tweet
    17 Dec 2019

    🔥🔥Series A!!🔥🔥 Extremely excited to share the news with you and so in awe of what we have built with the community over the past few months!! 🧡 We really are JUST GETTING STARTED!!🚀 Also, we are hiring!!

  2. Retweeted
    15 hours ago

    I just published my first post on Medium "Is the future of Neural Networks Sparse?" . Enjoy!

  3. 3 Feb

    Thanks for having us and the interesting discussion! We were honored to be guests on arguably the best ML/NLP podcast out there!

  4. Retweeted
    3 Feb

    It's not OK to claim that a system has "near-human abilities" while just releasing a few handpicked examples. We fully release "very-far-from-human abilities" systems every month. If your system is "near-human" it can certainly be shared.

  5. Retweeted
    2 Feb

    I'm impressed by the work Hugging Face is doing.

  6. Retweeted
    31 Jan

    psyched both bc HuggingFace Transformer is going beyond text and bc this is happening with 's multimodal bitransformer

  7. Retweeted
    31 Jan

    Supervised multimodal bitransformers now available in the awesome HuggingFace Transformers library!

  8. Retweeted
    31 Jan

    Transformers 2.4.0 is out 🤗 - Training transformers from scratch is now supported - New models, including *FlauBERT*, Dutch BERT, *UmBERTo* - Revamped documentation - First multi-modal model, MMBT from , text & images Bye bye Python 2 🙃

  9. Retweeted
    28 Jan
  10. 14 Jan

    Some more: - Fill - Feel - Fork - Fuck - Affect - Effect - Hill - Heel - Dill - Deal

  11. 14 Jan

    Differences I’m having a hard time pronouncing as a non-native speaker, a sample: - Bowl - Ball - This is - Decease - Beer - Bear - Poor - Pour - Beach - Bitch - Bold - Bald - Sheet - Shit - Peace - Piss - Piece Just imagine all the embarrassing situations I can put myself in…

  12. Retweeted
    14 Jan

    I often meet research scientists interested in open-sourcing their code/research and asking for advice. Here is a thread for you. First: why should you open-source models along with your paper? Because science is a virtuous circle of knowledge sharing not a zero-sum competition

  13. 13 Jan

    Dogs are overrated, I am adopting a parrot.

  14. Retweeted
    13 Jan

    🔥 Introducing Tokenizers: ultra-fast, extensible tokenization for state-of-the-art NLP 🔥 ➡️

  15. Retweeted
    10 Jan

    Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings➡️model inputs. Welcome 🤗Tokenizers: ultra-fast & versatile tokenization led by : -encode 1GB in 20sec -BPE/byte-level-BPE/WordPiece/SentencePiece... -python/js/rust...

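The 🤗Tokenizers library implements these algorithms natively in Rust; as a rough conceptual illustration of what a BPE tokenizer learns, here is a toy Python sketch of the merge loop (the function names and corpus are invented for illustration and do not reflect the library's actual API):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent-symbol pairs across the corpus, weighted by word frequency."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get) if pairs else None

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with the fused symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

def learn_bpe(corpus, num_merges):
    """Learn up to `num_merges` BPE merges from a whitespace-tokenized corpus."""
    words = Counter(tuple(w) for w in corpus.split())
    merges = []
    for _ in range(num_merges):
        pair = most_frequent_pair(words)
        if pair is None:
            break
        words = merge_pair(words, pair)
        merges.append(pair)
    return merges

# The most frequent adjacent pairs get fused first: 'l'+'o', then 'lo'+'w', ...
merges = learn_bpe("low low low lower lowest", 3)
```

Python loops like this are exactly why the speed claim above matters: the Rust implementation performs the same kind of counting and merging orders of magnitude faster.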
  16. 6 Jan

    At the Paris office. Who’s there? – location: Station F

  17. Retweeted
    20 Dec 2019

    Exciting news 🤗 Last week I accepted an offer as a Research Engineer with in Brooklyn! I’ve been blown away by the team’s rapid iteration and high impact work in NLP and am looking forward to working alongside such a talented team!

  18. Retweeted
    20 Dec 2019

    We spend our time finetuning models on tasks like text classif, NER or question answering. Yet 🤗Transformers had no simple way to let users try these fine-tuned models. Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model to ingest/output human-readable data.

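The "thin wrapper around tokenizer + model" pattern this tweet describes can be sketched with stand-in components; the classes below are invented stubs for illustration, not the Transformers API (the real `pipeline("sentiment-analysis")` wires a pretrained tokenizer and model together in the same shape):

```python
# A minimal sketch of the pipeline pattern: tokenizer turns text into model
# inputs, the model scores them, and the wrapper maps raw scores back to
# human-readable output. ToyTokenizer and ToyModel are invented stubs.

class ToyTokenizer:
    def __call__(self, text):
        return text.lower().split()          # "model inputs": a token list

class ToyModel:
    POSITIVE = {"great", "love", "excited"}
    def __call__(self, tokens):
        hits = sum(1 for t in tokens if t in self.POSITIVE)
        return hits / max(len(tokens), 1)    # raw score in [0, 1]

class SentimentPipeline:
    """Thin wrapper: human-readable text in, human-readable dict out."""
    def __init__(self, tokenizer, model, threshold=0.2):
        self.tokenizer, self.model, self.threshold = tokenizer, model, threshold

    def __call__(self, text):
        score = self.model(self.tokenizer(text))
        label = "POSITIVE" if score >= self.threshold else "NEGATIVE"
        return {"label": label, "score": score}

nlp = SentimentPipeline(ToyTokenizer(), ToyModel())
result = nlp("Excited to share the news, love it")
```

The design point is that neither the tokenizer nor the model changes: the wrapper only adds the ingest/output glue so users need not handle tensors directly.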
  19. Retweeted
    19 Dec 2019

    📝📝📝On-Device Machine Learning: Text Generation on 📝📝📝 Combine the power of GPT-2, and to bring state-of-the-art NLP on mobile! by via

  20. Retweeted
    18 Dec 2019

    🔥🔥 Series A!! 🔥🔥 Solving Natural language is going to be the biggest achievement of our lifetime, and is the best proxy for Artificial intelligence. Not one company, even the Tech Titans, will be able to do it by itself – the only way we'll achieve this is working together

