viktor trokhymenko

@v_trokhymenko

Data Scientist/Research Scientist/Machine Learning/Deep Learning

Joined: August 2010

Tweets


  2. Feb 2

    thx for the blog post btw, u forgot to add the Russian BERT model == RuBERT 💪 🇷🇺

  3. Retweeted · Feb 1

    This is a list of my top 10 book recommendations for learning the nitty-gritty of NLP and ML.

  4. Retweeted · Jan 25

    Python 3.9.0a3 now available for testing

  5. Retweeted · Jan 31

    Transformers 2.4.0 is out 🤗
    - Training transformers from scratch is now supported
    - New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
    - Revamped documentation
    - First multi-modal model, MMBT, text & images
    Bye bye Python 2 🙃

  7. Retweeted · Jan 29

    model with SVM-gan penalty loss, I think, if I did it right ;)

  8. Retweeted · Jan 30

    I had another conversation with Meena just now. It's not as funny and I don't understand the first answer. But the replies to the next two questions are quite funny.

  9. Retweeted · Jan 28

    For data science projects, setting up remote computing infrastructure can be more complicated than the modeling itself. Thanks to Marcel Mikl and co-author for sharing their setup for training ML models in the cloud.

  10. Retweeted · Jan 28

    I'm happy to announce our latest work on self-supervised learning for . PASE+ is based on a multi-task approach useful for recognition. It will be presented at . paper: code: @Mila

  11. Retweeted · Jan 28

    Introducing , a 2.6B-param open-domain chatbot with near-human quality. Remarkably, we show strong correlation between perplexity & humanlikeness! Paper: Sample conversations:

  12. Retweeted · Jan 27

    New NLP News: NLP Progress, Retrospectives and look ahead, New NLP courses, Independent research initiatives, Interviews, Lots of resources

  13. Retweeted · Jan 27

    Wow, more than 200 NLP datasets – this is gold.

  14. Retweeted · Jan 27

    [1/2] Excited to present SMART: Semi-Autoregressive Training for Conditional Masked Language Models. SMART closes the performance gap between semi- and fully-autoregressive MT models, while retaining the benefits of fast parallel decoding. With

  15. Retweeted · Jan 24

    My pet project, AVERAGE ART: averaged faces from classic portraits by art styles. Details:

  16. Retweeted · Jan 22

    Excited to share PCGrad, a super simple & effective method for multi-task learning & multi-task RL: project conflicting gradients. On Meta-World MT50, PCGrad can solve *2x* more tasks than prior methods. w/ Tianhe Yu, S Kumar, Gupta

  17. Retweeted · Aug 23, 2019

    Curious what people may mean when they say a neural network is (not) compositional? And how that relates to linguistics and philosophy literature on compositionality? Check our new paper on compositionality in neural networks: !

  19. Retweeted · Jan 16

    A belated blog post for our BERTology EMNLP paper (by Olga Kovaleva, Alexey Romanov, yours truly and ). My favorite experiment in this work is showing that for most GLUE tasks BERT works pretty well even *without pre-training*!

  20. Retweeted · Jan 16

    I did a deep-dive into Pages, and found it's possible to create a *really* easy way to host your own blog: no code, no terminal, no template syntax. I made "fast_template" to pull this together, & a guide showing beginners how to get blogging

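
The PCGrad retweet above describes the method in one phrase: project conflicting gradients. A minimal sketch of that idea, assuming per-task gradients are already flattened into 1-D vectors (the function name `pcgrad` and this NumPy formulation are mine, not the authors' implementation):

```python
import numpy as np

def pcgrad(grads, rng=None):
    """Combine per-task gradients, projecting away conflicting components.

    grads: list of 1-D float arrays, one gradient per task.
    Returns the sum of the de-conflicted gradients.
    """
    rng = rng or np.random.default_rng(0)
    projected = []
    for gi in grads:
        g = gi.copy()
        others = [gj for gj in grads if gj is not gi]
        rng.shuffle(others)  # random task order, as the paper prescribes
        for gj in others:
            dot = g @ gj
            if dot < 0:  # gradients conflict (negative cosine similarity)
                # remove the component of g that points against gj
                g -= dot / (gj @ gj) * gj
        projected.append(g)
    return np.sum(projected, axis=0)
```

With two conflicting gradients, e.g. `[1, 0]` and `[-1, 1]`, the combined update has a non-negative dot product with every task's original gradient, so no task is pushed backwards by another task's update.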
