Tweets


  1. Pinned tweet
    24 Jul 2019

    Ludwig v0.2 is out! BERT, Audio / Speech, Geospatial and Temporal features, new Visualization API, integrated REST server. Enjoy!
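
    [A hedged sketch of the Ludwig Python API follows after this list.]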

  2. 31 Jan

    The correct link, because as you should know by now, I can’t type...

  3. 31 Jan

    2) the model has 2.5B parameters, 79% score. Adding a single rule to avoid repetition improved performance by about 7%. What do you think twittersphere, can we say that a rule is worth 200M parameters? :) (3/3)

  4. 31 Jan

    1) The human metric correlates with perplexity, which is pretty unexpected and important (2/3)

  5. 31 Jan

    The paper about Meena (the latest and greatest and biggest chitchat model from Google) was really interesting to me for two reasons: (1/3)

  6. 28 Jan

    this is the paper I told you about at EMNLP other than PPLM ;)

  7. 28 Jan
  8. 28 Jan

    We just released on arXiv our work on playing text adventure games with Go-Explore in which we achieve surprising generalization results:

  9. Retweeted
    20 Jan
  10. 15 Jan
  11. Retweeted
    7 Jan

    Is a whole presentation made with xkcd font too much?? Looking forward to my first pplm talk at today. With live demo🤞

  12. Retweeted
    22 Dec 2019
    Replying to

    We failed with declarative for ML long ago ... recently gotten one \eps used (Overton/Apple) similar to 's awesome Ludwig. IMO declarative helpful when many types of users and model coding not main challenge, c.f. SQL

  13. Retweeted
    21 Dec 2019

    Looks like PPLM is on the front page of Hacker News today 🎉 congrats et al.! Thread:

  14. Retweeted
    19 Dec 2019

    Plug and play language models accepted to . Super excited about this work! Thanks to the amazing and for mentoring and hosting me, and for NLP wisdom! Thanks to for invaluable contributions!

  15. Retweeted
    18 Dec 2019

    Introducing Generative Teaching Networks, which generate entirely synthetic data that is up to 9x faster to train on than real data, enabling state-of-the-art Neural Architecture Search. 1/

  16. 13 Dec 2019

    If you are interested in any elements of the power set of {graph learning, link prediction, meta learning, few-shot tasks} and you are at check out our poster at the Graph Representation Learning workshop 11:30-12:30. With

  17. Retweeted
    6 Dec 2019

    This is really cool work from UberAI on a tough question: Is it possible to control the generations of an unconditionally trained language model? We loved it so much that we added it to our repo and made an online demo to play with it! Give it a try👉

  18. 5 Dec 2019

    Prompted with Dungeons and Dragons stuff and the monster topic.

  19. Retweeted
    5 Dec 2019

    PPLM, from the Uber AI team, builds on top of large transformer-based generative models (like GPT-2) and enables finer-grained control of attributes of the generated language (e.g. gradually switching topic 🐱 or sentiment 😃). Try it out ⤵️

  20. 5 Dec 2019

    You can find the code already added to the Transformers repo:
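
    [An illustrative, hedged topic-steering sketch follows after this list.]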

  21. Retweeted
    5 Dec 2019

    PPLM, or how to steer the GPT-2 mammoth with a mouse. Was a super fun project to work on (and the initial foray into NLP for most of us!), led by with tireless collaborators. 👇Blog / arXiv / demo!

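Ludwig v0.2, announced in the pinned tweet (item 1), is a declarative deep-learning toolbox: you specify input and output feature types and Ludwig assembles, trains, and serves the model. Below is a minimal sketch of the v0.2-era Python API; the feature names, the CSV path, and the choice of the BERT encoder are illustrative assumptions, and exact argument names may differ between Ludwig releases.

```python
# Hedged sketch of a Ludwig v0.2-style declarative model definition.
from ludwig.api import LudwigModel

# Declare what goes in and what comes out; Ludwig builds the model.
# (v0.2 added a BERT text encoder and audio/speech, geospatial, and
# temporal feature types, per the release tweet above.)
model_definition = {
    "input_features": [
        {"name": "review", "type": "text", "encoder": "bert"},  # assumed encoder name
    ],
    "output_features": [
        {"name": "sentiment", "type": "category"},
    ],
}

model = LudwigModel(model_definition)
train_stats = model.train(data_csv="reviews.csv")    # hypothetical dataset
predictions = model.predict(data_csv="reviews.csv")
```

The "integrated REST server" from the same tweet refers to serving a trained model over HTTP (the ludwig serve command in that release); its options are not shown here.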
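Items 19-21 describe PPLM (Plug and Play Language Models): it steers a pretrained generator such as GPT-2 toward a topic or sentiment by using gradients from a small attribute model to perturb the generator's hidden activations, with no fine-tuning of the generator itself. The sketch below is not PPLM; it only illustrates, under stated assumptions, the much simpler idea of nudging decoding toward a bag of topic words by boosting their logits (a weighted-decoding-style baseline). The model name, prompt, topic words, and boost value are all illustrative; the released PPLM code mentioned in item 20 is the example script added to the Hugging Face Transformers repository.

```python
# Illustrative topic steering by logit boosting during GPT-2 decoding.
# NOTE: this is NOT the PPLM algorithm (PPLM perturbs hidden activations
# with gradients from an attribute model); it only shows, in the simplest
# form, what "steering generation toward a topic" means.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Toy bag of topic words (assumption, echoing the D&D prompt in item 18).
topic_words = ["dragon", "sword", "dungeon", "monster"]
topic_ids = [tokenizer.encode(" " + w)[0] for w in topic_words]

ids = tokenizer.encode("The knight opened the door and saw", return_tensors="pt")
boost = 3.0  # steering strength (assumed hyperparameter)

with torch.no_grad():
    for _ in range(30):                       # generate 30 tokens greedily
        logits = model(ids)[0][0, -1, :]      # next-token logits
        logits[topic_ids] += boost            # nudge topic tokens upward
        next_id = torch.argmax(logits).view(1, 1)
        ids = torch.cat([ids, next_id], dim=1)

print(tokenizer.decode(ids[0]))
```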

