Jules Gagnon-Marchand

@julesgm4

Master's student in deep learning for NLP. Previously at Google AI; at Google Brain next summer.

Joined March 2010

Tweets

  1. Retweeted

    Donald Trump must be convicted and removed from office. Because he will always choose his own personal interest over our national interest. Because in America, right matters. Truth matters. If not, no Constitution can protect us. If not, we are lost.

  2. Jan 16

    O(L log L) self attention instead of O(L^2) is likely a pretty significant improvement
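    (A back-of-the-envelope comparison of the two costs appears after this timeline.)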

  3. Retweeted

    Russians appear to be re-running their 2016 hacking playbook, once again to benefit Donald Trump. Will the media play along again? Will the GOP open the door again? Will the Russians help pick our POTUS again?

  4. Retweeted
    Jan 3

    The effect can now handle collisions and multiple photos

  5. Retweeted
    Jan 1

    We present our new year special: “oLMpics - On what Language Model pre-training captures”, exploring what symbolic reasoning skills are learned from an LM objective. We introduce 8 oLMpic games and controls for disentangling pre-training from fine-tuning.

  6. Retweeted
    Dec 25, 2019

    Secrets computer developers don’t want you to know

  7. Retweeted
    Nov 15, 2019

    Comedy is an NP problem: it's easy to tell if a joke is funny, but quite difficult to write one. And that's why I hope we can prove that P = NP

  8. Retweeted
    Nov 12, 2019

    Canada: develops national AI strategy to attract and *retain* top researchers. Also Canada: refuses top AI researchers visas for conferences because they *might not leave after*. Rest of world: 🧐

  9. Retweeted
    Oct 8, 2019

    What happens if you roll a circle inside a circle that’s 4, 3 and 2 times as big? You get an astroid, a deltoid and... A straight line!
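    (A short parametric sketch of these three cases appears after this timeline.)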

  10. Retweeted
    Sep 26, 2019

    I'm still looking through the enormous number of interesting NLP submissions to , but I'm really excited to see *two* new pretraining methods that outperform XLNet/RoBERTa on NLU tasks with far fewer parameters/FLOPS:

  11. Retweeted
    Replying to

    "Distilling the Knowledge in a Neural Network" by Hinton, Dean and myself : (

  12. Retweeted
    Sep 17, 2019

    Trying depth-mapping on abstract art. I don't know what you would call 'working properly' here, it's just fascinating to see a neural network try to interpret a Jackson Pollock painting as a 3D landscape. (Convergence, 1952)

  13. Retweeted

    I finally got CTRL working and generating text, and uh

  14. Retweeted
    Sep 4, 2019

    You do great work. To some people you actually do magic. Value your own skills and contributions. Thanks for coming to my TED talk.

  15. Retweeted
    Sep 5, 2019

    This is SO meta 🤓 We trained a generative language model on a dataset of ArXiv NLP papers. You can now get a neural net to write your papers for (with?) you 🔥. We heard from a few researchers that they're already using it in submitted papers.

  16. Retweeted
    Aug 31, 2019
    Replying to

    Here’s a short summary, but it doesn’t include the past few weeks. For a more in-depth summary, the Wikipedia entry is okay.

  17. Retweeted
    Jun 27, 2019

    the car was too hot

  18. Retweeted
    Jun 3, 2019

    3.8 news: Today, code was checked in that substantially sped up global lookups and builtin lookups. They are still slower than accessing locals and non-locals but only modestly so. If all goes well, you all should have a beta release tomorrow :-)
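    (A rough lookup-timing sketch appears after this timeline.)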

  19. Retweeted
    Apr 17, 2019

    One of my students discovered this using word2vec: good - bad = excellent, bad - good = ['maniacal_killer', 'insuring_repackaged_subprime_mortages']
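    (A gensim sketch reproducing this kind of query appears after this timeline.)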

  20. Retweeted
    Apr 8, 2019

    Congrats to and from NYU and Wei Yang and from University of Waterloo on their new BERTer Indexing Model. It is now the SOTA on the MS MARCO Passage Ranking Task with an MRR of .368! Paper and code to follow shortly.

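The Jan 16 tweet above claims that O(L log L) self-attention would be a significant improvement over the standard O(L^2) formulation. A minimal back-of-the-envelope sketch of why, using hypothetical sequence lengths and ignoring all constant factors, so it says nothing about any particular model's wall-clock time:

```python
import math

# Hypothetical sequence lengths; constants are ignored, so this only
# illustrates the asymptotic gap between dense and log-linear attention.
for L in (1_024, 16_384, 262_144):
    quadratic = L * L              # every query attends to every key
    loglinear = L * math.log2(L)   # e.g. schemes touching ~log L keys per query
    print(f"L={L:>7,}: L^2={quadratic:.2e}  L*log2(L)={loglinear:.2e}  "
          f"ratio={quadratic / loglinear:,.0f}x")
```

At L = 262,144 the dense cost is roughly four orders of magnitude larger, which is the kind of gap the tweet is pointing at.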
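The Oct 8, 2019 retweet describes rolling a circle inside a circle 4, 3, and 2 times as big. A short parametric sketch of the standard hypocycloid equations for those three cases; matplotlib is assumed to be available, and this is an illustration rather than the original animation:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypocycloid traced by a circle of radius r rolling inside a circle of
# radius R = k*r: k=4 gives an astroid, k=3 a deltoid, k=2 a straight diameter.
theta = np.linspace(0.0, 2 * np.pi, 1000)
fig, axes = plt.subplots(1, 3, figsize=(12, 4))
for ax, k in zip(axes, (4, 3, 2)):
    R, r = float(k), 1.0
    x = (R - r) * np.cos(theta) + r * np.cos((R - r) / r * theta)
    y = (R - r) * np.sin(theta) - r * np.sin((R - r) / r * theta)
    ax.plot(x, y)
    ax.set_aspect("equal")
    ax.set_title(f"R = {k}r")
plt.show()
```

For k = 2 the two cosine terms add and the sine terms cancel, so the trace collapses to a back-and-forth motion along a diameter, which is the "straight line" in the tweet.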
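The Jun 3, 2019 retweet is about CPython 3.8 speeding up global and builtin name lookups while leaving them somewhat slower than locals. A rough micro-benchmark sketch of the three lookup kinds; the function names are made up for illustration and the timings depend on machine and interpreter version:

```python
import timeit

GLOBAL = 1

def use_local():
    local = 1
    for _ in range(1000):
        local          # local variable: indexed slot in the frame

def use_global():
    for _ in range(1000):
        GLOBAL         # global name: looked up in the module namespace

def use_builtin():
    for _ in range(1000):
        len            # builtin name: falls through globals to builtins

for fn in (use_local, use_global, use_builtin):
    print(f"{fn.__name__:12s} {min(timeit.repeat(fn, number=10_000)):.3f}s")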
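The Apr 17, 2019 retweet shows word2vec vector-offset arithmetic (good - bad and bad - good). A sketch of how one might reproduce that query with gensim; the pretrained-vector id "word2vec-google-news-300" is an assumption, and the exact neighbours depend on which vectors the student actually used:

```python
import gensim.downloader as api

# Assumed pretrained vectors; any KeyedVectors containing these words works.
wv = api.load("word2vec-google-news-300")   # large download (~1.6 GB)

# "good - bad": positive terms pull the query vector, negative terms push it away.
print(wv.most_similar(positive=["good"], negative=["bad"], topn=5))

# "bad - good": the direction where the surprising multiword tokens appeared.
print(wv.most_similar(positive=["bad"], negative=["good"], topn=5))
```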
