Tweets

You have blocked @XiangZhou14

Are you sure you want to view these Tweets? Viewing them will not unblock @XiangZhou14

  1. Retweeted
    Jan 31

    An Opinionated Guide to ML Research: “To make breakthroughs with idea-driven research, you need to develop an exceptionally deep understanding of your subject, and a perspective that diverges from the rest of the community—some can do it, but it’s hard.”

  2. Retweeted
    Jan 11

    Here's a thread surveying some 'classic' work on . Lots of people seem to be discussing this right now, but with partial references to the whole story. My aim is to highlight some of the philosophical and psychological issues in the history of the concept. 1/

  3. Retweeted
    Dec 14, 2019

    these are some of my nlp wishlists for 2020. what are yours?

  4. Retweeted
    Dec 9, 2019

    the kid on the bike claims sota, the one with the goggles is analyzing the model, and nobody is moving forward

    This Tweet is unavailable.
  5. Retweeted

    we often under-teach the process (vs the techniques) of doing research. here is how I currently advise my PhD students. work in progress.

  6. Retweeted
    Nov 12, 2019

    I'm starting a professorship in the CS department at UNC in fall 2020 (!!) and am hiring students! If you're interested in doing a PhD please get in touch. More info here:

  7. Retweeted
    Nov 8, 2019

    New tech report with Junghyun Min and : "BERTs of a feather do not generalize together" Across 100 re-runs, BERT fine-tuned on MNLI has a consistent score on MNLI but extreme variation in syntactic generalization (measured w/ HANS). Link: 1/7

  8. Retweeted
    Oct 23, 2019

    New paper! We perform a systematic study of transfer learning for NLP using a unified text-to-text model, then push the limits to achieve SoTA on GLUE, SuperGLUE, CNN/DM, and SQuAD. Paper: Code/models/data/etc: Summary ⬇️ (1/14)

  9. Retweeted
    Oct 19, 2019

    Repeat 1-3 for 6 years, or until PhD is cooked through: 1. "What an interesting problem! Surely the model will learn intricate patterns..." 2. Machine learning derives spurious solution unrelated to underlying phenomenon 3. Tweak for 6+ months; remain faithful to original goal

  10. Retweeted
    Sep 30, 2019

    Who said that training GPT-2 or BERT was expensive? "We use 512 Nvidia V100 GPUs [...] Upon the submission of this paper, training has lasted for three months [...] and perplexity on the development set is still dropping."

  11. Retweeted
    Sep 23, 2019

    New paper with Phil Blunsom and showing that regular LSTMs can "learn syntax" as well as the Ordered Neurons LSTMs of Shen et al. (ICLR 2019) ... but that's only because the "PRPN" parsing algorithm is biased. 1/2

  12. Retweeted
    Sep 16, 2019

    Our @emnlp2019 paper "Addressing Semantic Drift in QG for Semi-Supv QA" (w. ): (1) we improve QG via 2 semantic-rewards (Ques-Paraphr + QA-Prob) (2) propose QA-Eval for QG as NLG metric (3) augment QA datasets by generating ques from existing/new articles.😀1/2

  13. Retweeted
    Aug 20, 2019

    Presenting LXMERT at @EMNLP2019 --> (prnc. 'leksmert'). Top3 in GQA & VQA challenges (May2019), Rank1 in VizWiz, & v.strong generalzn to NLVR2 (22% abs jump)! Awesome effort by ! CODE+MODELS all public: ; pls use+share! 1/2

  14. Retweeted
    Aug 8, 2019

    Wow. Dan Povey is leaving Hopkins

  15. Retweeted
    Jul 10, 2019

    Do you ever have a model that uses and one that uses , and you want to combine the two for end-2-end training without rewriting either? TfPyTh allows you to plug one into the other while propagating gradients for training 🎉 Code 👉

  16. Retweeted
    Jun 20, 2019
    Replying to

    Think of it differently: the bigger models + bigger data trajectory is known, kinda boring, and will be done anyways at big-cos, who have plenty of capable people already. so what's your innovation potential there?

  17. Retweeted
    Jun 1, 2019

    Folks at , we have multiple POSTDOC opportunities in ! Pls spread the word and ping me to chat about our several new projects/collaborations/hires 😀 Also, check out 3 talks by students/collaborators & SpLU-RoboNLP workshop (see next tweet):

  18. Retweeted
    May 30, 2019

    Overfitting in machine learning due to data set reuse turned out to be less of a problem than feared. There are at least four pieces to the puzzle that explain why. Thread.

  19. Retweeted

    Introducing Remote Development for 🚀💻🛰️ A new set of extensions that enable you to open any folder in a container, on a remote machine, or in the Windows Subsystem for Linux (WSL) and take advantage of VS Code's full feature set. 👉

  20. Retweeted

    HUGE NEWS: “For the first time, this study demonstrates that we can generate entire spoken sentences based on an individual’s brain activity."

