Graham Neubig

@gneubig

Assistant professor at CMU, studying natural language processing, machine learning, etc. Japanese account is .

Joined September 2010

Tweets

  1. Retweeted

    We’re co-organizing a task on simultaneous speech translation as part of a workshop on spoken language translation. Entrants will be evaluated on real-time translation of TED talks from English into German. Learn more:

  2. Retweeted
    Feb 3

    Interested in translation, and how to handle disfluent conversational speech with current models? Check out our Conversational Speech Translation task! We have a constrained track and tracks for speech and text translation to be approachable to all! 🙂

  3. Feb 4

    Hey, it's like the paper submission system going down right before the conference deadline, but for politics!

  4. Retweeted

    This week's LTI Colloquium is from on "Towards Story Generation"! Come learn how to train models that can efficiently generate more coherent and stylistically consistent stories:

  5. Jan 20

    PSA: the abbreviation "cf." means "in contrast to", not "see". (cf. the majority of NLP papers that use "cf." recently)

  6. Jan 17

    This is excellent, I don't have any other words. Strong commitment, detailed plan. I hope every other tech company follows suit (actually every company, but one step at a time).

  7. Jan 15

    I've started to upload the videos for the Neural Nets for NLP class here: We'll be uploading the videos regularly throughout the rest of the semester, so please follow the playlist if you're interested.

  8. Jan 14

    There are links to examples of papers using each concept. Thanks to , who co-developed the typology, and and others who gave some early advice. We welcome contributions! Please help us annotate papers (your papers or others), or improve the classifiers! 3/3

  9. Jan 14

    We've done empirical counts of the frequency of concepts, through manual annotation of 40 papers and a (naive) rule-based classifier over all papers from ACL/NAACL/EMNLP 2019 (a minimal sketch of such a keyword classifier appears after this list). The ones at the top you should probably know; the ones at the bottom may be useful in some cases. 2/3

  10. Jan 14

    Happy to release NN4NLP-concepts! It's a typology of important concepts that you should know to implement SOTA NLP models using neural nets: We'll reference this in CMU CS11-747 this year, trying to maximize coverage. 1/3

  11. Jan 13

    Videos and code examples will be available! We also have a new project that we'll unveil soon, maybe tomorrow. Check this space 😀 2/2

  12. Jan 13

    The 2020 edition of CMU CS11-747 "Neural Networks for NLP" is starting tomorrow! We (co-teacher and 6 wonderful TAs) restructured it a bit to be more focused on "core concepts" used across a wide variety of applications. 1/2

  13. Retweeted
    Jan 12

    We released an elaborate and exhaustive paper list for the task, covering papers from top conferences and recent years. Each paper has been annotated with rich information. Thanks for 's contribution and 's concept idea.

  14. Jan 9

    Cool winter morning sunrise on the UPitt Cathedral of Learning from the campus 😀

  15. Dec 27, 2019

    The experiments here were really informative to me; I now understand well why (and when) knowledge distillation works in sequence generation: it creates "easy-to-learn" data with more lexical consistency and less reordering. Also: weak child models prefer weaker teacher models.

  16. Retweeted

    Not all training data are equal, but how can we identify the good data efficiently at different stages of model training? We propose to train a data selection agent by up-weighting data whose gradient is similar to the gradient of the dev set (a toy sketch of this weighting idea appears after this list):

  17. Dec 21, 2019

    The code for hosting papers on the excellent is open source and free to use: Does anyone want to do a global search and replace of ACL to ACM to show how they could easily host all their papers open access for virtually free?

  18. Retweeted
    Dec 20, 2019

    We're pleased to announce the 2020 Duolingo Shared Task on Translation + Paraphrase, to be co-located at this summer in Seattle! More details and data releases to come in January... stay tuned!

  19. Retweeted

    🙌 IT’S FINALLY OFFICIAL!! The UN has declared the Decade of Indigenous Languages, 2022-2032!! Let’s keep this work going strong, and ensure that the coming decades see a world of vibrant, vital Indigenous languages! ➡️

  20. Retweeted
    Dec 17, 2019

    NLP internship / PhD opportunities for speakers of low-resource languages (particularly African 🌍): - PhD w/ : - Internship w/ : Do you know of other open positions? Reply with the link.

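A minimal sketch of the kind of naive keyword classifier mentioned in item 9 might look like the following; the concept names and regex patterns below are hypothetical illustrations, not the NN4NLP-concepts project's actual rules.

```python
# Hypothetical sketch: count how many papers mention each concept using
# naive keyword rules (the actual NN4NLP-concepts classifier is more involved).
import re
from collections import Counter

# Assumed concept -> keyword patterns; illustrative only.
CONCEPT_PATTERNS = {
    "attention": [r"\battention\b", r"\btransformer\b"],
    "dropout": [r"\bdropout\b"],
    "beam-search": [r"\bbeam search\b"],
}

def concepts_in(paper_text):
    """Return the set of concepts whose patterns match the paper text."""
    return {
        concept
        for concept, patterns in CONCEPT_PATTERNS.items()
        if any(re.search(p, paper_text, re.IGNORECASE) for p in patterns)
    }

def concept_frequencies(papers):
    """Count, over a corpus of paper texts, how many papers mention each concept."""
    counts = Counter()
    for text in papers:
        counts.update(concepts_in(text))
    return counts

if __name__ == "__main__":
    corpus = [
        "We apply dropout and beam search to our attention-based model.",
        "A transformer decoder decoded with beam search.",
    ]
    for concept, n in concept_frequencies(corpus).most_common():
        print(f"{concept}: {n}/{len(corpus)} papers")
```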
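Item 16's idea can be illustrated with a toy example: weight each training example's update by how well its loss gradient aligns with the dev-set gradient. The sketch below applies a cosine-similarity weight to plain logistic regression in NumPy; the retweeted work actually trains a learned selection agent, so every function name and hyperparameter here is an assumption for illustration, not the paper's method.

```python
# Toy sketch of gradient-similarity data weighting (assumed formulation):
# up-weight training examples whose gradient points in a similar direction
# to the dev-set gradient.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w, x, y):
    """Gradient of the logistic loss for one example or the mean over a batch."""
    x = np.atleast_2d(x)
    y = np.atleast_1d(y)
    p = sigmoid(x @ w)
    return x.T @ (p - y) / len(y)

def train(train_x, train_y, dev_x, dev_y, lr=0.1, epochs=20):
    w = np.zeros(train_x.shape[1])
    for _ in range(epochs):
        g_dev = grad(w, dev_x, dev_y)  # dev-set gradient direction
        for x, y in zip(train_x, train_y):
            g = grad(w, x, y)
            # Cosine similarity between the example's gradient and the dev
            # gradient; examples that "agree" with the dev set get more weight.
            denom = np.linalg.norm(g) * np.linalg.norm(g_dev)
            weight = max(0.0, float(g @ g_dev) / denom) if denom > 0 else 0.0
            w -= lr * weight * g
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(float)
    w = train(X[:150], y[:150], X[150:], y[150:])
    acc = ((sigmoid(X[150:] @ w) > 0.5) == y[150:]).mean()
    print(f"dev accuracy: {acc:.2f}")
```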
