Surafel ¦ ሱራፌል

@surafelml

PhD Candidate, working on Machine Translation for Low-Resource Languages.

Italy
Joined November 2009

Tweets


  1. Retweeted
    Jan 31

    We are delighted to inform the world that and have just submitted their PhD thesis! 👏👍 Super proud of you!

  2. Jan 23
  3. Retweeted
    Jan 21

    OPUS-MT (): over 1,000 pre-trained translation models and a dockerized translation server based on

  4. Retweeted
    Jan 21

    Excited to invite NLP researchers working in African Languages (or relevant NLP techniques) to submit to the workshop in Addis: "AfricaNLP - Unlocking Local Languages" 🌍 2-page extended abstracts! Deadline: 14th Feb 🔥

  5. Jan 13

    A Comprehensive Survey of Multilingual - Raj et al. Discusses most papers on multilingual MT, categorizing them by use case, resources, modeling principles, and current challenges, and further outlines future directions for multilingual MT.

  6. Jan 5
  7. Retweeted
    Jan 5

    NLP Year in Review — 2019: an extensive list of interesting publications, creative and societal applications, tools and datasets, articles, and resources of 2019 by .

  8. Retweeted
    Jan 2

    The 2010s were an eventful decade for NLP! Here are ten shocking developments since 2010, and 13 papers* illustrating them, that have changed the field almost beyond recognition. (* in the spirit of and , exclusively from other groups :)).

  9. Retweeted
    Jan 2

    Where I'm repeating my hopes that self-supervised learning will improve computer vision over the next year the way it has improved NLP over the last year.

  10. "Today in the city of David a Savior has been born to you; he is Christ the Lord." (Luke 2:11)

  11. Retweeted
    Dec 23, 2019

    Excellent News in African NLP! - Zindi Challenge Winners - Nov: - & 🎉 - AfricaNLP2020 workshop: AfricaNLP - Unlocking Local Languages - Details here: 👉🏽 (1st Feb Deadline) Full story 👇🏽✨

  12. The table of contents looks like this 👇

  13. In the case of Machine Translation:
    - small data: aim for better improvement using 100, 500, or 1k parallel examples
    - zero data: incrementally improve zero-shot inference, with no parallel examples at first
    - progressively grow: add new translation directions from an existing model

  14. Retweeted
    Dec 10, 2019

    How to build 1000+ layer Transformers with 80+ billion parameters? By using GPipe 🙂 We will be presenting GPipe today - East Exhibition Hall B+C at poster #40 Paper > Poster and Slides > (1/4)

  15. Retweeted
    Dec 5, 2019

    I'm looking for 2 PhD students in on a funded grant. Come join us in lovely Copenhagen! Deadline: Feb 1, 2020.

  16. Retweeted
    Dec 4, 2019

    "Now that the quality of the high-resource languages is getting very close to the human level, the MT community is shifting its attention towards the tail of the distribution of languages, mid to low-resource." Read more:

  17. Retweeted
    Nov 28, 2019

    What is new in the MT world, and what work remains to be done? How does MT get integrated with exciting new applications? How can ML help with content creation? Join an amazing panel of MT gurus on 5 Dec. and receive the full recording. Save your seat:

  18. Dec 3, 2019

    A good example of language services: "allows customers to seamlessly switch between Spanish and English." A direction that will definitely help improve the performance of low-resource languages.

  19. Retweeted
    Dec 2, 2019

    We are looking for an intern (Undergrad or Graduate) in Summer 2020 who speaks a less-resourced or indigenous language and would like to intern at CMU to learn/do research about how to improve NLP for it! Please apply, or contact me/ w/ questions.

  20. Retweeted
    Nov 28, 2019

    T5 by Google explores the field of transfer learning in NLP. A very good systematic study of how to pretrain and transfer transformer models for downstream tasks: cc


