Shujian  

@Shujian_Liu

Sr. ML engineer at , primarily working on NLP and ASR. PhD in renewable energy from . Triple master's.

Boston, MA
Joined: October 2013

Tweets


  1. Pinned Tweet
    Mar 8, 2019

    Separated my books into programming and machine learning. Preparing for my new role.

  2. Retweeted
    Feb 4

    I combined the illustrations of Transformer by Jay Alammar and code annotation by harvardnlp lab in one notebook

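The mechanism those materials illustrate and annotate can be sketched in a few lines. Below is a minimal, illustrative scaled dot-product attention for a single head, in plain Python with toy inputs and no learned projections; it is not the notebook's actual code:

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention for one head.
    Q, K, V: lists of vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # output row = weighted sum of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# toy example: 2 query positions, 3 key/value positions, d_k = 2
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
print(attention(Q, K, V))
```

Each output row is a convex combination of the value vectors, weighted by the softmaxed query-key similarities.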
  3. Retweeted

    All kinds of text classification models explained

  4. Retweeted
    Feb 4

    A Survey on Knowledge Graphs: Representation, Acquisition and Applications

  5. Retweeted
    Feb 3

    Help revolutionize the world with full self-driving by joining us at Tesla Autopilot: It is very hard to find other places where AI expertise makes as much of a difference on as big of a problem.

  6. Retweeted
    Feb 3

    How Chinese tech giants are assisting in the battle against coronavirus with AI

  7. Retweeted
    Feb 2
    Show this thread
  8. Retweeted
    Feb 3

    Fully Quantizing a Simplified Transformer for End-to-end Speech Recognition. (arXiv:1911.03604v2 [] UPDATED)

  9. Retweeted
    Feb 2

    Wow: Google's "Meena" chatbot was trained on a full TPUv3 pod (2048 TPU cores) for **30 full days** - That's more than $1,400,000 of compute time to train this chatbot model. (! 100+ petaflops of sustained compute !)

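The quoted dollar figure is consistent with a back-of-envelope estimate, assuming a hypothetical rate of roughly $1 per TPUv3 core-hour (the actual on-demand cloud pricing at the time may have differed):

```python
# Back-of-envelope training-cost estimate for a full TPUv3 pod
# (2048 cores) running for 30 days. The per-core-hour rate is an
# assumption for illustration, not an official price.
cores = 2048
days = 30
usd_per_core_hour = 1.0  # assumed rate

cost = cores * days * 24 * usd_per_core_hour
print(f"~${cost:,.0f}")  # on this assumption, "more than $1,400,000" holds
```

At that assumed rate the pod works out to roughly $1.47M for the run, matching the tweet's "more than $1,400,000" claim.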
  10. Retweeted
    Feb 2

    In the next video, I will do a demo. It’s more fun to actually do Kaggle instead of just talking about it.

  11. Retweeted
    Feb 2

    "Giving BERT a calculator" Big step in adding computational abilities to language models. Still no understanding but can certainly start to fool you. What happens if you "give BERT" thousand other similar abilities? cc

  12. Retweeted
    Jan 31

    Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to fewer errors and faster model execution. Read about it below.

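The edit-operation idea can be sketched as tagging each source token instead of decoding new text. In this toy sketch the tag names, their semantics, and the hand-written tag sequence are all illustrative (not the announced method's actual tag set or a model's output): KEEP copies a token, DELETE drops it, and KEEP|phrase inserts a phrase before a kept token.

```python
def apply_edits(tokens, tags):
    """Assemble output text by applying one edit tag per source token."""
    out = []
    for token, tag in zip(tokens, tags):
        if tag == "DELETE":
            continue                          # drop this token
        if tag.startswith("KEEP|"):           # KEEP plus an inserted phrase
            out.append(tag.split("|", 1)[1])  # emit the phrase first
        out.append(token)                     # then the kept token
    return " ".join(out)

# sentence fusion: two sentences merged by deleting the period and the
# repeated subject, and inserting "and" (tags hand-written for illustration)
tokens = "Dylan won the Nobel Prize . Dylan is a songwriter .".split()
tags = ["KEEP"] * 5 + ["DELETE", "DELETE"] + ["KEEP|and"] + ["KEEP"] * 3
print(apply_edits(tokens, tags))
```

Because the output is assembled from the input plus a small phrase vocabulary, a tagger like this cannot generate arbitrary text and needs only one pass per sentence, which is plausibly where the fewer-errors and faster-execution benefits come from.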
  13. Feb 2

    I will have a super long vacation in China.....

  14. Feb 2

    Google’s new patent: Speech recognition with attention-based recurrent neural networks

  15. Retweeted
    Jan 30

    Check out this growing list of adopters for MPI Operator - allreduce-style distributed training on ! If your company would like to be included here, please send us a pull request!

  16. Retweeted
    Jan 30

    Our book, Trustworthy Online Controlled Experiments: A Practical Guide to A/B Testing, sharing learnings by Diane Tang from , Ya Xu from , and my own from and (I'm now at ) is available for pre-order on Amazon:

  17. Retweeted
    Jan 31, 2017
    Replying to

    "Tell me about a time when you've spent days trying to get your dependencies installed."

  18. Retweeted
    Jan 30

    People asking me to teach classes clearly give zero fucks to the imposter syndrome of a former physics PhD turned lawyer before joining AI. Anyway, I'll co-teach the NLPL Winter School w/ Yoav Goldberg, talking transfer learning, its limits & where the field might head. Will share slides.

  19. Retweeted
    Jan 30

    Highly recommend watching this 8-minute video on & the paper, with details not included in the blog, such as SSA vs. human-likeness correlation, sample-and-rank, and removing cross-turn repetition. (Blog: )

  20. Retweeted
    Jan 29

    Do character-level embeddings or subword tokenizers actually help to handle unknown words in language tasks?

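To make the subword half of that question concrete: a greedy longest-match-first segmentation over a small hand-picked vocabulary, in the spirit of WordPiece (a toy sketch, not any library's actual tokenizer). A word the vocabulary has never seen as a whole still decomposes into known pieces:

```python
# Toy vocabulary; "##" marks a continuing (non-initial) subword piece.
VOCAB = {"un", "##afford", "##ably", "##able", "token", "##s"}

def wordpiece(word):
    """Greedy longest-match-first segmentation of a single word."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece      # non-initial pieces get the prefix
            if piece in VOCAB:
                pieces.append(piece)      # take the longest matching piece
                start = end
                break
            end -= 1
        else:
            return ["[UNK]"]              # no piece matched at this position
    return pieces

# "unaffordably" is not in the vocabulary as a whole word,
# yet it segments into pieces the model has embeddings for
print(wordpiece("unaffordably"))
```

A character-level embedding sidesteps the problem differently, by never having whole-word types at all; the open question in the tweet is how much either trick actually helps downstream.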
  21. Retweeted

    British Airways cancels all flights to and from mainland China as coronavirus spreads and governments begin evacuating citizens

