Zhiting Hu

@ZhitingHu

PhD student at Machine Learning Department, Carnegie Mellon University

Joined April 2018

Tweets


  1. Feb 2
  2. Retweeted

    Saturday, February 8, 2020, 10:45 AM - 12:30 PM. SA6Q: Modularizing Natural Language Processing. Zhengzhong Liu, Zhiting Hu, and Eric Xing

  3. Retweeted
    Dec 25, 2019

    Video & slides for the LIRE workshop @ are now up. Check out the talks and panel by Jeff Bilmes, Tom Griffiths & more. Thanks to all speakers & presenters for making the workshop a success!

  4. Dec 13, 2019

    DON’T MISS the exciting panel by our fantastic speakers, 17:05 @ West 208+209!

  5. Retweeted
    Dec 13, 2019

    Now at Learning with Rich Experience: Integration of Learning Paradigms, room 208

  6. Retweeted
    Dec 13, 2019
  7. Dec 13, 2019

    Come join the workshop on Learning with Rich Experience. Note the location: West 208+209. Looking forward to the super exciting talks by Jeff Bilmes & Tom Griffiths, and the contributed presentations:

  8. Retweeted

    The is very pleased to announce , UPMC Professor of Computer Science at & Director of Research at , as a shortlisted nominee for 2019 🏅 Cast your vote for Ruslan here →

  9. Retweeted
    Nov 22, 2019

    Code release for the paper on Learning Data Manipulation: learning to augment and re-weight data in a low-data regime or in the presence of imbalanced labels. Code: via & Bowen Tan.

  10. Retweeted

    Professor Russ Salakhutdinov , at , is a nominee for the 2019 🏅. Help us by casting your vote at

  11. Nov 4, 2019
  12. Oct 31, 2019
  13. Oct 31, 2019

    The paper “transfers” an off-the-shelf _reward_ learning algorithm to learning _data_ manipulation. It’s a powerful idea: transferring solutions to problems in one context to problems in another. Used in learning structured knowledge, improving GANs/VAEs.

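The learned re-weighting idea above can be illustrated with a toy gradient-alignment heuristic. This is a minimal sketch in the spirit of learned data re-weighting, not the paper's actual reward-learning algorithm: each training example is scored by how well its per-example gradient agrees with the gradient on a clean validation set, so mislabeled examples (whose gradients point the wrong way) receive near-zero weight.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def example_weights(w, X_train, y_train, X_val, y_val):
    """Weight each training example by max(0, g_i . g_val).

    A one-step gradient-alignment heuristic: stepping along -g_i lowers
    the validation loss by roughly lr * (g_i . g_val), so examples whose
    gradient aligns with the validation gradient are upweighted.
    """
    # Per-example logistic-loss gradients on the training set: (n, d).
    g_train = (sigmoid(X_train @ w) - y_train)[:, None] * X_train
    # Mean gradient on the clean validation set: (d,).
    g_val = ((sigmoid(X_val @ w) - y_val)[:, None] * X_val).mean(axis=0)
    scores = np.maximum(0.0, g_train @ g_val)
    total = scores.sum()
    return scores / total if total > 0 else np.full(len(scores), 1.0 / len(scores))

# Demo: flip 20% of training labels and check that they get down-weighted.
rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)
X = rng.normal(size=(200, d))
y = (X @ w_true > 0).astype(float)
X_tr, y_tr = X[:100], y[:100].copy()
flipped = rng.random(100) < 0.2
y_tr[flipped] = 1.0 - y_tr[flipped]          # simulated label noise
X_val, y_val = X[100:], y[100:]              # clean validation split
wts = example_weights(np.zeros(d), X_tr, y_tr, X_val, y_val)
print(wts[flipped].mean(), wts[~flipped].mean())
```

In this toy setup the mislabeled examples' gradients are the negations of their clean counterparts, so their alignment scores are clipped to zero and training weight concentrates on clean data.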
  14. Retweeted
    Oct 30, 2019

    paper on Learning Data Manipulation: learning to augment and re-weight data for improved training, especially in a low-data regime or in the presence of imbalanced labels. w/ Zhiting Hu, Bowen Tan et al.

  15. Retweeted
    Oct 17, 2019

    Introducing Texar-PyTorch: An open-source library integrating the best of into

  16. Oct 17, 2019

    Read more about Texar-PyTorch features & how easily you can customize any of the above modules for your project, whether you're an ML novice or an expert. More resources: 5/5

  17. Oct 17, 2019

    On the training part, Texar-PyTorch replicates the high-level APIs of tf.Estimator and tf.keras.Model, but with greater flexibility + TensorBoard + hyperparameter tuning APIs 4/5

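The Estimator/keras-style training API mentioned above can be sketched generically. The `Executor` class below is a hypothetical, NumPy-only illustration of the fit/evaluate pattern (the class name and method signatures are assumptions for this sketch, not Texar-PyTorch's actual Executor API):

```python
import numpy as np

class Executor:
    """Toy Estimator/keras-style wrapper: hides the training loop behind
    fit() and evaluate(), recording the loss each epoch."""

    def __init__(self, weights, grad_fn, loss_fn, lr=0.1):
        self.w = weights
        self.grad_fn = grad_fn   # (w, X, y) -> gradient vector
        self.loss_fn = loss_fn   # (w, X, y) -> scalar loss
        self.lr = lr
        self.history = []

    def fit(self, X, y, epochs=10):
        # Full-batch gradient-descent steps, one per epoch.
        for _ in range(epochs):
            self.w = self.w - self.lr * self.grad_fn(self.w, X, y)
            self.history.append(self.loss_fn(self.w, X, y))
        return self

    def evaluate(self, X, y):
        return self.loss_fn(self.w, X, y)

# Least-squares demo.
def mse(w, X, y):
    r = X @ w - y
    return float(r @ r) / len(y)

def mse_grad(w, X, y):
    return 2.0 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5])
ex = Executor(np.zeros(3), mse_grad, mse, lr=0.1).fit(X, y, epochs=200)
print(ex.evaluate(X, y))
```

The point of the pattern is that callbacks, logging (e.g., TensorBoard), and tuning hooks can all attach to one loop implementation instead of every training script rewriting it.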
  18. Oct 17, 2019

    On the data part, Texar-PyTorch replicates the best practice of for easy processing, batching, and iterating + efficiency w/ buffered shuffling, caching, lazy-loading. It also replicates TFRecord to ingest arbitrarily complex datasets, e.g., image+caption+label 3/5

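The buffered shuffling mentioned above is the tf.data-style technique for streams too large to shuffle in memory; a few lines of plain Python sketch the algorithm (an illustration of the idea, not Texar-PyTorch's actual implementation):

```python
import random

def buffered_shuffle(iterable, buffer_size, seed=None):
    """Approximate shuffle of a stream using a fixed-size buffer.

    Fill a buffer of `buffer_size` items; then, for each new item,
    swap it with a randomly chosen buffered item and yield the one
    that was evicted. Finally drain the buffer in random order.
    """
    rng = random.Random(seed)
    buf = []
    for item in iterable:
        if len(buf) < buffer_size:
            buf.append(item)
        else:
            idx = rng.randrange(buffer_size)
            buf[idx], item = item, buf[idx]
            yield item
    rng.shuffle(buf)
    yield from buf

shuffled = list(buffered_shuffle(range(10), buffer_size=4, seed=0))
print(shuffled)  # a permutation of 0..9
```

With `buffer_size` at least the stream length this degrades to a full in-memory shuffle; smaller buffers trade shuffle quality for memory, which is why it pairs naturally with lazy-loading and caching.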
  19. Oct 17, 2019

    On the model part, Texar-PyTorch replicates abundant TF modules & utils, including the excellent text generation ones. See the list of Texar modules (selected): 2/5

  20. Oct 17, 2019

    Super excited to release Texar-PyTorch v0.1, an ML library integrating the best of TensorFlow into PyTorch - replicating many useful TF modules & designs to enhance PyTorch, incl. data, model & training. See how Texar-PyTorch builds a Conditional-GPT2 1/5

