Tweets

  1. Pinned Tweet
    Feb 8, 2019

    Making some headway in porting the PyTorch "Neural Ordinary Differential Equations" codebase to TF Eager execution. This graph below replicates the simplest example in their codebase, on TF 1.12 with Eager Execution enabled, with dopri5.

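    For reference, a minimal sketch of the kind of call being ported here, assuming the torchdiffeq package from the "Neural Ordinary Differential Equations" codebase; the network and problem sizes below are illustrative, not the codebase's actual example.

    import torch
    import torch.nn as nn
    from torchdiffeq import odeint  # pip install torchdiffeq

    class ODEFunc(nn.Module):
        # dy/dt = f(t, y), parameterised by a small MLP (sizes are made up).
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(2, 50), nn.Tanh(), nn.Linear(50, 2))

        def forward(self, t, y):
            return self.net(y)

    y0 = torch.tensor([[2.0, 0.0]])                  # initial state
    t = torch.linspace(0.0, 25.0, 100)               # evaluation times
    ys = odeint(ODEFunc(), y0, t, method='dopri5')   # adaptive Dormand-Prince solve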
  2. Feb 1

    Finally finished porting Single Headed Attention RNNs to TensorFlow 2.0. I just couldn't make the time until now, but I'm satisfied that I can use it in my projects now. Fixing the last of the autograph tracing issues will have to wait.

  3. Feb 1

    So many reasons to actually use Julia instead of re-learning it for the 6th time. And that speed difference between Julia and torchdiffeq is far too extreme. Side note: even tfdiffeq takes about the same time as PyTorch, last I checked.

  4. Retweeted
    Jan 9

    Training Neural SDEs: We worked out how to do scalable reverse-mode autodiff for stochastic differential equations. This lets us fit SDEs defined by neural nets with black-box adaptive higher-order solvers. With , and .

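    A rough sketch of what this enables, assuming the torchsde package (its sdeint_adjoint implements a stochastic adjoint of this kind); the drift/diffusion networks and sizes are placeholders, not the paper's models.

    import torch
    import torchsde  # pip install torchsde

    class NeuralSDE(torch.nn.Module):
        noise_type = 'diagonal'
        sde_type = 'ito'

        def __init__(self, dim):
            super().__init__()
            self.f_net = torch.nn.Sequential(torch.nn.Linear(dim, 32), torch.nn.Tanh(), torch.nn.Linear(32, dim))
            self.g_net = torch.nn.Sequential(torch.nn.Linear(dim, 32), torch.nn.Tanh(), torch.nn.Linear(32, dim))

        def f(self, t, y):   # drift, a small neural net
            return self.f_net(y)

        def g(self, t, y):   # diagonal diffusion, also a neural net
            return self.g_net(y)

    sde = NeuralSDE(dim=3)
    y0 = torch.randn(16, 3)                          # batch of initial states
    ts = torch.linspace(0.0, 1.0, 20)
    ys = torchsde.sdeint_adjoint(sde, y0, ts, method='euler', dt=1e-2)  # (20, 16, 3)
    ys[-1].pow(2).mean().backward()                  # reverse-mode gradients through the solve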
  5. Retweeted
    Jan 7

    I completed my 1st data science project ~30 years ago. Since then I've been continuously developing a questionnaire I use for all new data projects, to ensure the right info is available from the start. I'm sharing it publicly today for the first time.

  6. Jan 1

    Happy New Year to everyone!

  7. Dec 23, 2019

    This is so much better than cloning a blank notebook

  8. Retweeted
    Dec 20, 2019

    Yes! I got my first big conference paper accepted at ICLR, with spotlight! We improve the previous DeepMind paper "NALU" by 3x-20x. – This took 7-8 months, working without any funding as an independent researcher. Paper: Code:

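    For context on the baseline being improved: the original NALU paper builds on a neural accumulator (NAC) whose weights are pushed toward {-1, 0, 1} so the layer learns exact addition and subtraction. A rough PyTorch sketch of that baseline unit (not the improved units from the retweeted paper):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class NAC(nn.Module):
        # W = tanh(W_hat) * sigmoid(M_hat) biases each weight toward -1, 0 or 1.
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.W_hat = nn.Parameter(0.1 * torch.randn(out_dim, in_dim))
            self.M_hat = nn.Parameter(0.1 * torch.randn(out_dim, in_dim))

        def forward(self, x):
            W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
            return F.linear(x, W)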
  9. Retweeted
    Dec 10, 2019

    Found this gem @ ! Using the orthogonal property of Legendre polynomials, Legendre Memory Units (LMUs) can efficiently handle temporal dependencies spanning 100k timesteps, converge rapidly, and use fewer internal state variables compared to LSTMs.

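    The memory inside an LMU is a small linear time-invariant system whose state represents a sliding window of the input in a Legendre-polynomial basis, which is what lets a few state variables cover very long dependencies. A rough sketch of that update (my paraphrase of the construction, not the authors' code; the dimension and window length are arbitrary):

    import numpy as np

    def lmu_matrices(d, theta):
        # Continuous-time (A, B) of the LMU delay system; theta is the window length.
        A = np.zeros((d, d))
        for i in range(d):
            for j in range(d):
                A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
        B = np.array([(2 * i + 1) * (-1.0) ** i for i in range(d)]).reshape(d, 1)
        return A / theta, B / theta

    A, B = lmu_matrices(d=6, theta=100.0)
    m = np.zeros((6, 1))
    for u in np.sin(0.1 * np.arange(1000)):          # toy input stream
        m = m + (A @ m + B * u)                      # Euler step (dt = 1) of the memory state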
  10. Dec 3, 2019

    I feel the same regarding tools. Learn both. Use PyTorch for research. Once the research phase is done, use TensorFlow to deploy, or vice versa. Or do both in TensorFlow. Or both in PyTorch. Frameworks have converged enough to allow flexible modelling and deployment.

  11. Nov 27, 2019

    A fun start to the day, and something to look forward to extending to tasks other than language modelling.

  12. Nov 24, 2019

    This is invaluable. Will go through this over the week.

  13. Retweeted
    Nov 21, 2019

    To help developers get started with PyTorch, we’re making the 'Deep Learning with PyTorch' book, written by Luca Antiga and Eli Stevens, available for free to the community:

  14. Nov 19, 2019

    ... Well. I'm receiving my Stadia today, hope it's not as bad as it looks here. If it is, hope it gets better eventually.

  15. Retweeted
    Nov 18, 2019

    *New paper* RandAugment: a new data augmentation. Better & simpler than AutoAugment. Main idea is to select transformations at random, and tune their magnitude. It achieves 85.0% top-1 on ImageNet. Paper: Code:

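    The appeal is that the policy collapses to two scalars: N transformations sampled uniformly per image and a single global magnitude M. A minimal sketch of that sampling loop, with an illustrative op list and magnitude mapping rather than the paper's exact search space:

    import random
    from PIL import Image, ImageEnhance, ImageOps

    def rand_augment(img, n=2, m=9):
        level = m / 30.0  # map the integer magnitude onto a small [0, ~0.3] range
        ops = [
            ImageOps.autocontrast,
            lambda im: ImageEnhance.Contrast(im).enhance(1 + level),
            lambda im: ImageEnhance.Brightness(im).enhance(1 + level),
            lambda im: ImageEnhance.Sharpness(im).enhance(1 + level),
            lambda im: im.rotate(30 * level),
        ]
        for op in random.choices(ops, k=n):          # sample N ops with replacement
            img = op(img)
        return img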
  16. Nov 12, 2019

    Oh but the amount of compute and data. Oh boy.

  17. Nov 12, 2019

    A semi-simple method that I will probably try soon.

  18. Nov 11, 2019

    Wow. That's gonna increase my use of colab even more at this point.

  19. Retweeted
    Nov 9, 2019

    1/ A friend of mine pointed me to this article with a bunch of really cool images from Japan in 1908. So naturally I ran them through my new DeOldify model. Enjoy!

  20. Retweeted
    Nov 4, 2019

    Computing Receptive Fields of Convolutional Neural Networks -- A new Distill article by André Araujo, Wade Norris, and Jack Sim.

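    The central quantity in that article follows a short recurrence: each layer with kernel size k and stride s grows the receptive field by (k - 1) times the product of all earlier strides. A small sketch of that computation (the layer list is made up for illustration):

    def receptive_field(layers):
        # layers: list of (kernel_size, stride) pairs, input to output.
        r, jump = 1, 1
        for k, s in layers:
            r += (k - 1) * jump                      # growth scaled by accumulated stride
            jump *= s
        return r

    print(receptive_field([(3, 1), (3, 1), (3, 1), (7, 2)]))  # -> 13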
  21. Retweeted
    Oct 30, 2019
