Akihiro Matsukawa

@amatsukawa

ML Engineer. Formerly , , . Amateur Netflix watcher and hotpot eater. Opinions are my own, etc.

San Francisco, CA
Joined October 2010

Tweets

  1. Retweeted
    Jan 26

    Continuing to move the game forward. Much respect my brother 💪🏾 #33644

  2. Jan 27

    I ran into every gym as a teenager, screamed "3-2-1-Kobe", and took shots I couldn't possibly make. Honestly, it never even crossed my mind the man could die. I mean, I knew he *could* like all of us. But to me, Kobe was larger than life.

  3. Jan 12

    I wish Twitter would use some DL to build a feature to let me filter out specific topics, like all this “what is DL” debate.

  4. Jan 2

    Want to go more plant-based for 2020, but I *like* meat. Besides tofu and beyond-meat, I haven't found any good replacements. Eating lentils or beans is just sad (and I don't like them even at restaurants, so it's not my preparation). Any recommendations?

  5. Dec 8, 2019

    Decided not to attend NeurIPS this year, having FOMO now that I’m starting to see all the tweets roll in.

  6. Nov 23, 2019

    ML Twitter, pytorch question:

        xs = [ ... ]       # a list of torch.Tensors
        modules = [ ... ]  # a list of nn.Modules
        ys = [m(x) for m, x in zip(modules, xs)]

    In TF, those forward passes would run in parallel. How do I make them parallel in pytorch? jit? (See the first sketch after this list.)

  7. Retweeted
    Nov 4, 2019

    Here's a thread describing my plans for Gradient. It's the same information that's on the website, but this feels better suited to discussion.

  8. Oct 26, 2019

    Communicating results accurately is very important, but I don’t see the point of debating words used to describe the results.

  9. Oct 26, 2019

    We seem hung up on debating the meaning of words like “solved”, “supremacy” and “understanding” on the basis of PR. Who cares? I doubt this is a surprise to *anyone* working in ML.

  10. Oct 25, 2019

    Oh, nvm, they explain in response to a question that the pytorch witchcraft just chains the Julia AD with the pytorch AD.

  11. Oct 25, 2019

    They show differentiating through a self-driving car simulator, a diff eq solver, and even through a pytorch model called from Julia (I guess because it can differentiate through LLVM code?). (A toy pytorch analogue appears as the second sketch after this list.)

  12. Oct 25, 2019

    I just found out about in . The idea of lowering the tracing needed to do AD into the language compiler, so that potentially anything written in that language can be differentiated, is mind-blowing.

  13. Retweeted
    Oct 24, 2019
    Tweet unavailable.
  14. Retweeted
    Oct 21, 2019
  15. Retweeted
    Oct 18, 2019

    “What is missed in [lifehack books like the ‘4 Hour Work Week’] is the mindset of craftsmanship; that one’s expertise and deliberate focus on one’s craft is actually the primary driver for success and not some crapshoot of a series of hacks.”

  16. Oct 18, 2019

    2/Things I want: less thinking about memory and performance, especially trying to avoid python... in python. More support for functional programming & types, easier ways to go from local machine to distributed, better integration between data processing and learning algorithms

  17. Oct 18, 2019

    1/So there is clearly a “mainstream” ML stack in python, eg: numpy/pandas/tf/pytorch. What things are exciting that are less mainstream? Jax? Julia? Swift? Some Apache thing?

  18. Oct 17, 2019

    “More layers is always better right?”

  19. Oct 15, 2019

    “plush giraffe perturbation” should be a mandatory evaluation metric in all future papers.

  20. Retweeted

    The paper you stole might have taken those researchers a year or more of dedicated work to complete. You stole credit for all those late nights coding alone, hard days when nothing worked, and moments of personal self-doubt in the project. And your excuse is a YouTube schedule?


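A minimal sketch for the pytorch question in item 6, not a definitive answer: one way to run independent forward passes concurrently is TorchScript's fork/wait, which only actually executes tasks in parallel once the module is scripted. The wrapper class Branches and the toy nn.Linear branches below are illustrative assumptions, not anything from the original tweet, and the branch modules are assumed to be scriptable.

    from typing import List

    import torch
    import torch.nn as nn


    class Branches(nn.Module):
        # Runs one module per input, launching each forward pass as an async task.
        def __init__(self, branch_modules: List[nn.Module]):
            super().__init__()
            self.branches = nn.ModuleList(branch_modules)

        def forward(self, xs: List[torch.Tensor]) -> List[torch.Tensor]:
            # Launch each branch asynchronously; fork only gives real parallelism
            # once this module has been compiled with torch.jit.script.
            futures: List[torch.jit.Future[torch.Tensor]] = []
            i = 0
            for m in self.branches:
                futures.append(torch.jit.fork(m, xs[i]))
                i += 1
            # Block until every branch has finished.
            results: List[torch.Tensor] = []
            for f in futures:
                results.append(torch.jit.wait(f))
            return results


    modules = [nn.Linear(16, 8) for _ in range(4)]  # a list of nn.Modules
    xs = [torch.randn(32, 16) for _ in range(4)]    # a list of torch.Tensors
    ys = torch.jit.script(Branches(modules))(xs)    # forward passes run concurrently

On GPU this matters less, since kernel launches from the plain list comprehension are already asynchronous; fork/wait mainly buys inter-op parallelism on CPU.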
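The Zygote demos in items 10-12 are Julia, but the idea of differentiating straight through ordinary program code has a rough analogue in the pytorch stack used elsewhere on this page: autograd records the tensor ops executed inside an arbitrary Python loop, so gradients flow back through the whole rollout. The toy Euler integrator below is a made-up stand-in (simulate and theta are hypothetical names) for the simulators mentioned in the talk; it illustrates tracing-based AD, not Zygote's compiler-level source-to-source approach.

    import torch


    def simulate(theta: torch.Tensor, x0: torch.Tensor,
                 steps: int = 100, dt: float = 0.01) -> torch.Tensor:
        # Toy Euler integration of dx/dt = -theta * x, written as plain Python.
        x = x0
        for _ in range(steps):
            x = x + dt * (-theta * x)  # ordinary tensor ops inside a Python loop
        return x


    theta = torch.tensor(2.0, requires_grad=True)
    x0 = torch.tensor(1.0)
    loss = (simulate(theta, x0) - 0.5) ** 2  # e.g. hit a target final state
    loss.backward()                          # gradient flows back through the rollout
    print(theta.grad)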