Giovanni Petrantoni

@voidtarget

I like to research, create and sometimes destroy. From Italy (Sicily); I live in Japan (10+ years).

Kyoto City, Kyoto
Joined April 2010

Tweets


  1. Pinned Tweet
    Aug 17, 2019

    My work desk in 2001.

  2. Retweeted
    Jan 30

    Wow.... and while using Windows 2000 by the looks of it.

  3. Jan 31

    This really shows that user experience and ease of use are indeed above all, even in this field. Lots of lessons learned for new frameworks too! It also shows that users really don’t care about technical details and internal stuff (where I honestly see other frameworks as superior).

  4. Jan 28

    Wow, I was looking for a hackmd-like replacement, even a local one! Had no idea you were working on this as well!

  5. Jan 24

    If you don’t know it already, comby is probably the best tool for code refactoring you need. It saved me many hours of regex/(x paid tool) etc...

  6. Retweeted
    Jan 20

    No BLAS? No problem. This is the first post of a series implementing various algorithms in ANSI C with 0 dependencies, for the DIY people. Stay tuned. Here's #1: a 2-layer MLP beating OpenBLAS' performance and getting over 95% in under 5s.

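A minimal sketch of the idea in that tweet, reconstructed in pure dependency-free Python rather than ANSI C (this is not the linked code; all weights and names here are illustrative). The hand-picked weights make the 2-layer net compute XOR, as a sanity check that the plain-loop kernels work:

```python
# A dependency-free 2-layer MLP forward pass, sketched in pure Python.
def matvec(W, x):
    # Plain-loop matrix-vector product: the kind of kernel BLAS replaces.
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

def relu(v):
    return [max(0.0, a) for a in v]

def mlp_forward(x, W1, b1, W2, b2):
    # layer 1: hidden = relu(W1 x + b1); layer 2: out = W2 hidden + b2
    h = relu([a + b for a, b in zip(matvec(W1, x), b1)])
    return [a + b for a, b in zip(matvec(W2, h), b2)]

# Tiny hand-picked weights: 2 inputs -> 2 hidden -> 1 output (computes XOR).
W1 = [[1.0, 1.0], [1.0, 1.0]]
b1 = [0.0, -1.0]
W2 = [[1.0, -2.0]]
b2 = [0.0]
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, mlp_forward(x, W1, b1, W2, b2))
```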
  7. Retweeted
    Jan 21

    One of the most important debugging techniques of my life - numerical gradient check. This is a short intro and a working piece of code for two models sharing the same grad-check scaffolding: a classifier and an autoencoder. Pure NumPy!

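The technique named above can be sketched in a few lines (my reconstruction, not the tweeted code; the linear-model loss is just a stand-in for any differentiable model): compare a hand-derived analytic gradient against a central finite difference, and flag a bug if their relative error is not tiny.

```python
# Numerical gradient check: verify an analytic gradient with finite differences.
import numpy as np

def loss(w, X, y):
    # MSE of a linear model; a stand-in for any model's loss.
    return np.mean((X @ w - y) ** 2)

def analytic_grad(w, X, y):
    # Hand-derived gradient of the MSE above: (2/n) X^T (Xw - y).
    return 2.0 * X.T @ (X @ w - y) / len(y)

def numerical_grad(f, w, eps=1e-6):
    # Central differences: perturb each weight in turn, re-evaluate the loss.
    g = np.zeros_like(w)
    for i in range(w.size):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        g[i] = (f(w_plus) - f(w_minus)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
X, y = rng.normal(size=(8, 3)), rng.normal(size=8)
w = rng.normal(size=3)

ga = analytic_grad(w, X, y)
gn = numerical_grad(lambda v: loss(v, X, y), w)
rel_err = np.linalg.norm(ga - gn) / (np.linalg.norm(ga) + np.linalg.norm(gn))
print(rel_err)  # should be far below 1e-6 if the analytic gradient is right
```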
  8. Retweeted
    Jan 16

    It's easy to ridicule overconfidence. And indeed, it has its disadvantages. But it's just what you need if you want to take on a problem most rational people would conclude was too hard. Fools dare where angels fear to tread, and sometimes they win.

  9. Jan 16

    Alternatively, could even be a better choice, due to a better user experience/tools while still being Arch Linux at the core.

  10. Retweeted
    Jan 15

    Differentiable Digital Signal Processing (DDSP)! Fusing classic interpretable DSP with neural networks. ⌨️ Blog: 🎵 Examples: ⏯ Colab: 💻 Code: 📝 Paper: 1/

  11. Jan 14

    The beautiful thing about is that sometimes it even makes ignorance about something worth it. Too much information, knowledge and theories often make us counterproductive. Resist bad influences! Educate yourself, dig into things, but most of all create; experiment!

  12. Jan 12

    Nice article! I'm glad to know I'm not a heretic. I know so many people that should read this :)

  13. Jan 10

    Lesson learned while making it: garbage-collected languages likely fail here, for mainly 2 reasons: 1. The GC will be unaware of GPU memory; a tensor of GBs will be just a few bytes to the GC. 2. GCs make coroutines nearly impossible.

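The first point can be illustrated with a tiny sketch (the `GpuTensor` class and `handle` field are hypothetical, standing in for any wrapper around device memory): to the host GC, a handle to gigabytes of GPU memory is just a tiny object, so the GC feels no pressure to collect it.

```python
# To the GC, a wrapper around GBs of GPU memory is just a few dozen bytes.
import sys

class GpuTensor:
    """Pretend wrapper: `handle` stands in for an opaque device pointer."""
    def __init__(self, nbytes):
        self.handle = 0xDEADBEEF   # opaque device pointer (illustrative)
        self.nbytes = nbytes       # the real footprint lives on the GPU

t = GpuTensor(4 * 1024**3)         # a "4 GB" tensor
print(sys.getsizeof(t))            # tiny: this is all the GC accounts for
```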
  14. Retweeted
    Jan 9

    Recent language models like BERT and ERNIE rely on trendy layers based on transformer networks and require lots of compute. Deep learning researcher shows these layers and enormous GPUs may not be necessary:

  15. Retweeted
    Jan 9
    Replying to

    About it: and especially: . The recent XOR hype has been misunderstood and (wrongly) readapted to the ML world (in fact, people try to read those findings -wrongly- with tools they are familiar with, like Gaussian functions, etc).

  16. Jan 9

    Probably I will keep it in Nim for now, if possible, and just work on a better language-agnostic, high-level interface!

  17. Jan 9

    Wondering if I should rewrite my unreleased framework in C++ or keep it in Nim (assuming ARC is stable)… proved that there is sadly very little interest in such libraries written in less popular languages 😕

  18. Jan 9

    Really: 1 ANN neuron solving XOR, evolved using a GA (solved at generation 0… thanks to the random number generator god). Apparently it found that a “Gaussian” activation was the key, basically just like the recent biological neuron paper hinted, afaik.

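A hedged reconstruction of why a Gaussian activation is the key (not the evolved code; all weights and parameters here are hand-picked for illustration): a Gaussian bump centered on a weighted sum of 1 fires for (0,1) and (1,0) but not (0,0) or (1,1), something no single threshold or sigmoid neuron can do.

```python
# One neuron with a Gaussian activation separates XOR.
import math

def gaussian_neuron(x1, x2, w1=1.0, w2=1.0, mu=1.0, sigma=0.5):
    # Gaussian bump over the weighted sum, centered at mu.
    s = w1 * x1 + w2 * x2
    return math.exp(-((s - mu) ** 2) / (2 * sigma ** 2))

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    y = gaussian_neuron(x1, x2)
    print(x1, x2, round(y, 3), y > 0.5)
```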
  19. Jan 7

    AdderNet and DeepShift... could make tensor cores obsolete (in DeepShift's case)! (btw, they won't last long one way or another)

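A toy sketch of the DeepShift-style idea (my illustration, not the papers' code): round each weight to a signed power of two, so a multiplication becomes a bit shift. The real papers operate on quantized fixed-point values; this shows only the positive-exponent integer case for simplicity.

```python
# Replace multiplication with a bit shift by rounding weights to powers of two.
import math

def to_shift(w):
    # weight ~ sign * 2**p  ->  store (sign, p)
    sign = -1 if w < 0 else 1
    p = round(math.log2(abs(w)))
    return sign, p

def shift_mul(x, sign, p):
    # x * sign * 2**p via a left shift (p >= 0 case only)
    return sign * (x << p)

sign, p = to_shift(4.0)       # weight 4 -> (+1, shift by 2)
print(shift_mul(3, sign, p))  # 3 * 4 computed without a multiply
```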
  20. Jan 6

    Fift language, by 's TON blockchain project. Pretty cool to see a modern inspired language written in C++. Hidden gem: there is a / interpreter written in Fift too!

  21. Jan 6

    More about spinlocks being bad (and the previous article I linked too!), by Linus himself

