Tweets


  1. Retweeted
    28 Dec 2019

    Reformer: The Efficient Transformer. They present techniques to reduce the time and memory complexity of the Transformer, allowing batches of very long sequences (64K) to fit on one GPU. Should pave the way for the Transformer to be really impactful beyond the NLP domain.

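    The tweet only summarizes the idea; as a rough, hedged illustration of the core trick (locality-sensitive hashing so attention is computed only within hash buckets rather than over all pairs), here is a minimal NumPy sketch of Reformer-style random-rotation bucketing. The function name and shapes are illustrative, not the paper's code.

    ```python
    import numpy as np

    def lsh_buckets(x, n_buckets, seed=0):
        """Assign each row of x (e.g. shared query/key vectors) to a hash bucket.

        Reformer-style random-rotation LSH: project onto n_buckets/2 random
        directions and take the argmax over [h, -h], so vectors pointing in
        similar directions tend to land in the same bucket. Attention is then
        restricted to pairs within a bucket instead of all L^2 pairs.
        """
        rng = np.random.default_rng(seed)
        d = x.shape[-1]
        rotations = rng.normal(size=(d, n_buckets // 2))
        h = x @ rotations
        return np.argmax(np.concatenate([h, -h], axis=-1), axis=-1)

    # Vectors pointing the same way hash to the same bucket:
    v = np.array([[1.0, 0.5, -0.2, 0.3]])
    buckets = lsh_buckets(np.vstack([v, 2 * v]), n_buckets=8)
    ```

    Restricting attention to within-bucket pairs is what brings the quadratic cost down enough for 64K-token sequences.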
  2. Retweeted
    11 Jan

    On the Relationship between Self-Attention and Convolutional Layers. This work shows that attention layers can perform convolution and that they often learn to do so in practice. They also prove that a self-attention layer is as expressive as a convolutional layer.

  3. Retweeted
    6 Jan

    Got my final semester of grades back and I passed! (Barely with all Cs 😂) I'm a big believer though that sometimes caring less about your grades is the BEST thing you can do. Build cool things for fun even if your grades sometimes suffer.

  4. Retweeted
    4 Jan

    * AddisCoder 2020 dates now fixed: July 20 -- August 14 (though TAs should be available starting July 13 for program prep). * Student application is live! Students, please apply at . See for more details.

  5. Retweeted

    ... 3) Please feel free to connect me with your friends and acquaintances. Please DM me. I’m not sure how I’ll publish my compiled work, but I’m hoping to do my bit to preserve a part of our culture that I care deeply about. Thanks for all the help!

  6. Retweeted

    Hello Friends. I’m hoping to undertake a somewhat exhaustive survey of (modern) Ethiopian Art in any/all forms: fine art, photography, music, theater, literature etc. Please help me by suggesting 1) Folks I can read about/talk to/visit 2) Books and other helpful resources ...

  7. 22 Dec 2019

    The Watchmen show was way better than I thought it would be.

  8. Retweeted
    20 Dec 2019

    Yes! I got my first big conference paper accepted at ICLR, with spotlight! We improve the previous DeepMind paper "NALU" by 3x-20x. – This took 7-8 months, working without any funding as an independent researcher. Paper: Code:

  9. 7 Oct 2019

    Favorite drawing that I did for this week.

  10. 29 Sep 2019

    Studies of Roger Radcliffe from 101 Dalmatians.

  11. 29 Aug 2019

    Trying to figure out how to add more details and definition.

  12. 28 Aug 2019
  13. 18 Aug 2019

    First colored animation of a simple broken robot character.

  14. 5 Aug 2019
  15. 21 Jul 2019

    Did a study of the walk cycle of the Knight from .

  16. 30 Jun 2019

    Finally finished my first walk cycle

  17. 23 Jun 2019

    Playing around with ink drawing. Decided to do Tyrion from

  18. 5 Jun 2019

    Did my first bit of , the classic Bouncing Ball Animation.

  19. Retweeted
    25 Apr 2019

    Like honeybadgers, Transformers truly don't care what data to learn. Here's a transformer generating multiple minutes of creative coherent music:

  20. Retweeted
    12 Apr 2019

    In the Transformer architecture (), a Positional Embedding (PE) is added to each word embedding. This allows the model to know the word positions, both absolute and relative. Here's a diagram to understand how it works. 🦎

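    The diagram the tweet links to was not captured here. As a stand-in, this is a small NumPy sketch of the standard sinusoidal positional encoding the tweet describes (the formulation from the original Transformer paper; the function name is mine):

    ```python
    import numpy as np

    def positional_encoding(seq_len, d_model):
        """Sinusoidal positional encoding added to each word embedding.

        PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
        PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))

        Each position gets a unique pattern (absolute position), and the
        encoding at pos + k is a fixed linear function of the encoding at
        pos, which is what exposes relative position to the model.
        """
        pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
        i = np.arange(d_model // 2)[None, :]     # (1, d_model / 2)
        angles = pos / np.power(10000.0, 2 * i / d_model)
        pe = np.zeros((seq_len, d_model))
        pe[:, 0::2] = np.sin(angles)             # even dimensions
        pe[:, 1::2] = np.cos(angles)             # odd dimensions
        return pe

    # The encoding is simply summed with the embeddings:
    # x = word_embeddings + positional_encoding(seq_len, d_model)
    ```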

