Tweets by @mgrankin

  1. Retweeted a Tweet
    5 hours ago

    "How to do machine learning efficiently". There's so much to love about this wonderful article.

  2. Retweeted a Tweet

    Pop Music Transformer: Generating Music with Rhythm and Harmony. Google drive full of samples and a pretrained model in the repo. I chose two randomly - these are listed as prompted samples. h/t abs: repo:

  3. Retweeted a Tweet
    15 hours ago
  4. Retweeted a Tweet
    23 hours ago

    In January, , , and I ran a short class at on topics we think are missing in most CS programs — tools we use every day that everyone should know, like bash, git, vim, and tmux. And now the lecture notes and videos are online!

  5. Retweeted a Tweet
    20 hours ago

    "We achieved an overall accuracy of 94.5%, more than 4.5% of an increase on the previous state-of-the-art"; classifying the patterns in 18,577 Scanning Electron Microscope images

  6. Retweeted a Tweet
    Feb 3

    Added ImageNet validation results for 164 pretrained models on several datasets, including ImageNet-A, ImageNetV2, and ImageNet-Sketch. No surprise, models with exposure to more data do quite well. Without extra data, EfficientNets are holding their own.

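Comparing pretrained models across ImageNet variants ultimately comes down to computing top-1 accuracy over the models' logits. A minimal sketch of that metric in plain Python (the scores and labels below are toy values, not from the benchmark):

```python
def top1_accuracy(logits, labels):
    """Fraction of examples whose highest-scoring class matches the label.

    `logits` is a list of per-class score lists; `labels` the true class ids.
    """
    correct = sum(
        1 for scores, y in zip(logits, labels)
        if max(range(len(scores)), key=scores.__getitem__) == y
    )
    return correct / len(labels)

# Toy scores for 3 examples over 4 classes (illustrative only).
logits = [[0.1, 2.0, 0.3, 0.0],   # predicts class 1
          [1.5, 0.2, 0.1, 0.4],   # predicts class 0
          [0.0, 0.1, 0.2, 3.0]]   # predicts class 3
labels = [1, 0, 2]
print(top1_accuracy(logits, labels))  # 2 of 3 correct
```

Running the same loop over each dataset variant (ImageNet-A, ImageNetV2, ImageNet-Sketch) is how the robustness gap between models becomes visible.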
  7. Retweeted a Tweet
    Feb 3

    Check out BREAK - a new NLU benchmark for testing the ability of models to break down a question into the required steps for computing its answer. A work by Tomer Wolfson, accepted to TACL 2020.

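The benchmark represents each question as an ordered sequence of steps, where later steps can reference earlier results. The decomposition below is loosely modeled on that idea; the exact question and step format here are illustrative, not taken from the dataset:

```python
import re

# Illustrative only: a QDMR-style decomposition, where each step may
# reference the result of an earlier step via '#<n>'.
question = "How many blue cubes are there?"
steps = [
    "return cubes",
    "return #1 that are blue",
    "return number of #2",
]

def referenced_steps(step):
    """Indices (1-based) of the earlier steps a step refers to."""
    return [int(m) for m in re.findall(r"#(\d+)", step)]

print([referenced_steps(s) for s in steps])  # [[], [1], [2]]
```

Checking these back-references is one simple way to validate that a predicted decomposition forms a well-ordered computation.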
  8. Retweeted a Tweet
    Feb 2
    Replying to several users:

    Each downstream package has tests and they are run at package build time. Those tests could be rerun (without rebuilding) whenever a new version of an upstream dependency is released to continuously tag a Known Good Set of packages that work well together. The cost can be high.

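The expensive part of that scheme is knowing which test suites to rerun when an upstream package releases: every transitive reverse dependency. A minimal sketch of that walk, with a hypothetical dependency graph (package names are illustrative):

```python
from collections import defaultdict

# Hypothetical dependency graph: package -> upstream dependencies.
deps = {
    "pandas": ["numpy"],
    "scikit-learn": ["numpy", "scipy"],
    "scipy": ["numpy"],
}

def downstream_of(released, deps):
    """All packages that (transitively) depend on `released`,
    i.e. whose tests should be rerun on a new release."""
    rdeps = defaultdict(set)
    for pkg, ups in deps.items():
        for up in ups:
            rdeps[up].add(pkg)
    todo, seen = [released], set()
    while todo:
        for d in rdeps[todo.pop()]:
            if d not in seen:
                seen.add(d)
                todo.append(d)
    return seen

print(sorted(downstream_of("numpy", deps)))
# ['pandas', 'scikit-learn', 'scipy']
```

Tagging a Known Good Set then means recording the exact version pins for which every suite in this set passed.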
  9. Retweeted a Tweet
    Feb 2

    Some people asked about DAIN. It's Depth-Aware Frame Interpolation. I like to try low frame-rate sources like 16fps 8mm family films. Footage is supposedly anonymous but that sure looks like Stan Lee... project: Original Footage:

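DAIN itself warps frames using optical flow weighted by estimated depth; as a point of contrast, the simplest possible frame interpolation just blends adjacent frames per pixel. A toy baseline (grayscale frames as nested lists, values illustrative):

```python
def blend_frames(f0, f1, t=0.5):
    """Naive frame interpolation: per-pixel linear blend between two
    frames at time t in [0, 1]. Depth-aware methods like DAIN instead
    warp pixels along motion; this is only the trivial baseline that
    produces ghosting on anything that moves."""
    return [[(1 - t) * a + t * b for a, b in zip(r0, r1)]
            for r0, r1 in zip(f0, f1)]

frame0 = [[0, 0], [100, 100]]
frame1 = [[100, 100], [0, 0]]
mid = blend_frames(frame0, frame1)
print(mid)  # [[50.0, 50.0], [50.0, 50.0]]
```

The blended result averages the moving content away, which is exactly why motion- and depth-aware interpolation is needed for low-frame-rate footage.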
  10. Retweeted a Tweet
    Feb 3

    This repo is full of amazing awesomeness. I don't know of anything else like it. Independent, refactored, carefully tested implementations of modern CNNs.

  11. Retweeted a Tweet
    Feb 3

    Given that data loading can be a major bottleneck in many DL projects, this sounds like an interesting project to check out: "Accelerating Pytorch with Nvidia DALI" --> "on small models it's ~4X faster than the Pytorch dataloader"

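The core idea behind pipelines like DALI is overlapping data loading with compute so the accelerator is never starved. A minimal pure-Python sketch of that prefetching pattern (the sleep stands in for slow decoding/augmentation; DALI's actual implementation runs this on the GPU):

```python
import queue
import threading
import time

def prefetching_loader(batches, buffer_size=2):
    """Yield batches loaded by a background thread, so (simulated)
    decoding overlaps with whatever the consumer does with each batch.
    This is only the concept, not the DALI API."""
    q = queue.Queue(maxsize=buffer_size)
    SENTINEL = object()

    def producer():
        for b in batches:
            time.sleep(0.01)  # pretend this is slow I/O + augmentation
            q.put(b)
        q.put(SENTINEL)

    threading.Thread(target=producer, daemon=True).start()
    while True:
        item = q.get()
        if item is SENTINEL:
            return
        yield item

print(list(prefetching_loader(range(5))))  # [0, 1, 2, 3, 4]
```

On small models the forward/backward pass is cheap, so the loader dominates wall-clock time, which is why a faster pipeline shows ~4x gains there.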
  12. Retweeted a Tweet
    Jan 31

    I highly recommend checking out the lecture series from "Full Stack Deep Learning" on YouTube.

  13. Retweeted a Tweet
    Jan 31

    Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to fewer errors and faster model execution. Read about it below.

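The edit-operation idea can be sketched as tagging each source token with an operation and then realizing the tags. The toy tag set below (KEEP, DELETE, KEEP with an inserted phrase) is illustrative; the announced method's actual tag vocabulary and model differ:

```python
def apply_edits(tokens, tags):
    """Realize an edit-tagged sentence. Each tag is 'KEEP', 'DELETE',
    or 'KEEP|<phrase>' meaning: insert <phrase> before the kept token.
    (Toy version of edit-based generation; the real tag set differs.)"""
    out = []
    for tok, tag in zip(tokens, tags):
        if tag == "DELETE":
            continue
        if "|" in tag:
            _, phrase = tag.split("|", 1)
            out.append(phrase)
        out.append(tok)
    return " ".join(out)

# Sentence fusion: merge two sentences into one.
tokens = "Turing was born in 1912 . He died in 1954 .".split()
tags = ["KEEP"] * 5 + ["DELETE", "DELETE", "KEEP|and he"] + ["KEEP"] * 3
print(apply_edits(tokens, tags))
# Turing was born in 1912 and he died in 1954 .
```

Because most output tokens are copied rather than generated, the model cannot hallucinate content for them, which is where the error reduction comes from.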
  14. Retweeted a Tweet
    Jan 31

    If you're running XGBoost, LightGBM, or other forest-based models in production, you need to check out our new forest inference library: 40x faster predictions, cheaper than CPU, and way less rack space.

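Part of why dedicated forest inference can be so fast is storing trees as flat arrays that are traversed without pointer chasing, which batches well on GPUs. A minimal sketch of that layout; the node encoding here is hypothetical, not the library's actual format:

```python
# Each node: (feature, threshold, left, right) for splits, or
# ('leaf', value, -1, -1) for leaves. A flat array of fixed-size
# records is cache- and GPU-friendly -- roughly the layout idea
# behind forest inference libraries (details are illustrative).
TREE = [
    (0, 0.5, 1, 2),          # node 0: x[0] < 0.5 ? go to 1 : go to 2
    ("leaf", -1.0, -1, -1),  # node 1
    ("leaf", +1.0, -1, -1),  # node 2
]

def predict_tree(tree, x):
    i = 0
    while True:
        feat, thresh, left, right = tree[i]
        if feat == "leaf":
            return thresh
        i = left if x[feat] < thresh else right

def predict_forest(trees, x):
    # Average the per-tree outputs (a common ensemble reduction).
    return sum(predict_tree(t, x) for t in trees) / len(trees)

print(predict_forest([TREE, TREE], [0.2]))  # -1.0
```

In a real deployment every tree in the ensemble is evaluated this way for every row of a batch, so the per-node traversal cost dominates and the data layout matters enormously.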
  15. Retweeted a Tweet
    Jan 28
    Replying to a user

    Even my continued fiddling with the SHA-RNN model shows there's a _lot_ to be studied and explored. I haven't published new incremental progress but you can tie the RNN across the 4 layers to substantially decrease total params yet get nearly equivalent perplexity results.

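The parameter arithmetic behind tying is simple: sharing one layer's weights across a 4-layer stack divides the stack's parameter count by 4. A rough sketch with hypothetical dimensions (real SHA-RNN layers are more elaborate than this simple-RNN cost model):

```python
def rnn_params(d_model, n_layers, tied):
    """Rough parameter count for a stack of simple-RNN layers, each
    with an input matrix, a recurrent matrix, and a bias
    (d x d + d x d + d). Tying shares one layer's weights across the
    whole stack. (Illustrative arithmetic only.)"""
    per_layer = d_model * d_model * 2 + d_model
    return per_layer if tied else per_layer * n_layers

d = 1024
print(rnn_params(d, 4, tied=False))  # 4 layers of separate weights
print(rnn_params(d, 4, tied=True))   # 4x fewer parameters
```

That the tied model reaches nearly equivalent perplexity suggests much of the stack's capacity is redundant, which is the point being made.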
  16. Retweeted a Tweet
    Jan 28

    New paper: Towards a Human-like Open-Domain Chatbot. Key takeaways: 1. "Perplexity is all a chatbot needs" ;) 2. We're getting closer to a high-quality chatbot that can chat about anything. Paper: Blog:

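Perplexity, the metric the takeaway leans on, is just the exponential of the mean negative log-likelihood the model assigns to the observed tokens. A minimal sketch (token probabilities below are toy values):

```python
import math

def perplexity(token_probs):
    """exp of the mean negative log-likelihood over observed tokens;
    lower is better. A model that predicts each token with probability
    1/k has perplexity k."""
    nll = [-math.log(p) for p in token_probs]
    return math.exp(sum(nll) / len(nll))

print(perplexity([0.25, 0.25, 0.25, 0.25]))  # ~4.0 (uniform over 4 options)
print(perplexity([1.0, 1.0]))                # 1.0 (a perfect model)
```

The paper's finding is that this single automatic number correlates well with human judgments of conversation quality, which is what makes the quip meaningful.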
  17. Retweeted a Tweet
    Jan 29

    New blog post: Contrastive Self-Supervised Learning. Contrastive methods learn representations by encoding what makes two things similar or different. I find them very promising and go over some recent works such as DIM, CPC, AMDIM, CMC, MoCo etc.

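The objective shared by many of these methods (CPC, MoCo, and relatives) is an InfoNCE-style loss: cross-entropy of picking the positive pair among one positive and several negatives by similarity. A minimal per-anchor sketch in plain Python (similarity values are toy numbers):

```python
import math

def infonce_loss(sim_pos, sim_negs, temperature=0.1):
    """InfoNCE-style contrastive loss for a single anchor: softmax
    cross-entropy over [positive] + negatives, scaled by temperature.
    (Minimal sketch of the shared objective, not any one paper's code.)"""
    logits = [sim_pos / temperature] + [s / temperature for s in sim_negs]
    m = max(logits)  # subtract max for numerical stability
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(logits[0] - log_z)

# The loss drops as the positive becomes more similar than the negatives.
hard = infonce_loss(0.5, [0.4, 0.45])
easy = infonce_loss(0.9, [0.1, 0.0])
print(hard > easy)  # True
```

Minimizing this pushes representations of "similar" views together and "different" ones apart, which is exactly the encoding-of-similarity framing in the post.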
  18. Retweeted a Tweet
    Jan 29

    Amazing work. PS: we're all so screwed.

  19. Retweeted a Tweet
    Jan 28

    has officially released the Dataset discovery engine that helps data scientists find useful datasets in a matter of a few clicks. If you have not tried it yet, check it out. Fuel your models with a massive amount of data now.

  20. Retweeted a Tweet
    Jan 28

    Check out Meena, a new state-of-the-art open-domain conversational agent, released along with a new evaluation metric, the Sensibleness and Specificity Average, which captures basic, but important attributes for normal conversation. Learn more below!

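The metric itself is straightforward: human raters give each chatbot response two binary labels, sensible and specific, and SSA averages the two rates. A minimal sketch of that computation (the labels below are toy values; the crowdsourcing protocol is described in the Meena paper):

```python
def ssa(labels):
    """Sensibleness and Specificity Average: `labels` holds
    (sensible, specific) binary judgments per response; SSA is the
    mean of the two per-response rates."""
    sensible = sum(s for s, _ in labels) / len(labels)
    specific = sum(p for _, p in labels) / len(labels)
    return (sensible + specific) / 2

# (sensible, specific) judgments for four responses:
print(ssa([(1, 1), (1, 0), (1, 1), (0, 0)]))  # 0.625
```

Requiring specificity, not just sensibleness, penalizes safe but vacuous replies like "I don't know", which generic chatbots otherwise overuse.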
