Tweets

  1. Retweeted
    Jan 31
  2. Retweeted
    Jan 30

    Had a great time talking about T5 and chatting with students yesterday! I'm holding off on putting the slides online until I finish annotating them; in the meantime, here is a recording of the same talk from when I gave it earlier this month:

  3. Retweeted
    Jan 29

    This was by far the most difficult (and interesting) post I have worked on. Tokenization is a really exciting field of research in and of itself. Loved working with to get it published

  4. Retweeted
    Replying to

    We tie 6 strongly simplified RNN layers into a CPU-optimized transformer hybrid with very little quality loss. We've been doing that in production for two years now; wrote about it only now :)
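    (An entirely hypothetical sketch of this kind of recurrent/attention hybrid appears after this list.)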

  5. Retweeted
    Jan 26

    I have 8 free copies of ML Bookcamp. To get one: follow me and retweet this tweet. ML Bookcamp: a project-based way to learn ML. Get a portfolio and the skills to work as a data scientist or ML engineer. Results on Wednesday.

  6. Retweeted
    Jan 25
    Replying to

    I teach a general ML course with minimal deep learning targeted toward biomedical grad students (not CS), and I teach EM. They struggle, but latent variables are important if you want to do any causal/mechanistic analysis.

  7. Retweeted
    Jan 14

    Transformers are great at modeling sequences, but the attention layer requires L^2 memory, where L is the length of the sequence. Nikita Kitaev, , and Anselm Levskaya show how to decrease memory footprint. Very promising for biological data!
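    (A short sketch verifying the L^2 memory term appears after this list.)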

  8. Retweeted
    Apr 5, 2018

    PS - The prettier DNS for the explorer is now available here:

  9. Retweeted
    Jan 11

    Current image datasets are getting very large and pre-training on them is time consuming. We are releasing Neural Data Server (NDS), a search engine for transfer learning data! Webservice: Paper:

  10. Jan 7

    I like this new coinage of the term "Cognitive automation" rather than "Artificial Intelligence".

  11. Dec 30, 2019

    I think when any field grows and becomes lucrative in terms of money and fame, all the wicked behaviors of human nature come out. ML, DL, AI are in gold rush mode. Such things are not unexpected.

  12. Retweeted
    Dec 30, 2019

    Watching since 2013, it seems there was way less hatred and jealousy, and people used to share and help others learn. Now it's all about attacking others. These days Kagglers try to pick points from others' lives to bully them. No wonder so many legends retired.

  13. Retweeted
    Dec 25, 2019

    Episode 1: Building a machine learning framework. Premieres in 10 hours!

  14. Retweeted
    Dec 14, 2019

    sandesh (संदेश) means "message" in Hindi. This is a simple Python library for sending messages to Slack using webhook URLs. You can install it with "pip install sandesh". Check it out here:
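    (A sketch of the webhook mechanism the library wraps appears after this list.)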

  15. Retweeted
    Dec 10, 2019

    Do you have a core intuition behind dimension reduction? gives a great guide to understanding most dimension reduction techniques: ➗ matrix factorization or 🕸️ neighbour graph ▶️ He will be with us at at
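    (Code illustrating one technique from each family appears after this list.)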

  16. Retweeted
    Dec 9, 2019

    As promised, we have made the Text-To-Text Transfer Transformer (T5) models much easier to fine-tune for new tasks, and we just released a Colab notebook where you can try it yourself on a free TPU! 👇 (1/3)
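    (A minimal fine-tuning sketch, using the Hugging Face port rather than the notebook's code, appears after this list.)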

  17. Retweeted

    Human Learning is not achieved by chance; it must be sought with zeal & worked with diligence. Evolution doesn’t have this flavor on the surface, since it’s concerned with “only” survival. Until you realize how vicious nature can be, also demanding zeal & diligence. Poor 🦖 🦕

  18. Retweeted

    Unsupervised Sentiment Analysis by Rafał Wójcik

  19. Retweeted

    Bernhard Scholkopf () just published a single-author paper titled "Causality for Machine Learning" (); this should probably be at the top of the reading list for many people interested in machine learning / AI.

  20. Retweeted
    Nov 25, 2019

    "Neural Random Forest Imitation," Reinders and Rosenhahn: "Without any additional training data, this transformation [RF --> data --> NN] creates very efficient neural networks that learn the decision boundaries of a random forest."

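A few hedged sketches expanding on the technical items above. First, item 4's recurrent/attention hybrid: the tweet does not specify the architecture, so this PyTorch sketch is entirely hypothetical, with GRU standing in for the unspecified "strongly simplified RNN" layers feeding a single self-attention layer.

```python
# Hypothetical sketch only: stacked lightweight recurrent layers under one
# self-attention layer; the tweet's actual architecture is not specified.
import torch
import torch.nn as nn

class RNNAttentionHybrid(nn.Module):
    def __init__(self, d_model=256, n_rnn_layers=6, n_heads=4):
        super().__init__()
        self.rnn = nn.GRU(d_model, d_model, num_layers=n_rnn_layers,
                          batch_first=True)  # GRU is a stand-in choice
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):                    # x: (batch, seq_len, d_model)
        h, _ = self.rnn(x)
        a, _ = self.attn(h, h, h)
        return self.norm(h + a)              # residual + layer norm

x = torch.randn(2, 128, 256)
print(RNNAttentionHybrid()(x).shape)         # torch.Size([2, 128, 256])
```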
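Item 7's memory claim is easy to verify numerically: standard attention materializes an L x L score matrix, so memory grows quadratically in sequence length. The sizes below are arbitrary choices for illustration.

```python
# Why standard attention is O(L^2) in memory: the score matrix alone
# has L * L entries before softmax and weighting are even applied.
import numpy as np

L, d = 4096, 64                            # sequence length, head dimension
Q = np.random.randn(L, d).astype(np.float32)
K = np.random.randn(L, d).astype(np.float32)

scores = Q @ K.T / np.sqrt(d)              # shape (L, L): the quadratic term
print(scores.shape, scores.nbytes / 1e6, "MB")   # ~67 MB for a single head
```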
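For item 14, I have not verified sandesh's exact API, so rather than guess at it, here is the underlying mechanism such a library wraps: Slack incoming webhooks accept a JSON payload with a "text" field. The webhook URL below is a placeholder.

```python
# Posting to a Slack incoming webhook directly; the sandesh API itself
# may differ from this sketch.
import requests

WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def send(message: str) -> None:
    # Slack incoming webhooks expect JSON with a "text" field.
    resp = requests.post(WEBHOOK_URL, json={"text": message})
    resp.raise_for_status()

send("training finished: val_loss=0.123")
```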
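For item 15's two families, scikit-learn ships a representative of each: PCA factorizes the data matrix, while Isomap, like t-SNE and UMAP, builds a neighbour graph and embeds it. A minimal sketch on random data:

```python
import numpy as np
from sklearn.decomposition import PCA   # matrix-factorization family
from sklearn.manifold import Isomap     # neighbour-graph family

X = np.random.rand(200, 50)             # 200 points in 50 dimensions

X_pca = PCA(n_components=2).fit_transform(X)
X_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(X_pca.shape, X_iso.shape)         # (200, 2) (200, 2)
```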
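Item 16's Colab uses the authors' original TensorFlow/Mesh codebase, which I won't reproduce from memory; the sketch below instead takes one fine-tuning step with the Hugging Face port of T5. The tooling choice and the toy example are assumptions, not the notebook's actual code.

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 casts every task as text-to-text: task prefix + input -> target string.
inputs = tokenizer("translate English to German: The house is wonderful.",
                   return_tensors="pt")
labels = tokenizer("Das Haus ist wunderbar.", return_tensors="pt").input_ids

# One gradient step on the sequence-to-sequence cross-entropy loss.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
print(float(loss))
```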
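Finally, a toy rendering of item 20's RF --> data --> NN pipeline. The paper derives training data from the forest's own structure; this sketch simplifies that to uniform sampling over the input range (my assumption), but the imitation step, training a network on the forest's labels, is the same idea.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Label synthetic inputs with the forest, then train an NN to imitate
# its decision boundary (distillation-style).
X_gen = np.random.uniform(X.min(0), X.max(0), size=(5000, 2))
y_gen = rf.predict(X_gen)

nn = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                   random_state=0).fit(X_gen, y_gen)
print("agreement with the forest:", (nn.predict(X_gen) == y_gen).mean())
```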
