Ralph Brooks

@ralphbrooks

Seeking NLP, Deep Learning Consulting Opportunities.

Frisco, TX
Joined May 2009

Tweets


  1. Pinned Tweet
    Jan 24
  2. Retweeted
    Feb 4

    I just published my first post on Medium, "Is the future of Neural Networks Sparse?" Enjoy!

  3. Retweeted
    Jan 29
  4. Retweeted
    Jan 28

    Hidden Markov Models have gotten a bit less love in the age of deep learning, but they are really nifty models that can learn even from tiny datasets. I’ve written a notebook introducing HMMs and showing how to implement them in PyTorch—check it out here:

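    The notebook itself isn't linked in this scrape. As a minimal sketch of the idea, here is a log-space forward algorithm for an HMM in PyTorch; all sizes and parameter names are illustrative, not taken from the notebook.

    ```python
    import torch

    # Minimal HMM sketch: log-space forward algorithm in PyTorch.
    # All sizes and parameter names here are illustrative.
    n_states, n_obs = 3, 5

    log_pi = torch.randn(n_states).log_softmax(0)           # initial state distribution
    log_A = torch.randn(n_states, n_states).log_softmax(1)  # transition matrix
    log_B = torch.randn(n_states, n_obs).log_softmax(1)     # emission matrix

    def log_likelihood(obs):
        """Forward algorithm: log p(obs), marginalizing over hidden state paths."""
        alpha = log_pi + log_B[:, obs[0]]
        for o in obs[1:]:
            # logsumexp over previous states, kept in log space for stability
            alpha = torch.logsumexp(alpha.unsqueeze(1) + log_A, dim=0) + log_B[:, o]
        return torch.logsumexp(alpha, dim=0)

    obs = torch.tensor([0, 3, 1, 4])
    print(log_likelihood(obs))  # differentiable, so the HMM can be fit by gradient descent
    ```
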
  5. Retweeted
    Jan 17

    Google's Reformer paper is a much bigger deal than it is being made out to be. BERT and GPT-2 were not practical given the insane training times/costs. Reformer has the potential to cause a mini NLP revolution over the next few months!

  6. Retweeted
    Jan 28

    This textbook on NLP is just beautiful

  7. Retweeted
    Jan 29

    If you want to go beyond the stuff you will learn in MOOCs, follow this class from (not an easy class if you take it seriously). Video lectures:

  8. Retweeted
    Jan 23

    How to “flex” when you’re a distributed employee: 1. Buy a fancy microphone for Zoom calls - you’ll sound like a talk show host. 2. Add a unique background behind video calls - standard wallpaper is so 2010s. 3. Lighting is everything - pick good light to look like a TV star.

  9. Retweeted
    Jan 21

    Open source alert 🚨 Today we are sharing the code that accelerated BERT inference 17x and allowed us to use the model for web search at scale 🚄 Code is available for both and . Thanks ML for the great collaboration!

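    The announced repository isn't linked in this scrape, so as context only: a common version of this speedup path is exporting a Hugging Face BERT model to ONNX and serving it with onnxruntime. The sketch below shows that general recipe under those assumptions; it is not the tweet's actual code, and the checkpoint name is illustrative.

    ```python
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # General-recipe sketch (not the announced repo): export a BERT classifier
    # to ONNX, then run it with onnxruntime. Checkpoint name is illustrative.
    name = "bert-base-uncased"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(name)
    model.config.return_dict = False  # export plain tuple outputs
    model.eval()

    enc = tokenizer("ONNX export example", return_tensors="pt")
    torch.onnx.export(
        model,
        (enc["input_ids"], enc["attention_mask"]),
        "bert.onnx",
        input_names=["input_ids", "attention_mask"],
        output_names=["logits"],
        dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                      "attention_mask": {0: "batch", 1: "seq"}},
        opset_version=13,
    )

    import onnxruntime as ort

    session = ort.InferenceSession("bert.onnx")
    logits = session.run(["logits"], {
        "input_ids": enc["input_ids"].numpy(),
        "attention_mask": enc["attention_mask"].numpy(),
    })[0]
    print(logits.shape)  # (1, num_labels)
    ```
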
  10. Retweeted
    Jan 21

    NLP community: Interpreting text models with Captum, an open source, extensible library for model interpretability built on PyTorch. Sentiment analysis and interpreting BERT models are covered in the tutorials.

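    Captum's attribution API is compact. Below is a minimal sketch of IntegratedGradients on a toy classifier; the model is a stand-in, not one of the linked tutorials' BERT examples.

    ```python
    import torch
    import torch.nn as nn
    from captum.attr import IntegratedGradients

    # Minimal Captum sketch: attribute a toy classifier's prediction to its
    # input features. The model is a stand-in, not a tutorial model.
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
    model.eval()

    x = torch.rand(1, 4, requires_grad=True)
    ig = IntegratedGradients(model)
    # target=1: explain the logit for class 1.
    attributions, delta = ig.attribute(x, target=1, return_convergence_delta=True)
    print(attributions)  # per-feature contribution to the class-1 logit
    ```
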
  11. Retweeted
    Jan 19
  12. Retweeted

    Google Reformer: Transformer that can process text sequences of lengths up to 1 million words on a single accelerator using only 16GB of memory via

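    Reformer's memory savings come largely from locality-sensitive hashing: each position attends only within its hash bucket rather than over the full sequence. Here is a toy sketch of just the bucketing step (the random-rotation hash described in the paper); sizes are illustrative, and this omits the sorting, chunking, and multiple hash rounds the full model uses.

    ```python
    import torch

    # Toy sketch of Reformer-style LSH bucketing: vectors pointing in similar
    # directions land in the same bucket, so attention can be restricted to
    # within-bucket pairs instead of all O(L^2) pairs. Sizes are illustrative.
    seq_len, d_model, n_buckets = 16, 8, 4

    x = torch.randn(seq_len, d_model)          # query/key vectors (shared in Reformer)
    R = torch.randn(d_model, n_buckets // 2)   # one random rotation

    # Angular LSH: concatenate projections with their negations, take argmax.
    proj = x @ R
    buckets = torch.cat([proj, -proj], dim=-1).argmax(dim=-1)  # bucket id per position
    print(buckets)
    ```
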
  13. Retweeted
    Jan 14

    Happy to release NN4NLP-concepts! It's a typology of important concepts that you should know to implement SOTA NLP models using neural nets. We'll reference this in CMU CS11-747 this year, trying to maximize coverage. 1/3

  14. Retweeted
    Jan 13

    We have open-sourced wav2letter@anywhere, an inference framework for online speech recognition that delivers state-of-the-art performance.

  15. Retweeted
    Jan 13

    I published a new article on the blog: Active Transfer Learning with PyTorch. Read about adapting machine learning models with the knowledge that some data points will later get correct human labels, even if the model doesn't yet know those labels:

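    The article isn't reproduced in this scrape. The selection step behind this kind of active learning is often plain uncertainty sampling; here is a minimal sketch of that step, with a placeholder model and data rather than the article's code.

    ```python
    import torch

    # Minimal uncertainty-sampling sketch: pick the unlabeled examples the model
    # is least confident about, to route to human annotators. The model and data
    # here are placeholders, not the article's code.
    def least_confident(model, unlabeled, k=10):
        model.eval()
        with torch.no_grad():
            probs = torch.softmax(model(unlabeled), dim=-1)
        confidence = probs.max(dim=-1).values      # top-class probability
        return torch.topk(-confidence, k).indices  # k least confident examples

    model = torch.nn.Linear(20, 3)                 # stand-in classifier
    unlabeled = torch.randn(100, 20)               # stand-in unlabeled pool
    to_label = least_confident(model, unlabeled, k=10)
    print(to_label)  # indices to send for human labeling
    ```
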
  16. Retweeted
    Jan 8

    A good listing of tricks for ML research. Combining any two of these could probably get you a paper at a conference. Please don't do that blindly. Slide by .

  17. Retweeted
    Jan 8

    I prepared a new notebook for my Deep Learning class: Joint Intent Classification and Slot Filling with BERT. This is a step-by-step tutorial to build a simple natural language understanding system using the voice assistant dataset (English only).

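    Architecturally, joint intent classification and slot filling is usually one BERT encoder with two heads: a sequence-level head for the intent and a token-level head for slot tags. A minimal sketch of that layout (label counts and names are illustrative, not the notebook's):

    ```python
    import torch.nn as nn
    from transformers import BertModel

    # Sketch of joint intent classification + slot filling: one BERT encoder
    # with a sequence-level head (intent) and a token-level head (slots).
    # Label counts are illustrative.
    class JointBert(nn.Module):
        def __init__(self, n_intents=7, n_slots=40):
            super().__init__()
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            hidden = self.bert.config.hidden_size
            self.intent_head = nn.Linear(hidden, n_intents)  # one label per utterance
            self.slot_head = nn.Linear(hidden, n_slots)      # one label per token

        def forward(self, input_ids, attention_mask):
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            intent_logits = self.intent_head(out.pooler_output)   # [batch, n_intents]
            slot_logits = self.slot_head(out.last_hidden_state)   # [batch, seq, n_slots]
            return intent_logits, slot_logits

    # Training would sum a cross-entropy loss over intents with a
    # per-token cross-entropy loss over slot tags.
    ```
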
  18. Retweeted
    Jan 5

    I'm running a 60-minute webinar blitz on causal modeling in machine learning, with code examples, THIS Thursday. Ideal for applied practitioners interested in connections between the dense causal inference literature and ML tools in practice.

  19. Retweeted
    Dec 26, 2019

    I came up with some tricks to accelerate Transformer-XL. I hope you will find this post useful.

  20. Dec 27, 2019

    I just put together a notebook that gives BERT unseen text and uses the Transformers library to generate sentiment classifications from a SavedModel.

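    The notebook isn't linked in this scrape. The general pattern it describes is exporting a TensorFlow Transformers classifier as a SavedModel and calling it on newly tokenized text; the sketch below shows that pattern. The checkpoint is illustrative, and serving-signature details can vary across transformers versions.

    ```python
    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

    # General-pattern sketch (not necessarily the notebook's exact code):
    # export a TF Transformers sentiment classifier as a SavedModel, reload
    # it, and classify unseen text. The checkpoint is illustrative.
    name = "distilbert-base-uncased-finetuned-sst-2-english"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = TFAutoModelForSequenceClassification.from_pretrained(name)

    # saved_model=True also writes a TF SavedModel alongside the usual weights.
    model.save_pretrained("sentiment_model", saved_model=True)

    reloaded = tf.saved_model.load("sentiment_model/saved_model/1")
    infer = reloaded.signatures["serving_default"]

    enc = tokenizer("I did not expect to enjoy this notebook.", return_tensors="tf")
    out = infer(input_ids=enc["input_ids"], attention_mask=enc["attention_mask"])
    # Output is a dict of tensors; the logits key name depends on the export.
    logits = list(out.values())[0]
    print(tf.argmax(logits, axis=-1))  # 0 = negative, 1 = positive for this checkpoint
    ```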
