Hrituraj singh

@hrituraj1997

Researcher, IIT Roorkee. Mostly eat GPUs for breakfast

Bangalore, India
Joined January 2011
Born March 21, 1997

Tweets


  1. Retweeted
    23 hours ago

    In my career, I've worked on speech, NLP, and vision research. Personally, vision projects are most appealing and fun to show off. NLP research is notoriously hard yet most insightful. I feel most satisfied when I work on speech. But I would be bored if I did only one of them.

  2. Retweeted
    Jan 19

    In case you need to motivate research on MT for low resource languages, the diplomatic business case would seem to be clear...

  3. Retweeted
    Jan 18

    The history of AI defeating humans in games, in one chart. (v/)

  4. Retweeted
    Jan 6

    10 ML & NLP Research Highlights of 2019: a new blog post on ten ML and NLP research directions that I found exciting and impactful in 2019.

  5. Retweeted
    Jan 2

    The 2010s were an eventful decade for NLP! Here are ten shocking developments since 2010, and 13 papers* illustrating them, that have changed the field almost beyond recognition. (* in the spirit of and , exclusively from other groups :)).

  6. Retweeted
    Jan 1

    Happy New Decade! In the last decade AI went from niche to mainstream. I wonder what our community will accomplish this decade?

  7. Retweeted
    Dec 16, 2019

    received 3,429 submissions --- a new record for an conference! In fact, this is the first time a *CL conference has received over 3,000 submissions.

  8. Dec 11, 2019
  9. Retweeted
    Dec 9, 2019

    Because there's no way to check whether I filled out the reviewer/author form already, I may have filled it out 2+ times... apologies in advance to coauthors if our papers are rejected over this

  10. Retweeted
    Nov 26, 2019

    Introducing the SHA-RNN :) - Read alternative history as a research genre - Learn of the terrifying tokenization attack that leaves language models perplexed - Get near SotA results on enwik8 in hours on a lone GPU No Sesame Street or Transformers allowed.

    The SHA-RNN is composed of an RNN, pointer-based attention, and a “Boom” feed-forward with a sprinkling of layer normalization. The persistent state is the RNN’s hidden state h as well as the memory M concatenated from previous memories. Bake at 200°F for 16 to 20 hours in a desktop-sized oven.
    The attention mechanism within the SHA-RNN is highly computationally efficient. The only matrix multiplication acts on the query. The A block represents scaled dot-product attention, a vector-vector operation. The operators {qs, ks, vs} are vector-vector multiplications and thus have minimal overhead. We use a sigmoid to produce {qs, ks}. For vs see Section 6.4.
    Bits Per Character (BPC) on enwik8. The single-attention SHA-LSTM has an attention head on the second-to-last layer and had batch size 16 due to lower memory use. Directly comparing the head count for LSTM models and Transformer models obviously doesn’t make sense, but neither does comparing zero-headed LSTMs against bajillion-headed models and then declaring an entire species dead.
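    The attention described in the caption above can be sketched in a few lines. This is a minimal NumPy sketch, not the paper's code: the function and parameter names (`single_head_attention`, `Wq`, `qs_raw`, `ks_raw`) are illustrative, and the value gate `vs` is omitted since the tweet defers it to Section 6.4. The point it illustrates is that the single matrix multiplication acts only on the query, while the gates are cheap elementwise (vector-vector) operations.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def single_head_attention(h, memory, Wq, qs_raw, ks_raw):
        """Single-headed attention sketch.

        h:      (d,)   current hidden state (the query input)
        memory: (T, d) concatenated past states M
        Wq:     (d, d) the only weight matrix, applied to the query
        qs_raw, ks_raw: (d,) learned gate parameters; sigmoid gives qs, ks
        """
        # The only matrix multiplication acts on the query.
        q = sigmoid(qs_raw) * (Wq @ h)            # (d,)
        # Keys are the memory, gated elementwise (vector-vector, minimal overhead).
        k = sigmoid(ks_raw) * memory              # (T, d) via broadcasting
        # Scaled dot-product attention over the memory slots.
        scores = k @ q / np.sqrt(h.shape[0])      # (T,)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                  # softmax over T slots
        # Values left ungated here for simplicity (vs omitted, see Section 6.4).
        return weights @ memory                   # (d,)
    ```

    With a zero query the softmax is uniform and the output is simply the mean of the memory rows, which is a handy sanity check.
    
    
    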
  11. Retweeted

    Today I want to take a break from sharing research to share a personal story instead. It’s a story about my name, why I once decided to quit academia, why I came back, what I learnt from it, and why I’m grateful to have an audience here on Twitter.

