I read this a while ago and thought it was a joke. Is this actually serious? I loved the "take that, Sesame Street" line.
-
I may have gone a little mad on the prose, but the model architecture and results are entirely real: an LSTM with a single head of attention is at least competitive with Transformer architectures :)
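For anyone curious what that combination looks like in practice, here is a rough sketch of the idea: an LSTM whose outputs feed a single head of causal dot-product attention. This is a simplified illustration, not the paper's actual model code, and every name and hyperparameter below is made up.

```python
# Minimal sketch (assumption: PyTorch): an LSTM layer followed by one head of
# scaled dot-product attention over its own outputs. All names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SingleHeadAttentionLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # A single head: one query/key/value projection, no multi-head split.
        self.q_proj = nn.Linear(hidden_dim, hidden_dim)
        self.k_proj = nn.Linear(hidden_dim, hidden_dim)
        self.v_proj = nn.Linear(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))           # (batch, seq, hidden)
        q, k, v = self.q_proj(h), self.k_proj(h), self.v_proj(h)
        scores = q @ k.transpose(-2, -1) / (h.size(-1) ** 0.5)
        # Causal mask: each position may attend only to itself and the past.
        seq_len = tokens.size(1)
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
        context = F.softmax(scores, dim=-1) @ v
        return self.out(h + context)                   # residual around attention

# Usage: next-token logits for a toy batch of 2 sequences of length 16.
model = SingleHeadAttentionLSTM(vocab_size=1000, embed_dim=64, hidden_dim=128)
logits = model(torch.randint(0, 1000, (2, 16)))        # shape (2, 16, 1000)
```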
1 more reply
New conversation
-
"The attention mechanism is also readily extended to large contexts with minimal computation. Take that Sesame Street. " This was their reaction: https://i.imgur.com/SzhmLOg.jpg
-
I like the tone of the abstract. Maybe I'll read it further. :)
-
The whole paper is hilarious, but the results are pretty solid. Perhaps this is the new fashion for ML manuscripts? Make sure you catch the Star Wars and Minecraft bits.
End of conversation
New conversation
-
good job
-
Dumb question... what is the evergreen wikitext initiative? I can't find any reference to it anywhere. From the short description in the paper it sounds very interesting.
-
Best Abstract Ever!