Tweets

Pinned Tweet
RL weaknesses for MT was accepted to @iclr_conf, see you there, or, if you can't wait: https://openreview.net/forum?id=H1eCw3EKvH @AbendOmri @__lfx https://twitter.com/LChoshen/status/1146694121479966721

Beware, fellow submitters: remember to anonymize. https://twitter.com/thegautamkamath/status/1224735338729369600

Do you guys just put #NLProc on everything you say? Are there any other useful hashtags to know?

@tallinzen Thought it might interest you \ that you'd have interesting thoughts.

PS: it all works better on ALBERT, but their suggestion is extreme and has some #meta_learning implications, so it is to be expected.

Something is rotten in the state of probing. An unintuitive way to boost BERT, and many interesting implications. All layers can reconstruct?! https://arxiv.org/pdf/2001.09309.pdf #fresh #deepread @HsiehChunCheng
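For context, the "probing" setup in question trains a small linear classifier on frozen layer representations to test whether a property is encoded there. A minimal sketch of that idea, using synthetic stand-in activations (an assumption for illustration, not the paper's data or model):

```python
# Hedged probing sketch: fit a linear "probe" on frozen representations
# to predict a binary property. Synthetic data stands in for BERT layers.
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 16
labels = rng.integers(0, 2, size=n)              # a binary linguistic property
signal = labels[:, None].astype(float)           # property linearly encoded
reps = signal + 0.5 * rng.normal(size=(n, d))    # noisy "layer" activations

# Least-squares linear probe: find w such that reps @ w ~ labels.
w, *_ = np.linalg.lstsq(reps[:300], labels[:300].astype(float), rcond=None)
preds = (reps[300:] @ w) > 0.5                   # threshold the probe output
acc = (preds == labels[300:].astype(bool)).mean()
```

High probe accuracy is usually read as "the property is encoded in this layer"; the tweet's point is that this inference is shakier than it looks when every layer lets the probe succeed.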
Leshem Choshen Retweeted
Not just for students. I think people generally underestimate how much the system favors native English speakers. Just think about the extra time and effort it costs to write in another language, even if you feel relatively competent. https://twitter.com/tallinzen/status/1221104189780189184

Leshem Choshen Retweeted
The quiet semi-supervised revolution continues https://twitter.com/D_Berthelot_ML/status/1219823580654948353

Leshem Choshen Retweeted
Let me highlight this amazing work I've read recently on #compositionality in NLP, in which you'll find both:
- a deep discussion of what it means for a neural model to be compositional
- a deep and insightful comparison of LSTM, ConvNet & Transformers!
https://arxiv.org/abs/1908.08351 pic.twitter.com/LX9JQE1Ira

More layers give more expressiveness and performance. In which simple tasks (vision included) is the addition of layers really beneficial? Negative example: ResNet shows that going from 50 to 100 layers changes scores from 80 to 80.1.

Good news: I got only 4 papers. Bad news: not even one seems remotely related to my work. 3 embedding papers...

#NLProc guys, ready, set, go. Reviews arrived. How did you fare with the new system?

Leshem Choshen Retweeted
Some of the keywords in paper titles that have seen the most change from NeurIPS 2018 to 2019:
- meta-learning, kernel methods, reinforcement learning are up
- more hardware-aware, more theory-driven
- recurrent & convolutional get little love
Full NeurIPS recap coming soon! pic.twitter.com/vHYoW3LBR7

Did not check it, but it's worth a shot. https://twitter.com/nasmoutiphd/status/1217208098429067265

Public speaking tip (from the second Coursera course): if something doesn't work, don't tell the audience how good it was; it puts emphasis on what they are missing.

Any suggested reading on meta-learning for DL (preferably from an NLP perspective)? Includes: architecture learning, initialization learning, learning to learn, policy/loss learning, etc.

Yet another gender bias avoidance? Not! A real-world (non-CS) study of gender bias in language. https://journals.sagepub.com/doi/abs/10.1177/0956797619890619 @tmalsburg @tpoppels @roger_p_levy

Leshem Choshen Retweeted
Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings → model inputs.
Welcome Tokenizers: ultra-fast & versatile tokenization led by @moi_anthony:
- encode 1GB in 20sec
- BPE/byte-level-BPE/WordPiece/SentencePiece...
- python/js/rust... pic.twitter.com/1TfJ1Hm1xx
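The library's speed comes from its Rust core, but the merge algorithms it implements are conceptually simple. Below is a minimal pure-Python sketch of the BPE training loop (the toy corpus and merge count are illustrative; this is the textbook algorithm, not the library's API):

```python
# Hedged sketch of BPE training: repeatedly merge the most frequent
# adjacent symbol pair across a corpus of space-separated symbols.
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs, weighted by word frequency."""
    pairs = Counter()
    for word, freq in words.items():
        syms = word.split()
        for a, b in zip(syms, syms[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(pair, words):
    """Replace every occurrence of the pair with the fused symbol."""
    a, b = pair
    merged = {}
    for word, freq in words.items():
        syms = word.split()
        out, i = [], 0
        while i < len(syms):
            if i < len(syms) - 1 and syms[i] == a and syms[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(syms[i])
                i += 1
        merged[" ".join(out)] = freq
    return merged

# Toy corpus: words pre-split into characters, with frequencies.
corpus = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}
merges = []
for _ in range(3):
    pair = most_frequent_pair(corpus)
    merges.append(pair)
    corpus = merge_pair(pair, corpus)
```

The learned merge list is the whole "model": at inference time the same merges are replayed on new strings in order, which is why the operation parallelizes so well in a compiled implementation.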
I like this trend a lot; my only worry is that we are barking up the dataset trees (easy) and ignoring the models (hard). Well, actual general-purpose models. https://twitter.com/ddua17/status/1215324852837502977

Leshem Choshen Retweeted
We want better reviewers? As a start, let's make this part of teaching curricula (and I don't just mean using students as secondary reviewers at conferences).