Tweets
:feelsgoodmeme: :ods: :muscle: https://twitter.com/tiulpin/status/1224805474924744713
thx for the blog post btw, u forgot to add Russia BERT model == RuBERT by @deeppavlov :muscle: :ru: https://twitter.com/omarsar0/status/1223945187388424192
viktor trokhymenko Retweeted
This is a list of my top 10 book recommendations for learning the nitty-gritty of NLP and ML by @omarsar0 https://medium.com/dair-ai/top-10-recommended-ml-and-nlp-books-cd5344b36a9d
viktor trokhymenko Retweeted
Python 3.9.0a3 now available for testing https://goo.gl/fb/g3UXhf
viktor trokhymenko Retweeted
Transformers 2.4.0 is out
- Training transformers from scratch is now supported
- New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
- Revamped documentation
- First multi-modal model, MMBT from @facebookai, text & images

Bye bye Python 2
https://github.com/huggingface/transformers/releases
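The from-scratch training mentioned in the release amounts to instantiating a model from a bare config instead of a pretrained checkpoint. A minimal sketch, assuming the `transformers` library is installed (the config sizes below are illustrative, not the library defaults):

```python
# Minimal sketch: a BERT-style model with freshly initialized (random)
# weights, ready for from-scratch masked-language-model training.
# No pretrained checkpoint is downloaded.
from transformers import BertConfig, BertForMaskedLM

config = BertConfig(
    vocab_size=1000,        # illustrative small sizes, not defaults
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
)
model = BertForMaskedLM(config)

# A randomly initialized model still has a full parameter set.
n_params = sum(p.numel() for p in model.parameters())
```

From here the model would be trained on your own corpus with the usual masked-LM objective rather than fine-tuned from a released checkpoint.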
viktor trokhymenko Retweeted
#StyleGAN2 model with @jm_alexia SVM-gan penalty loss, I think, if I did it right ;) pic.twitter.com/2JRM38145g
viktor trokhymenko Retweeted
I had another conversation with Meena just now. It's not as funny and I don't understand the first answer. But the replies to the next two questions are quite funny. pic.twitter.com/lpOZpsvDck
viktor trokhymenko Retweeted
For data science projects, setting up remote computing infrastructure can be more complicated than modeling itself. Thanks @bertBesser and Marcel Mikl for sharing their setup for training ML models in the cloud. https://blog.codecentric.de/en/2020/01/remote-training-gitlab-ci-dvc/
viktor trokhymenko Retweeted
I'm happy to announce our latest work on self-supervised learning for #speech. PASE+ is based on a multi-task approach useful for #speech recognition. It will be presented at #ICASSP2020. Paper: https://arxiv.org/abs/2001.09239 Code: https://github.com/santi-pdp/pase @Mila #deeplearning #AI pic.twitter.com/ynWReEHGn4
viktor trokhymenko Retweeted
Introducing #MeenaBot, a 2.6B-param open-domain chatbot with near-human quality. Remarkably, we show strong correlation between perplexity & humanlikeness! Paper: https://arxiv.org/abs/2001.09977 Sample conversations: https://github.com/google-research/google-research/tree/master/meena https://twitter.com/GoogleAI/status/1222230622355087360 pic.twitter.com/3xNSV4r4uB
viktor trokhymenko Retweeted
New NLP News: NLP Progress, Retrospectives and look ahead, New NLP courses, Independent research initiatives, Interviews, Lots of resources (via @revue) http://newsletter.ruder.io/archive/217744
viktor trokhymenko Retweeted
Wow, more than 200 NLP datasets – this is gold. https://quantumstat.com/dataset/dataset.html Made by @Quantum_Stat Seen: @pandeyparul pic.twitter.com/ZDxazkBLOp
viktor trokhymenko Retweeted
[1/2] Excited to present SMART: Semi-Autoregressive Training for Conditional Masked Language Models. SMART closes the performance gap between semi- and fully-autoregressive MT models, while retaining the benefits of fast parallel decoding. With @omerlevy_ @LukeZettlemoyer
viktor trokhymenko Retweeted
My pet project, AVERAGE ART: averaged faces from classic portraits by art styles. Details: https://medium.com/@altsoph/average-art-a917340cd7fa pic.twitter.com/LTeoAoyMDZ
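The averaging behind a project like this comes down to stacking pre-aligned, same-size images and taking the per-pixel mean. A minimal NumPy sketch (the arrays below are stand-ins for real aligned portraits, not the project's code):

```python
import numpy as np

# Stand-ins for three aligned grayscale portraits
# (same shape, pixel values in [0, 1]).
faces = np.stack([
    np.full((4, 4), 0.0),
    np.full((4, 4), 0.5),
    np.full((4, 4), 1.0),
])

# The "average face" is the per-pixel mean over the stack.
average_face = faces.mean(axis=0)
```

Real portrait averaging also needs the faces aligned first (eyes and mouth registered to common coordinates), or the mean comes out as a blur rather than a face.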
viktor trokhymenko Retweeted
Excited to share PCGrad, a super simple & effective method for multi-task learning & multi-task RL: project conflicting gradients. On Meta-World MT50, PCGrad can solve *2x* more tasks than prior methods. https://arxiv.org/abs/2001.06782 w/ Tianhe Yu, S Kumar, Gupta, @svlevine, @hausman_k pic.twitter.com/uTeUhULUTA
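The "project conflicting gradients" idea can be sketched for two tasks: when the task gradients point in opposing directions (negative dot product), project one onto the normal plane of the other so the conflicting component is removed. A minimal NumPy sketch of that core step, not the authors' implementation:

```python
import numpy as np

def project_conflicting(g_i, g_j):
    """If g_i conflicts with g_j (negative dot product), remove from
    g_i its component along g_j; otherwise return g_i unchanged."""
    dot = np.dot(g_i, g_j)
    if dot < 0:
        g_i = g_i - (dot / np.dot(g_j, g_j)) * g_j
    return g_i

g1 = np.array([-1.0, 1.0])  # task-1 gradient
g2 = np.array([1.0, 0.0])   # task-2 gradient, conflicts with g1
g1_adjusted = project_conflicting(g1, g2)
# After projection, g1_adjusted no longer opposes g2.
```

In the full method this projection is applied pairwise over all task gradients (in random order) before the adjusted gradients are summed for the update.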
viktor trokhymenko Retweeted
Curious what people may mean when they say a neural network is (not) compositional? And how that relates to linguistics and philosophy literature on compositionality? Check our new paper on compositionality in neural networks: https://arxiv.org/pdf/1908.08351.pdf pic.twitter.com/Z9bj1FYKiS
oh yeah, it really looks cool https://twitter.com/jetbrains/status/1217467436850651137
viktor trokhymenko Retweeted
A belated blog post for our BERTology EMNLP paper (by Olga Kovaleva, Alexey Romanov, yours truly and @arumshisky). My favorite experiment in this work is showing that for most GLUE tasks BERT works pretty well even *without pre-training*! https://text-machine-lab.github.io/blog/2020/bert-secrets
viktor trokhymenko Retweeted
I did a deep-dive into @GitHub Pages, and found it's possible to create a *really* easy way to host your own blog: no code, no terminal, no template syntax. I made "fast_template" to pull this together, & a guide showing beginners how to get blogging https://www.fast.ai/2020/01/16/fast_template/
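For a GitHub Pages template like this, setup typically comes down to forking the repository and editing a small Jekyll `_config.yml`. A hypothetical sketch (field values are placeholders; see the linked fast.ai guide for the template's actual settings):

```yaml
# _config.yml — hypothetical minimal blog settings; title and
# description are standard Jekyll fields, values are placeholders.
title: My ML Blog
description: Notes on NLP and machine learning
```

GitHub Pages then builds and serves the site on every push, which is what makes the "no terminal" workflow possible.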
NLP Newsletter #3: Flax, Thinc, Language-specific BERT models, Meena, Flyte, LaserTagger,…