Tweets
-
Lei Li Retweeted
“Meet AdaMod: a new deep learning optimizer with memory” by Less Wright https://link.medium.com/CfAtYwooa3
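For context, AdaMod's "memory" is a third exponential moving average that caps Adam's per-parameter step size by its own long-term average, damping unstable early updates. Below is a minimal NumPy sketch of one update step under that reading of the paper; the function and state names are illustrative, not from the linked post. The state starts as {'t': 0, 'm': 0.0, 'v': 0.0, 's': 0.0}, broadcast against the parameter shape, so early steps are naturally small (a warmup-like effect).

    import numpy as np

    # Sketch of one AdaMod update: Adam moments plus a beta3-EMA of the
    # per-parameter step size that acts as an upper bound ("memory").
    def adamod_step(param, grad, state, lr=1e-3,
                    beta1=0.9, beta2=0.999, beta3=0.999, eps=1e-8):
        state['t'] += 1
        t = state['t']
        state['m'] = beta1 * state['m'] + (1 - beta1) * grad
        state['v'] = beta2 * state['v'] + (1 - beta2) * grad ** 2
        m_hat = state['m'] / (1 - beta1 ** t)   # bias-corrected first moment
        v_hat = state['v'] / (1 - beta2 ** t)   # bias-corrected second moment
        step = lr / (np.sqrt(v_hat) + eps)      # Adam's per-parameter step size
        state['s'] = beta3 * state['s'] + (1 - beta3) * step
        step = np.minimum(step, state['s'])     # clip by the long-term memory
        return param - step * m_hat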
-
Lei Li Retweeted
10 ML & NLP Research Highlights of 2019. New blog post on ten ML and NLP research directions that I found exciting and impactful in 2019. https://ruder.io/research-highlights-2019/ … pic.twitter.com/mPoKbkOcOW
-
together, make Arsenal great again https://twitter.com/Arsenal/status/1209135616333697025 …
-
Lei Li Retweeted
New NLP News: 2020 NLP wish lists, HuggingFace + fastai, NeurIPS 2019, GPT-2 things, Machine Learning Interviews http://newsletter.ruder.io/archive/211277 via @revue
-
Still a lot to explore about GNN models https://twitter.com/xusun26/status/1209108550426869762 …
-
Lei Li Retweeted
Our recent work to be presented at NeurIPS 2019: Understanding and Improving Layer Normalization -- we find that the normalization effects on derivatives are much more important than forward normalization. Paper: https://arxiv.org/abs/1911.07013
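For reference, the forward pass being discussed is standard layer normalization; a minimal NumPy sketch follows. The paper's claim concerns the gradients flowing back through this operation, which the sketch does not show.

    import numpy as np

    # Standard layer normalization: normalize each example over its feature
    # dimension, then apply a learned affine transform (gamma, beta).
    def layer_norm(x, gamma, beta, eps=1e-5):
        mu = x.mean(axis=-1, keepdims=True)
        var = x.var(axis=-1, keepdims=True)
        x_hat = (x - mu) / np.sqrt(var + eps)   # the "forward normalization"
        return gamma * x_hat + beta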
-
Lei Li Retweeted
A simple module consistently outperforms self-attention and the Transformer model on major NMT datasets with SoTA performance. arXiv: https://arxiv.org/abs/1911.09483 code: https://github.com/lancopku/MUSE
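As a rough illustration only: MUSE combines self-attention with convolution so the model captures global and local dependencies in parallel. The PyTorch sketch below shows that general idea under my own simplifications; the class name, layer choices, and combination by summation are assumptions, not the paper's exact architecture (see the repo above for the real code).

    import torch
    import torch.nn as nn

    # Illustrative parallel attention + convolution block (not the paper's
    # exact MUSE layer): a self-attention branch for global dependencies and
    # a depthwise-convolution branch for local ones, summed and projected.
    class ParallelAttentionConv(nn.Module):
        def __init__(self, d_model=512, n_heads=8, kernel_size=3):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                                  padding=kernel_size // 2, groups=d_model)
            self.proj = nn.Linear(d_model, d_model)

        def forward(self, x):                   # x: (batch, seq, d_model)
            a, _ = self.attn(x, x, x)           # global branch
            c = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local branch
            return self.proj(a + c)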
-
Lei Li Retweeted
Introducing the SHA-RNN :)
- Read alternative history as a research genre
- Learn of the terrifying tokenization attack that leaves language models perplexed
- Get near-SotA results on enwik8 in hours on a lone GPU
No Sesame Street or Transformers allowed. https://arxiv.org/abs/1911.11423 pic.twitter.com/RN5TPZ3xWH
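The core idea, very roughly: keep an LSTM backbone and attach a single attention head over its own history instead of stacking multi-head self-attention. A hedged PyTorch sketch of that shape, illustrative only and not Merity's actual model:

    import torch
    import torch.nn as nn

    # Illustrative single-headed-attention RNN block: LSTM states attend to
    # earlier LSTM states through one causal attention head, with a residual.
    class SHABlock(nn.Module):
        def __init__(self, d_model=512):
            super().__init__()
            self.rnn = nn.LSTM(d_model, d_model, batch_first=True)
            self.query = nn.Linear(d_model, d_model)

        def forward(self, x):                   # x: (batch, seq, d_model)
            h, _ = self.rnn(x)
            scores = self.query(h) @ h.transpose(1, 2) / h.size(-1) ** 0.5
            future = torch.triu(torch.ones_like(scores), diagonal=1).bool()
            attn = scores.masked_fill(future, float('-inf')).softmax(dim=-1)
            return h + attn @ h                 # one head, residual connection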
-
Lei Li Retweeted
A simple module consistently outperforms self-attention and the Transformer model on major NMT datasets with SoTA performance. https://www.reddit.com/r/MachineLearning/comments/e13qhb/r_a_simple_module_consistently_outperforms/ …
-
Lei Li Retweeted
slides from my keynote talk at #emnlp2019 https://drive.google.com/file/d/1HGzv6n9hAj-GL63POUZCO6nCrIHF9y35/view?usp=sharing …
-
Lei Li Retweeted
This slide from @JesseDodge's (excellent) @emnlp2019 talk speaks to me on a spiritual level. From: Show Your Work: Improved Reporting of Experimental Results. Jesse Dodge, Suchin Gururangan, Dallas Card, Roy Schwartz and Noah A. Smith. Paper link: https://www.aclweb.org/anthology/D19-1224.pdf … pic.twitter.com/wZL9PzoEDD
-
Lei Li Retweeted
New blog post (...and new blog) on how to build a deep learning framework from scratch in 100 LOC: https://eisenjulian.github.io/deep-learning-in-100-lines … Let me know what you think!
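To give a taste of what such a framework reduces to, here is a from-scratch sketch of reverse-mode autodiff on scalars (my own illustration, not the post's code): each node records its parents and a local backward rule, and backward() applies the chain rule in reverse topological order.

    # Tiny reverse-mode autodiff: a scalar Value node stores its parents and
    # a closure that propagates gradients one step back through the graph.
    class Value:
        def __init__(self, data, parents=()):
            self.data, self.grad = data, 0.0
            self._parents, self._backward = parents, lambda: None

        def __add__(self, other):
            out = Value(self.data + other.data, (self, other))
            def backward():                 # d(a+b)/da = d(a+b)/db = 1
                self.grad += out.grad
                other.grad += out.grad
            out._backward = backward
            return out

        def __mul__(self, other):
            out = Value(self.data * other.data, (self, other))
            def backward():                 # d(a*b)/da = b, d(a*b)/db = a
                self.grad += other.data * out.grad
                other.grad += self.data * out.grad
            out._backward = backward
            return out

        def backward(self):
            order, seen = [], set()
            def topo(v):                    # reverse topological order
                if v not in seen:
                    seen.add(v)
                    for p in v._parents:
                        topo(p)
                    order.append(v)
            topo(self)
            self.grad = 1.0
            for v in reversed(order):
                v._backward()

    # d(x*y + x)/dx = y + 1 = 4 at x = 3, y = 3
    x, y = Value(3.0), Value(3.0)
    z = x * y + x
    z.backward()
    print(x.grad)  # 4.0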
-
Lei Li Retweeted
#CloudTPUs + @TensorFlow are the fastest platform for ML. On the popular (and now quite silly) ResNet-50 ImageNet benchmark, one TPUv3 Pod finishes in 2.2 minutes at full accuracy (76.3% top-1)! Approx 2x faster than extremely large GPU clusters. Details: https://arxiv.org/abs/1811.06992
-
Lei Li Retweeted
"Pattern Recognition and Machine Learning" by
@ChrisBishopMSFT is now available as a free download. Download your copy today for an introduction to the fields of pattern recognition & machine learning: http://aka.ms/prml#ML#Insightspic.twitter.com/XiPYZvCrXl
-
Lei Li Retweeted
A new era of NLP began just a few days ago: large pretrained models (Transformer, 24 layers, 1024 dim, 16 heads) + massive compute is all you need. BERT from @GoogleAI: SOTA results on everything https://arxiv.org/abs/1810.04805 . Results on SQuAD are just mind-blowing. Fun time ahead! pic.twitter.com/1phsCZpqWR
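Purely as a latter-day illustration (not part of the tweet): pretrained BERT can be queried as a masked language model in a few lines via the Hugging Face transformers library.

    from transformers import pipeline

    # Load pretrained BERT as a fill-mask pipeline and complete a masked
    # sentence; prints the top predicted tokens with their scores.
    fill = pipeline("fill-mask", model="bert-base-uncased")
    for pred in fill("The goal of pretraining is to learn good [MASK]."):
        print(pred["token_str"], round(pred["score"], 3))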
-
Long time no see! I'm online again just for a Twitter dataset QAQ
-
"Why fear that truth is endless? Every inch of progress brings its own inch of joy." (怕什么真理无穷，进一寸有一寸的欢喜。) Hello, Twitter!