Tweets
Yung-Sung Chuang Retweeted
We're standardizing OpenAI's deep learning framework on PyTorch to increase our research productivity at scale on GPUs (and have just released a PyTorch version of Spinning Up in Deep RL): https://openai.com/blog/openai-pytorch/ pic.twitter.com/lgvqDdWDoB
Yung-Sung Chuang Retweeted
New paper: Towards a Human-like Open-Domain Chatbot. Key takeaways: 1. "Perplexity is all a chatbot needs" ;) 2. We're getting closer to a high-quality chatbot that can chat about anything. Paper: https://arxiv.org/abs/2001.09977 Blog: https://ai.googleblog.com/2020/01/towards-conversational-agent-that-can.html pic.twitter.com/5SOBa58qx3
Yung-Sung Chuang Retweeted
We're releasing mBART, a new seq2seq multilingual pretraining system for machine translation across 25 languages. It gives significant improvements for document-level translation and low-resource languages. Read our paper to learn more: https://arxiv.org/pdf/2001.08210.pdf pic.twitter.com/tJbRcOTqik
Yung-Sung Chuang Retweeted
Music you love, powered by AI

See how the streaming service QQ Music from Tencent uses TensorFlow to manage their extensive music library and enhance user experience!
Learn more → https://goo.gle/2u75wcM pic.twitter.com/zIffUbmHjS
Yung-Sung Chuang Retweeted
Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings → model inputs.
Welcome Tokenizers: ultra-fast & versatile tokenization led by @moi_anthony: -encode 1GB in 20sec -BPE/byte-level-BPE/WordPiece/SentencePiece... -python/js/rust... pic.twitter.com/1TfJ1Hm1xx
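The BPE variants listed in the tweet all share the same core training loop: repeatedly fuse the most frequent adjacent symbol pair. A minimal pure-Python sketch of that loop, for intuition only (the function name and toy word counts are illustrative; the actual library implements this in Rust, which is where the "1GB in 20sec" speed comes from):

```python
from collections import Counter

def byte_pair_merges(word_freqs, num_merges):
    """Learn BPE merge rules from a corpus summarized as
    {tuple_of_symbols: frequency}. Each round fuses the single most
    frequent adjacent pair across the whole vocabulary."""
    vocab = dict(word_freqs)
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rewrite every word with the chosen pair fused into one symbol.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = new_vocab.get(tuple(out), 0) + freq
        vocab = new_vocab
    return merges, vocab
```

At encode time the learned merge list is replayed greedily over new text; byte-level BPE is the same idea with raw bytes as the starting symbols.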
Yung-Sung Chuang Retweeted
me trying to read http://arxiv.org https://twitter.com/Colinoscopy/status/1214987809465012224
Yung-Sung Chuang Retweeted
Do neural networks learn what we think they learn?
@benbenhh reviews research that suggests that they often instead fall prey to the so-called Clever Hans effect and discusses its implications for NLP. #nlp #adversarial https://thegradient.pub/nlps-clever-hans-moment-has-arrived/
Yung-Sung Chuang Retweeted
.@stanfordnlp people’s #ICLR2020 papers #2—ELECTRA: @clark_kev and colleagues (incl. at @GoogleAI) show how to build a much more compute/energy efficient discriminative pre-trainer for text encoding than BERT etc., using replaced token detection instead. https://openreview.net/forum?id=r1xMH1BtvB pic.twitter.com/lFAr6XYSWx
Yung-Sung Chuang Retweeted
.@stanfordnlp people’s #ICLR2020 papers #1—@ukhndlwl and colleagues (incl. at @facebookai) show the power of neural nets learning a context similarity function for kNN in LM prediction—almost 3 PPL gain on WikiText-103—maybe most useful for domain transfer. https://openreview.net/forum?id=HklBjCEKvH pic.twitter.com/5yKRhhjZMr
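The kNN-LM recipe described above boils down to one step: blend the base LM's next-token distribution with a distribution built from retrieved nearest neighbors. A hedged sketch of that interpolation (function name, default λ, and the softmax-over-negative-distances weighting are illustrative stand-ins; the paper retrieves neighbors by context-embedding distance over a large datastore and tunes λ per domain):

```python
import math

def knn_lm_interpolate(p_lm, neighbors, lam=0.25, temperature=1.0):
    """Blend an LM distribution with a kNN distribution.
    p_lm: dict token -> probability from the base LM.
    neighbors: list of (token, distance) pairs retrieved from the
    datastore; closer neighbors get more probability mass."""
    if not neighbors:
        return dict(p_lm)
    # Softmax over negative distances forms the kNN distribution.
    weights = [math.exp(-d / temperature) for _, d in neighbors]
    z = sum(weights)
    p_knn = {}
    for (tok, _), w in zip(neighbors, weights):
        p_knn[tok] = p_knn.get(tok, 0.0) + w / z
    # Linear interpolation: lam * p_kNN + (1 - lam) * p_LM.
    vocab = set(p_lm) | set(p_knn)
    return {t: lam * p_knn.get(t, 0.0) + (1 - lam) * p_lm.get(t, 0.0)
            for t in vocab}
```

The appeal for domain transfer is visible in the shape of the code: the datastore (here, `neighbors`) can be swapped for a new domain without retraining the LM.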
Yung-Sung Chuang Retweeted
10 ML & NLP Research Highlights of 2019
New blog post on ten ML and NLP research directions that I found exciting and impactful in 2019. https://ruder.io/research-highlights-2019/ pic.twitter.com/mPoKbkOcOW
Yung-Sung Chuang Retweeted
I liked the LSH attention in the Reformer https://openreview.net/forum?id=rkgNKkHtvB — sparse, efficient, simple.
Dynamic sparse attn is fascinating & mostly dealt with by
– softmax+topK: Recurrent Independent Mech. (MILA), Product-Key Mem (FB)
– 𝛂-entmax: Adap. Sparse Transformer (DeepSPIN)
links [1/3] pic.twitter.com/T5fxHdIktv
Yung-Sung Chuang Retweeted
It's January 1st, which means...

we can FINALLY leave Python 2 behind!! 
pic.twitter.com/jaIPDTMpBw
Yung-Sung Chuang Retweeted
After getting published in ICLR as an Independent Researcher, I have received nearly 100 messages from others who are looking to do the same. So I wrote a blog post on why I decided to do it and my advice to others. https://medium.com/@andreas_madsen/becoming-an-independent-researcher-and-getting-published-in-iclr-with-spotlight-c93ef0b39b8b?source=friends_link&sk=d930cb061089bb4428f1e6ef06958662
Yung-Sung Chuang Retweeted
How to Read a Paper, by Srinivasan Keshav. http://blizzard.cs.uwaterloo.ca/keshav/home/Papers/data/07/paper-reading.pdf pic.twitter.com/hyZnqBLAjd
Yung-Sung Chuang Retweeted
BERT learns syntax and semantics, but what about real-world and common sense knowledge? Our paper proposes a new way to teach BERT about real-world entities. Congrats @xwhan_, @WilliamWangNLP and @JefferyDuu for the ICLR acceptance. https://openreview.net/forum?id=BJlzm64tDH
Yung-Sung Chuang Retweeted
UMAP is often said to be superior to t-SNE because it is better at preserving global distances. Dmitry and I showed that this is because, by default, t-SNE initializes randomly whereas UMAP initializes with Laplacian eigenmaps. https://twitter.com/hippopedoid/status/1207999178015727616
Yung-Sung Chuang Retweeted
Yes! I got my first big conference paper accepted at ICLR, with spotlight! We improve the previous DeepMind paper "NALU" by 3x-20x. – This took 7-8 months, working without any funding as an independent researcher. Paper: https://openreview.net/forum?id=H1gNOeHKPS Code: https://github.com/AndreasMadsen/stable-nalu pic.twitter.com/7tBivzbyir
Yung-Sung Chuang Retweeted
We spend our time finetuning models on tasks like text classif, NER or question answering. Yet Transformers had no simple way to let users try these fine-tuned models.
Release 2.3.0 brings Pipelines: thin wrappers around tokenizer + model to ingest/output human-readable data. pic.twitter.com/ZcPTXOJsuS
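The "thin wrapper" idea is simple to picture: chain tokenizer → model → human-readable output behind one call. A toy sketch of that shape (the whitespace tokenizer and keyword-counting "model" below are stand-ins for illustration only, not the real library components; with the actual library you would call `transformers.pipeline(...)` with a task name such as "sentiment-analysis"):

```python
class Pipeline:
    """Thin wrapper: raw text in, readable prediction out."""
    def __init__(self, tokenizer, model, labels):
        self.tokenizer = tokenizer
        self.model = model
        self.labels = labels

    def __call__(self, text):
        ids = self.tokenizer(text)      # str -> model inputs
        scores = self.model(ids)        # inputs -> per-class scores
        best = max(range(len(scores)), key=scores.__getitem__)
        return {"label": self.labels[best], "score": scores[best]}

# Stand-in components for the sketch.
def toy_tokenizer(text):
    return text.lower().split()

def toy_model(tokens):
    pos = sum(t in {"great", "good"} for t in tokens)
    neg = sum(t in {"bad", "awful"} for t in tokens)
    total = (pos + neg) or 1
    return [neg / total, pos / total]   # [NEGATIVE, POSITIVE]

clf = Pipeline(toy_tokenizer, toy_model, ["NEGATIVE", "POSITIVE"])
```

The point of the design is that the wrapper owns none of the intelligence; it just standardizes the ingest/output boundary so any tokenizer + model pair can sit behind the same one-line interface.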
Yung-Sung Chuang Retweeted
ALBERT is a new, open-source architecture for natural language processing that achieves state-of-the-art performance on multiple benchmarks with ~30% fewer parameters than #BERT. Learn all about it below: https://goo.gle/2QcWOBs
Yung-Sung Chuang Retweeted
Finally can reveal our #ICLR2020 paper on ELECTRA, much more efficient than existing pretraining, state-of-the-art results; more importantly, trainable with one GPU! Key idea is to have losses on all tokens. Joint work with @clark_kev, @chrmanning, @quocleix. https://openreview.net/forum?id=r1xMH1BtvB https://twitter.com/colinraffel/status/1197064951174533120 pic.twitter.com/2MdLJRMmvz
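The "losses on all tokens" point is the crux: masked language modeling only gets a learning signal at the ~15% of positions that are masked, whereas ELECTRA's discriminator gets a binary replaced/original target at every position. A toy sketch of how those targets are derived (the example sentence and function name are illustrative; real ELECTRA samples the replacements from a small generator network trained jointly):

```python
def replaced_token_labels(original, corrupted):
    """ELECTRA-style discriminator targets: 1 where the generator
    replaced the token, 0 where it kept the original. Every position
    contributes to the loss, not just a masked subset."""
    return [int(o != c) for o, c in zip(original, corrupted)]

# Hypothetical generator output: one plausible-but-wrong replacement.
original = ["the", "chef", "cooked", "the", "meal"]
corrupted = ["the", "chef", "ate", "the", "meal"]
labels = replaced_token_labels(original, corrupted)
```

Here all five positions supply gradient, versus roughly one masked prediction for MLM on a sentence this short; that denser signal is where the compute efficiency claimed in the tweet comes from.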