Tweets
Faith Yang Retweeted
Let me highlight this amazing work I've read recently on #compositionality in NLP, in which you'll find both:
- a deep discussion of what it means for a neural model to be compositional
- a deep and insightful comparison of LSTM, ConvNet & Transformers!
https://arxiv.org/abs/1908.08351 pic.twitter.com/LX9JQE1Ira
Faith Yang Retweeted
Any time you tell a project maintainer that something is "simple":
1. You're wrong
2. The code change may be small, but there are other process-related details like tests, documentation, etc.
3. You just insulted the maintainer by saying you know better than them
4. You're wrong
Faith Yang Retweeted
How To Be Successful (At Your Career, Twitter Edition)
Faith Yang Retweeted
I liked the LSH attention in the Reformer https://openreview.net/forum?id=rkgNKkHtvB … Sparse, efficient, simple. Dynamic sparse attention is fascinating & mostly dealt with by:
– softmax+topK: Recurrent Independent Mechanisms (MILA), Product-Key Memory (FB)
– α-entmax: Adaptively Sparse Transformer (DeepSPIN)
links [1/3] pic.twitter.com/T5fxHdIktv
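A minimal sketch of the softmax+topK flavor of dynamic sparse attention mentioned above (my illustration, not code from the Reformer or the cited papers; the function name, shapes, and default k are assumptions):

```python
# Sketch: each query attends only to its k highest-scoring keys;
# all other scores are masked to -inf before the softmax.
import torch
import torch.nn.functional as F

def topk_attention(q, k, v, k_top=8):
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5     # (..., Lq, Lk)
    threshold = scores.topk(k_top, dim=-1).values[..., -1:]   # k-th largest per query
    masked = scores.masked_fill(scores < threshold, float("-inf"))
    return F.softmax(masked, dim=-1) @ v   # ties at the threshold may keep a few extras

q = torch.randn(2, 16, 64)  # (batch, queries, dim)
k = torch.randn(2, 32, 64)  # (batch, keys, dim)
v = torch.randn(2, 32, 64)
out = topk_attention(q, k, v, k_top=4)  # (2, 16, 64)
```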
Faith Yang Retweeted
so, here are a bunch of things i find interesting. no particular order. and definitely not comprehensive.
- creative ways to apply massive LMs. Sure we can fine-tune them with extra supervision. What else can we do with them?
Faith Yang Retweeted
New work with Kazuma Hashimoto, @HannaHajishirzi, @RichardSocher, and @CaimingXiong at @SFResearch and @uwnlp! Our trainable graph-based retriever-reader framework for open-domain QA advances state of the art on HotpotQA, SQuAD Open, Natural Questions Open.
1/7 pic.twitter.com/uVoWVEHsAq
Faith Yang Retweeted
“The black box argument is bogus … brains are also black boxes, and we’ve made a lot of progress in understanding how brains work.” Top minds in machine learning predict where AI is going in 2020 https://venturebeat.com/2020/01/02/top-minds-in-machine-learning-predict-where-ai-is-going-in-2020/ … via
@VentureBeat
Faith Yang Retweeted
At the start of 2019, I wrote down every lesson I learned throughout the year. Some are personal, others professional. Some are small, others big. But they were all learned from experience, not clipped from a book. Here are a few.
Faith Yang Retweeted
I wouldn't recommend basing your career on currently popular trends, since these are likely to change by the time you graduate. Instead, figure out what questions/problems most fascinate you and how to make a career out of those. Define your own fields if you have to. https://twitter.com/arkosiorek/status/1211626126134718464 …
Faith Yang Retweeted
“To start a PhD in ML, without insider referral, you need to do work equivalent to half of a PhD. Hence, in Apr 2019, I decided to dedicate all my time until Jan 2020 to publish in either NeurIPS or ICLR. If I fail, I would become a JavaScript programmer.” —
@andreas_madsen
https://twitter.com/andreas_madsen/status/1211329218619092993 …
Faith Yang Retweeted
So many great lessons in Richard Hamming's "You and Your Research" (1986):
- "Continue to plant the little acorns from which the mighty oak trees grow."
- Change a "defect to an asset".
- "Just hard work is not enough—it must be applied sensibly."
https://www.cs.virginia.edu/~robins/YouAndYourResearch.html …
Faith Yang Retweeted
Ever wondered what sound is? How does it get stored inside the computer? I try to answer these and similar questions in this repository https://github.com/earthspecies/from_zero_to_DSP … + make your computer sound like a violin / harpsichord, play sounds directly from @ProjectJupyter NBs, and more! pic.twitter.com/VZ0KFteb1y
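The core answer such material builds on is that digital sound is just amplitude samples taken at a fixed rate. A minimal sketch (illustrative only, not code from from_zero_to_DSP; the sample rate and filename are arbitrary choices):

```python
# Generate one second of a 440 Hz sine wave and store it as 16-bit PCM.
import numpy as np
from scipy.io import wavfile

sample_rate = 44100                       # samples per second (CD quality)
t = np.arange(sample_rate) / sample_rate  # 1 second of time points
wave = 0.5 * np.sin(2 * np.pi * 440 * t)  # 440 Hz = concert A

# Scale [-1, 1] floats to signed 16-bit integers and write a WAV file.
wavfile.write("tone.wav", sample_rate, (wave * 32767).astype(np.int16))
```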
Faith Yang Retweeted
Excited to give a guest lecture at
@full_stack_dl later today: "Machine learning interviews: Lessons from both sides". Here are the slides for those who want to follow along. Feedback & questions welcome! https://docs.google.com/presentation/d/1MX2V6fTp71j1aztvY5HLYM44iLG4HYMrYd4Dxn6Cxnw/edit?usp=sharing …
Faith Yang Retweeted
Useful Paper Writing (and reviewing) tips from
@inkynumbers. #iccv2019 #ICCV19 pic.twitter.com/r3xcONlKM0
Faith Yang Retweeted
Information Bottleneck in action at #emnlp2019! (1) Specializing embeddings for parsing (https://arxiv.org/abs/1910.00163) by Xiang & @adveisner (2) BottleSum: unsupervised & self-supervised summarization (https://arxiv.org/abs/1909.07405) with @PeterWestTM, @universeinanegg, @janmbuys
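For reference, both papers build on the standard Information Bottleneck objective (this formulation is textbook material, not quoted from the tweet): learn a representation T of X that is as compressed as possible while staying predictive of Y.

```latex
% Information Bottleneck: minimize over encoders p(t|x), with beta trading
% off compression I(X;T) against preserved task information I(T;Y).
\min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)
```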
Faith Yang Retweeted
Next week at EMNLP, Aditya Gupta will be presenting his work on "Effective Use of Transformer Networks for Entity Tracking". This paper studies procedural text: descriptions of processes involving complex entity interactions like recipes, scientific processes, etc. 1/n
Faith Yang Retweeted
Exciting work by
@EasonNie (+ @adinamwilliams, @em_dinan, @jaseweston, @douwekiela)! Adversarial NLI, a large dataset collected via a multi-round adversarial (weakness-finding) human-&-model-in-the-loop process; allows a moving/lifelong-learning target for NLU
https://arxiv.org/abs/1910.14599 https://twitter.com/douwekiela/status/1190083615738277888 … pic.twitter.com/eoZpeLIsRq
Faith Yang Retweeted
Excited to share our work on BART, a method for pre-training seq2seq models by de-noising text. BART outperforms previous work on a bunch of generation tasks (summarization/dialogue/QA), while getting similar performance to RoBERTa on SQuAD/GLUE
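To make "pre-training seq2seq models by de-noising text" concrete, here is a toy sketch of the idea (illustrative only; the masking scheme below is a simplification, not BART's exact noising recipe, which includes span infilling, deletion, and permutation):

```python
# Corrupt the input text, then train a seq2seq model to reconstruct
# the original; the (noisy, original) pair is one training example.
import random

def corrupt(tokens, mask_prob=0.3, mask_token="<mask>"):
    """Randomly replace tokens with a mask symbol."""
    return [mask_token if random.random() < mask_prob else t for t in tokens]

source = "the quick brown fox jumps over the lazy dog".split()
noisy = corrupt(source)
print(" ".join(noisy), "->", " ".join(source))  # seq2seq input -> target
```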
Faith Yang Retweeted
PyTorch-Struct (v0.3 https://github.com/harvardnlp/pytorch-struct …). New features: autoregressive models / beam search, sparse-max dp, alignment/dtw, parallel semi-markov, k-max, pretty docs (http://harvardnlp.github.io/pytorch-struct ). Fun example: gradients of time-warping crf under different semirings. pic.twitter.com/HB3umU5jgd
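The "different semirings" remark is the key trick: one chain dynamic program computes different quantities depending on which sum operation you plug in, and gradients through the log semiring give marginals. A toy sketch of the idea in plain PyTorch (my illustration, not the pytorch-struct API):

```python
import torch

def chain_dp(log_potentials, semiring_sum):
    """log_potentials: (T, S, S) edge scores between S states over T steps."""
    alpha = torch.zeros(log_potentials.shape[1])   # start scores
    for edges in log_potentials:                   # (S_prev, S_next) per step
        alpha = semiring_sum(alpha.unsqueeze(1) + edges, dim=0)
    return semiring_sum(alpha, dim=0)

scores = torch.randn(5, 3, 3, requires_grad=True)
log_Z = chain_dp(scores, torch.logsumexp)                  # log-partition
best = chain_dp(scores, lambda x, dim: x.max(dim).values)  # Viterbi score
log_Z.backward()   # d(log Z)/d(score) = edge marginals
print(log_Z.item(), best.item(), scores.grad.shape)
```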
Faith Yang Retweeted
Thoughts after reading the T5 paper of
@colinraffel et al. Thread. An amazing paper (requiring significant compute) that teases apart the effect of various ingredients proposed in Muppetland in the last few months (years?). Some things that stood out / were surprising: