Tweets
Xiang Zhou Retweeted
An Opinionated Guide to ML Research: “To make breakthroughs with idea-driven research, you need to develop an exceptionally deep understanding of your subject, and a perspective that diverges from the rest of the community—some can do it, but it’s hard.” http://joschu.net/blog/opinionated-guide-ml-research.html pic.twitter.com/fyO6cyr9im
-
Xiang Zhou Retweeted
Here's a thread surveying some 'classic' work on #compositionality. Lots of people seem to be discussing this right now, but with partial references to the whole story. My aim is to highlight some of the philosophical and psychological issues in the history of the concept. 1/
-
Xiang Zhou Retweeted
these are some of my nlp wishlists for 2020. what are yours? https://twitter.com/yoavgo/status/1205987145112051713
-
Xiang Zhou Retweeted
the kid on the bike claims sota, the one with the goggles is analyzing the model, and nobody is moving forward #nlproc https://twitter.com/akkitwts/status/1204040275934318592 [Tweet is unavailable.]
-
Xiang Zhou Retweeted
we often under-teach the process (vs the techniques) of doing research. here is how I currently advise my PhD students. work in progress. https://medium.com/@paul.niehaus/doing-research-18cb310529e0
-
Xiang Zhou Retweeted
I'm starting a professorship in the CS department at UNC in fall 2020 (!!) and am hiring students! If you're interested in doing a PhD @unccs please get in touch. More info here: https://cs.unc.edu/admissions/graduate/graduate-programs/
-
Xiang Zhou Retweeted
New tech report with Junghyun Min and
@TalLinzen: "BERTs of a feather do not generalize together" Across 100 re-runs, BERT fine-tuned on MNLI has a consistent score on MNLI but extreme variation in syntactic generalization (measured w/ HANS). Link: https://tommccoy1.github.io/BERTs_of_a_feather.pdf 1/7 pic.twitter.com/oHNWU9rRnB
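The contrast described above (stable in-distribution score, wild swings out of distribution) is just a spread statistic over re-runs. A minimal sketch of how to quantify it; the numbers here are invented purely for illustration, not taken from the paper:

```python
import statistics

# Hypothetical accuracies from repeated fine-tuning runs with different
# random seeds (illustrative values only; the paper uses 100 re-runs).
mnli_acc = [0.842, 0.839, 0.841, 0.840, 0.843]   # in-distribution: stable
hans_acc = [0.09, 0.52, 0.18, 0.71, 0.04]        # HANS subcase: high variance

for name, scores in [("MNLI", mnli_acc), ("HANS", hans_acc)]:
    print(f"{name}: mean={statistics.mean(scores):.3f} "
          f"stdev={statistics.stdev(scores):.3f}")
```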
-
Xiang Zhou Retweeted
New paper! We perform a systematic study of transfer learning for NLP using a unified text-to-text model, then push the limits to achieve SoTA on GLUE, SuperGLUE, CNN/DM, and SQuAD. Paper: https://arxiv.org/abs/1910.10683 Code/models/data/etc: https://git.io/Je0cZ Summary (1/14) pic.twitter.com/VP1nkkHefB
-
Xiang Zhou Retweeted
Repeat 1-3 for 6 years, or until PhD is cooked through: 1. "What an interesting problem! Surely the model will learn intricate patterns..." 2. Machine learning derives spurious solution unrelated to underlying phenomenon 3. Tweak for 6+ months; remain faithful to original goal
-
Xiang Zhou Retweeted
Who said that training GPT-2 or BERT was expensive? "We use 512 Nvidia V100 GPUs [...] Upon the submission of this paper, training has lasted for three months [...] and perplexity on the development set is still dropping." https://openreview.net/forum?id=Bkl8YR4YDB
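For a sense of the scale being quoted, a rough back-of-the-envelope calculation (approximating "three months" as 90 days):

```python
gpus = 512           # Nvidia V100s, per the quoted paper
days = 90            # "three months", approximated
hours_per_day = 24

# Total accelerator time consumed so far.
gpu_hours = gpus * days * hours_per_day
print(gpu_hours)     # 1105920 GPU-hours
```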
-
Xiang Zhou Retweeted
New paper http://arxiv.org/abs/1909.09428 with Phil Blunsom and
@GaborMelis showing that regular LSTMs can "learn syntax" as well as the Ordered Neurons LSTMs of Shen et al. (ICLR 2019) ... but that's only because the "PRPN" parsing algorithm is biased. 1/2
-
Xiang Zhou Retweeted
Our @emnlp2019 paper "Addressing Semantic Drift in QG for Semi-Supv QA" (w.
@mohitban47 @UNCNLP): (1) we improve QG via 2 semantic-rewards (Ques-Paraphr + QA-Prob) (2) propose QA-Eval for QG as NLG metric (3) augment QA datasets by generating ques from existing/new articles.
1/2 pic.twitter.com/tmNNkEpfTM
-
Xiang Zhou Retweeted
Presenting LXMERT at @EMNLP2019 --> https://arxiv.org/abs/1908.07490 (prnc. 'leksmert'). Top3 in GQA & VQA challenges (May2019), Rank1 in VizWiz, & v.strong generalzn to NLVR2 (22% abs jump)! Awesome effort by
@HaoTan5! CODE+MODELS all public: https://github.com/airsplay/lxmert ; pls use+share! 1/2 pic.twitter.com/WvxRirYGoB
-
Xiang Zhou Retweeted
Wow. Dan Povey is leaving Hopkins http://danielpovey.com/leaving.html
-
Xiang Zhou Retweeted
Do you ever have a model that uses
@PyTorch and one that uses @TensorFlow, and you want to combine the two for end-2-end training without rewriting either? TfPyTh allows you to plug one into the other while propagating gradients for training
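Setting TfPyTh's actual API aside, the core trick behind any such bridge is to expose the foreign framework's computation to the host as an opaque op with a forward pass and a vector-Jacobian product (VJP), so the host's chain rule can run through it. A framework-free Python sketch of that idea (all names here are illustrative, not TfPyTh's interface):

```python
def foreign_square(x):
    """Stand-in for a graph living in the other framework: y = x**2.
    Returns the forward value plus a VJP closure for the backward pass."""
    y = x * x
    def vjp(grad_y):            # backward: dL/dx = dL/dy * dy/dx
        return grad_y * 2 * x
    return y, vjp

def host_loss(x):
    """Host-framework computation: loss = 3 * foreign_square(x)."""
    y, vjp = foreign_square(x)  # cross the framework boundary (forward)
    loss = 3.0 * y
    grad_y = 3.0                # dL/dy, computed in the host framework
    grad_x = vjp(grad_y)        # propagate gradient through the foreign op
    return loss, grad_x

loss, grad = host_loss(2.0)
print(loss, grad)               # 12.0 12.0
```

In TfPyTh the same boundary crossing is handled for you: the wrapped graph's forward and backward are registered as a custom differentiable function in the host framework.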
Code: https://github.com/BlackHC/TfPyTh
-
Xiang Zhou Retweeted
Think of it differently: the bigger models + bigger data trajectory is known, kinda boring, and will be done anyways at big-cos, who have plenty of capable people already. so what's your innovation potential there?
-
Xiang Zhou Retweeted
Folks at
@NAACLHLT, we have multiple POSTDOC opportunities in @UNCNLP! Pls spread the word and ping me to chat about our several new projects/collaborations/hires
Also, check out 3 talks by #UNCNLP students/collaborators & SpLU-RoboNLP workshop (see next tweet): #NAACL2019 https://twitter.com/mohitban47/status/1052367767708622848
-
Xiang Zhou Retweeted
Overfitting in machine learning due to data set reuse turned out to be less of a problem than feared. There are at least four pieces to the puzzle that explain why. Thread.
-
Xiang Zhou Retweeted
Introducing Remote Development for
@code

A new set of extensions that enable you to open any folder in a container, on a remote machine, or in the Windows Subsystem for Linux (WSL) and take advantage of VS Code's full feature set. #remote
https://aka.ms/vscode-remote/blog
-
Xiang Zhou Retweeted
HUGE NEWS: “For the first time, this study demonstrates that we can generate entire spoken sentences based on an individual’s brain activity."
#science https://www.ucsf.edu/news/2019/04/414296/synthetic-speech-generated-brain-recordings?utm_source=ucsf_tw&utm_medium=tw&utm_campaign=2019_speech_prosthesis&utm_term=