Sam Bowman Retweeted Phu Mon Htut
New analysis paper from my group! We zoom in on some of @clark_kev et al.'s findings on syntax-sensitive attention heads in BERT (+RoBERTa, +...), and find interestingly mixed results. https://twitter.com/phu_pmh/status/1199731562046201856
Sam Bowman added,
Phu Mon Htut @phu_pmh
Do Attention Heads in BERT/RoBERTa Track Syntactic Dependencies? We (w. @zhansheng, @Shikh_kgp, @sleepinyourhat) perform an analysis of attention heads of BERT, fine-tuned BERTs, and RoBERTa to answer this: https://medium.com/@phu_pmh/do-attention-heads-in-bert-track-syntactic-dependencies-81c8a9be311a
8:50 AM - 27 Nov 2019
0 replies
12 Retweets
69 Likes
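For readers curious what "analyzing whether attention heads track syntactic dependencies" looks like in practice, here is a minimal sketch, not the authors' released code: it uses the HuggingFace transformers library to pull per-head attention maps from BERT and scores each head by how often a word's most-attended token is its gold dependency head. The sentence and its gold parse are toy examples made up for illustration, and [CLS]/[SEP] handling is simplified.

```python
# Sketch: score each BERT attention head on how often the most-attended
# token matches the gold dependency head. Toy sentence and parse only.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "the dog chased the cat"
# Hypothetical gold dependency heads, 0-indexed into the words above:
# "the"->"dog", "dog"->"chased", "the"->"cat", "cat"->"chased";
# the root ("chased", index 2) is excluded from scoring.
gold_head = {0: 1, 1: 2, 3: 4, 4: 2}

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    # Tuple of per-layer tensors, each (batch, num_heads, seq_len, seq_len).
    attentions = model(**inputs).attentions

# Word i sits at position i + 1 because of the leading [CLS] token
# (each word here happens to be a single WordPiece, keeping indexing simple).
best = (0.0, None, None)
for layer, att in enumerate(attentions):
    for head in range(att.shape[1]):
        correct = sum(
            att[0, head, dep + 1].argmax().item() == head_idx + 1
            for dep, head_idx in gold_head.items()
        )
        acc = correct / len(gold_head)
        if acc > best[0]:
            best = (acc, layer, head)

print(f"best head: layer {best[1]}, head {best[2]}, arc accuracy {best[0]:.2f}")
```

The paper's actual evaluation runs this kind of scoring over full treebanks and several relation types; the sketch only shows the core "most-attended token as predicted head" idea on one sentence.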