Could be of interest to your work: https://twitter.com/zehavoc/status/1137111006000439296?s=19 …
Thank you! It's indeed related to our work, and we'll add a reference to it in a future version.
tl;dr: BERT and RoBERTa have some specialized attention heads that track specific dependency types, but they do not have generalist heads that can do holistic parsing
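For readers who want to poke at this themselves, here is a minimal sketch (not the thread authors' code) of how per-head attention maps can be inspected for dependency-like behavior. It assumes the HuggingFace transformers library and bert-base-uncased; the specific layer/head indices are arbitrary examples chosen for illustration.

```python
# Minimal sketch: inspect a single BERT attention head and report, for each
# token, the token it attends to most. Assumes HuggingFace transformers;
# the chosen layer/head pair is arbitrary, not a finding from the thread.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The quick brown fox jumps over the lazy dog"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: tuple with one tensor per layer,
# each of shape (batch, num_heads, seq_len, seq_len)
attentions = outputs.attentions
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

layer, head = 7, 10  # arbitrary head to inspect; sweep all of them in practice
attn = attentions[layer][0, head]  # (seq_len, seq_len) attention weights

# A head "tracks" a dependency type if these argmax targets line up with
# gold dependency heads far more often than a positional baseline would.
for i, tok in enumerate(tokens):
    j = attn[i].argmax().item()
    print(f"{tok:>10} -> {tokens[j]} ({attn[i, j]:.2f})")
```

Sweeping every layer/head pair and comparing the argmax attention targets against a parsed treebank is, roughly, how specialized heads of the kind described in the tl;dr get identified: a few heads align well with individual relations, but no single head aligns with the full parse.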
It's really interesting that there's no general syntactic modeling in BERT. It explains something I found recently: BERT leaks contextual information in non-syntactic ways, which makes bias much worse and much harder to address: https://medium.com/@robert.munro/bias-in-ai-3ea569f79d6a …