Am I off-base in thinking it's weird that there are so many NLP tutorials for translation? Like, approximately nobody needs to make MT models, and the people working on MT know who they are and don't need tutorials. I guess people use them for MT-analogous seq2seq tasks?
Replying to @honnibal
Yes, with MT you can teach broad concepts like seq2seq and attention, which is why I wrote this NMT tutorial 3 years ago: https://github.com/tensorflow/nmt . One may also view translation as the first NLP area where deep learning worked at scale.
1 reply · 0 retweets · 19 likes
Replying to @lmthang
Valid points.
18:03 - Jan 8, 2020
0 replies · 0 retweets · 2 likes