Machine translation systems trained without parallel corpora are the coolest shit. *shaking head* Amazing.
First you find word vector correspondences between the two languages: http://nlp.csai.tsinghua.edu.cn/~ly/papers/acl2017_zm.pdf and http://www.aclweb.org/anthology/W/W16/W16-1614.pdf. Once you have multilingual embeddings, you can use denoising and back-translation to train full translation models: https://arxiv.org/pdf/1710.11041.pdf https://arxiv.org/pdf/1711.00043.pdf
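A minimal sketch of the first step, aligning two monolingual embedding spaces with an orthogonal Procrustes solution. The names (src_vecs, tgt_vecs, seed_pairs) are hypothetical; the linked papers bootstrap the seed dictionary themselves (e.g. from identical strings or adversarial training) rather than assuming one is given.

```python
import numpy as np

def learn_mapping(src_vecs, tgt_vecs, seed_pairs):
    """Learn an orthogonal matrix W so that W @ src ~= tgt for seed word pairs.

    src_vecs, tgt_vecs: dict mapping word -> embedding vector (np.ndarray)
    seed_pairs: list of (src_word, tgt_word) anchor translations (assumed given here)
    """
    X = np.stack([src_vecs[s] for s, _ in seed_pairs])  # source side of the seed dictionary
    Y = np.stack([tgt_vecs[t] for _, t in seed_pairs])  # target side of the seed dictionary
    # Orthogonal Procrustes: W = U V^T, where U S V^T is the SVD of Y^T X
    u, _, vt = np.linalg.svd(Y.T @ X)
    return u @ vt

def translate_word(word, W, src_vecs, tgt_vecs, k=5):
    """Return the k nearest target words to the mapped source embedding (cosine similarity)."""
    query = W @ src_vecs[word]
    query = query / np.linalg.norm(query)
    scored = [(float(query @ (v / np.linalg.norm(v))), t) for t, v in tgt_vecs.items()]
    return [t for _, t in sorted(scored, reverse=True)[:k]]
```

With a reasonable mapping, nearest-neighbor lookups like this already give a crude word-by-word dictionary, which is what the sentence-level models are initialized from.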
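And a minimal sketch of the training signals in the second step, assuming you already have two draft translation functions src2tgt() and tgt2src() (hypothetical names) built on the aligned embeddings. Each pass mixes a denoising objective (reconstruct a sentence from a corrupted copy of itself) with back-translation (translate a monolingual sentence with the current model, then train the reverse direction on the synthetic pair).

```python
import random

def add_noise(tokens, drop_prob=0.1, shuffle_window=3):
    """Corrupt a sentence: randomly drop words and locally shuffle the rest."""
    kept = [t for t in tokens if random.random() > drop_prob]
    # Local shuffle: each surviving token moves at most ~shuffle_window positions.
    keyed = [(i + random.uniform(0, shuffle_window), t) for i, t in enumerate(kept)]
    return [t for _, t in sorted(keyed)]

def training_pairs(src_sents, tgt_sents, src2tgt, tgt2src):
    """Yield (input_sentence, reference_output, which_model_to_update) examples.

    src_sents, tgt_sents: monolingual corpora as lists of token lists (no parallel data).
    src2tgt, tgt2src: the current (imperfect) translation functions.
    """
    for s in src_sents:
        yield add_noise(s), s, "src_denoiser"      # denoising autoencoder, source language
        yield src2tgt(s), s, "tgt_to_src_model"    # synthetic target -> real source
    for t in tgt_sents:
        yield add_noise(t), t, "tgt_denoiser"      # denoising autoencoder, target language
        yield tgt2src(t), t, "src_to_tgt_model"    # synthetic source -> real target
```

Iterating this loop is the whole trick: as each direction improves, the synthetic pairs it produces for the other direction get cleaner.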
They're not amazingly good at translating, but they are magical, and a small miracle is still worth marveling over.