Evaluating the Cross-Lingual Effectiveness of Massively Multilingual Neural Machine Translation

Transfer learning
DOI: 10.1609/aaai.v34i05.6414 Publication Date: 2020-06-29T19:07:58Z
ABSTRACT
The recently proposed massively multilingual neural machine translation (NMT) system has been shown to be capable of translating over 100 languages to and from English within a single model (Aharoni, Johnson, and Firat 2019). Its improved translation performance on low-resource languages hints at potential cross-lingual transfer capability for downstream tasks. In this paper, we evaluate the cross-lingual effectiveness of representations from the encoder of a massively multilingual NMT model on 5 downstream classification and sequence labeling tasks covering a diverse set of over 50 languages. We compare against a strong baseline, multilingual BERT (mBERT) (Devlin et al. 2018), in different cross-lingual transfer learning scenarios and show gains in zero-shot transfer on 4 out of these 5 tasks.
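The zero-shot transfer setup the abstract describes can be sketched as follows: a classifier is trained on source-language (English) sentence representations from a frozen multilingual encoder, then evaluated directly on target-language representations with no target-language labels. The sketch below is illustrative only; a fixed random projection stands in for the NMT encoder (or mBERT), and all data, dimensions, and hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a frozen multilingual encoder: the paper uses
# the massively multilingual NMT encoder (or mBERT); here we simulate a
# shared cross-lingual representation space with one fixed class-to-vector
# map used for both the "English" and "target-language" sides.
DIM = 32
W_shared = rng.normal(size=(2, DIM))  # 2 latent classes -> shared space

def encode(labels, noise=0.5):
    """Map class labels to simulated sentence embeddings in the shared space."""
    return W_shared[labels] + noise * rng.normal(size=(len(labels), DIM))

# "English" training data and "target-language" test data share the space.
y_train = rng.integers(0, 2, size=500)
X_train = encode(y_train)
y_test = rng.integers(0, 2, size=200)
X_test = encode(y_test)  # zero-shot: no target-language labels are used

# Linear probe on top of the frozen representations
# (logistic regression trained by plain gradient descent).
w = np.zeros(DIM)
b = 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w + b)))  # sigmoid predictions
    g = p - y_train                               # gradient of log-loss
    w -= 0.1 * (X_train.T @ g) / len(y_train)
    b -= 0.1 * g.mean()

# Evaluate the English-trained probe directly on the target language.
acc = ((X_test @ w + b > 0).astype(int) == y_test).mean()
print(f"zero-shot accuracy: {acc:.2f}")
```

Because both languages are encoded into the same space, the probe trained only on English-side features transfers to the target side, which is the effect the paper measures across its 5 tasks.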