Knowledge Transfer in Incremental Learning for Multilingual Neural Machine Translation
Keywords: Knowledge Transfer, Transfer of learning, Language Understanding, Training set
DOI: 10.18653/v1/2023.acl-long.852
Publication Date: 2023-08-04
AUTHORS (5)
ABSTRACT
In the real-world scenario, a longstanding goal of multilingual neural machine translation (MNMT) is that a single model can incrementally adapt to new language pairs without accessing previous training data. Previous studies concentrate on overcoming catastrophic forgetting while lacking encouragement to learn knowledge from the incremental pairs, especially when the incremental language is not related to the set of original languages. To better acquire the new knowledge, we propose a knowledge transfer method that can efficiently adapt original MNMT models to diverse incremental language pairs. The method flexibly introduces the knowledge from an external model into original models, which encourages the models to learn new language pairs, completing the procedure of knowledge transfer. Moreover, all the parameters of the original models are frozen to ensure that translation qualities on the original language pairs are not degraded. Experimental results show that our method can learn new knowledge from diverse language pairs incrementally, meanwhile maintaining performance on the original language pairs, outperforming various strong baselines in incremental learning for MNMT.
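The abstract's core idea, freezing all original parameters while an external trainable module supplies knowledge for the new language pair, can be illustrated with a minimal toy sketch. This is not the paper's implementation; the class names (`FrozenBase`, `ExternalAdapter`), dimensions, and residual-style combination are illustrative assumptions.

```python
import numpy as np

class FrozenBase:
    """Stands in for the original MNMT model; its weights are never updated."""
    def __init__(self, rng, dim):
        self.W = rng.standard_normal((dim, dim))

    def forward(self, x):
        return x @ self.W

class ExternalAdapter:
    """Illustrative trainable module injecting new-language knowledge."""
    def __init__(self, dim):
        self.W = np.zeros((dim, dim))  # only these weights are trained

    def forward(self, x):
        return x @ self.W

def translate(base, adapter, x, new_pair):
    h = base.forward(x)
    if new_pair:                  # incremental pair: add external knowledge
        h = h + adapter.forward(x)
    return h                      # original pairs bypass the adapter entirely

rng = np.random.default_rng(0)
base, adapter = FrozenBase(rng, 4), ExternalAdapter(4)
x = rng.standard_normal((1, 4))

before = translate(base, adapter, x, new_pair=False)
adapter.W += 0.1                  # simulate a training update on the new pair
after = translate(base, adapter, x, new_pair=False)

# Frozen base + bypassed adapter => original-pair outputs are unchanged,
# which is how "translation qualities are not degraded" is guaranteed.
assert np.allclose(before, after)
```

Routing original pairs around the adapter makes the no-degradation property hold by construction, rather than depending on regularization as in many anti-forgetting approaches.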
CITATIONS (4)