Dynamic Curriculum Learning for Low-Resource Neural Machine Translation
DOI:
10.18653/v1/2020.coling-main.352
Publication Date:
2021-01-08T13:58:31Z
AUTHORS (9)
ABSTRACT
Large amounts of data have made neural machine translation (NMT) a big success in recent years. But it is still a challenge to train these models on small-scale corpora. In this case, the way of using data appears to be more important. Here, we investigate the effective use of training data for low-resource NMT. In particular, we propose a dynamic curriculum learning (DCL) method to reorder training samples during training. Unlike previous work, we do not use a static scoring function for reordering. Instead, the order of training samples is dynamically determined in two ways - loss decline and model competence. This eases training by highlighting easy samples that the current model has enough competence to learn. We test our DCL method in a Transformer-based system. Experimental results show that DCL outperforms several strong baselines on three benchmarks and different sized data settings of WMT’16 En-De.
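The abstract only outlines the method, but the two ordering signals it names (loss decline and model competence) suggest the general shape of the sample-selection loop. The Python sketch below is an illustration of that idea, not the authors' exact formulation: the function names, the square-root competence schedule (a common choice in competence-based curricula), and the loss arrays are all assumptions for demonstration.

```python
import numpy as np

def competence(step, total_steps, c0=0.1):
    # Square-root competence schedule: starts at c0 and grows to 1.0.
    # (A common choice in competence-based curricula; the paper's exact
    # schedule may differ.)
    t = min(step / total_steps, 1.0)
    return min(1.0, np.sqrt(c0 ** 2 + (1.0 - c0 ** 2) * t))

def dcl_order(prev_losses, curr_losses, step, total_steps):
    """Dynamically reorder training samples (illustrative sketch).

    Samples whose loss declined the most between checkpoints are treated
    as 'easy' for the current model; only the easiest fraction, given by
    the model's current competence, is exposed to training.
    """
    decline = prev_losses - curr_losses        # larger decline => easier
    order = np.argsort(-decline)               # easiest samples first
    k = max(1, int(competence(step, total_steps) * len(order)))
    return order[:k]

# Toy usage: per-sample losses at two consecutive checkpoints.
prev = np.array([2.0, 1.5, 3.0, 0.9])
curr = np.array([1.2, 1.4, 2.9, 0.5])
selected = dcl_order(prev, curr, step=100, total_steps=1000)
print(selected)  # indices of the 'easy' samples to train on next
```

Because the scores are recomputed from the model's current losses at each step, the ordering adapts as training progresses, which is what distinguishes this setup from a static, precomputed difficulty ranking.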