Efficiently Reusing Old Models Across Languages via Transfer Learning

Tom Kocmi, Ondřej Bojar


Abstract
Recent progress in neural machine translation (NMT) is directed towards larger neural networks trained on increasing amounts of hardware resources. As a result, NMT models are costly to train, both financially, due to electricity and hardware costs, and environmentally, due to the carbon footprint. This is especially true of transfer learning, which adds the cost of training a “parent” model before transferring knowledge and training the desired “child” model. In this paper, we propose a simple method for reusing an already trained model for a different language pair, with no need to modify the model architecture. Our approach does not require a separate parent model for each investigated language pair, as is typical in NMT transfer learning. To demonstrate the applicability of our method, we recycle a Transformer model trained by other researchers and use it to seed models for different language pairs. We achieve better translation quality and shorter convergence times than when training from random initialization.
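
A minimal sketch of the warm-start idea described in the abstract, assuming PyTorch. The model shape, vocabulary size, and checkpoint path (parent.pt) are hypothetical illustrations, not details from the paper; the sketch also assumes the child reuses the parent's subword vocabulary, consistent with the abstract's claim that no architecture modifications are needed.

```python
# Warm-starting a child NMT model from an existing parent checkpoint.
# All names, sizes, and paths here are illustrative assumptions, not
# details taken from the paper.
import torch
import torch.nn as nn

VOCAB_SIZE = 32000  # assumed shared subword vocabulary, reused from the parent
D_MODEL = 512

class Seq2Seq(nn.Module):
    """Toy encoder-decoder with the same shape as the parent model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.transformer = nn.Transformer(d_model=D_MODEL, batch_first=True)
        self.out = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, src, tgt):
        hidden = self.transformer(self.embed(src), self.embed(tgt))
        return self.out(hidden)

# In practice parent.pt would be a checkpoint published by other
# researchers; here we fabricate one so the sketch runs end to end.
torch.save(Seq2Seq().state_dict(), "parent.pt")

# Key step: instead of random initialization, copy every parameter of
# the trained parent. Because the architecture and vocabulary are
# unchanged, the state dict loads directly; no model surgery is needed.
child = Seq2Seq()
child.load_state_dict(torch.load("parent.pt", map_location="cpu"))

# Training then continues on the child language pair exactly as usual;
# only the parallel data changes, not the model or the training loop.
optimizer = torch.optim.Adam(child.parameters(), lr=1e-4)
```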
Anthology ID: 2020.eamt-1.3
Volume: Proceedings of the 22nd Annual Conference of the European Association for Machine Translation
Month: November
Year: 2020
Address: Lisboa, Portugal
Editors: André Martins, Helena Moniz, Sara Fumega, Bruno Martins, Fernando Batista, Luisa Coheur, Carla Parra, Isabel Trancoso, Marco Turchi, Arianna Bisazza, Joss Moorkens, Ana Guerberof, Mary Nurminen, Lena Marg, Mikel L. Forcada
Venue: EAMT
Publisher: European Association for Machine Translation
Pages: 19–28
URL: https://aclanthology.org/2020.eamt-1.3
Cite (ACL): Tom Kocmi and Ondřej Bojar. 2020. Efficiently Reusing Old Models Across Languages via Transfer Learning. In Proceedings of the 22nd Annual Conference of the European Association for Machine Translation, pages 19–28, Lisboa, Portugal. European Association for Machine Translation.
Cite (Informal): Efficiently Reusing Old Models Across Languages via Transfer Learning (Kocmi & Bojar, EAMT 2020)
PDF: https://aclanthology.org/2020.eamt-1.3.pdf
Data: WMT 2018