
Multilingual Parameter-Sharing Adapters: A Method for Optimizing Low-Resource Neural Machine Translation


Abstract:

Adapter-based Multilingual Neural Machine Translation (MNMT) has become a significant approach to low-resource language translation, mitigating data imbalances between high-resource and low-resource language pairs while reducing training costs. However, existing adapter-based methods generalize poorly in cross-lingual settings, particularly under low-resource conditions, where their scalability is limited. Additionally, current methods often introduce an independent adapter module for each language, so model parameters grow linearly with the number of languages. To address these challenges, we propose a multilingual parameter-sharing adapter approach. Moreover, we introduce a neural architecture search (NAS)-based strategy to improve translation performance. Experimental results demonstrate that the multilingual parameter-sharing adapter achieves competitive performance on both low-resource and high-resource datasets. The multilingual parameter-sharing adapter method has only 400K trainable parameters, 20× fewer than the traditional adapter method.
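The contrast the abstract draws between per-language adapters and a shared adapter can be sketched as follows. This is a minimal illustration, not the paper's method: the bottleneck adapter shape (Houlsby-style down-projection, ReLU, up-projection with a residual connection), the hidden size of 512, the bottleneck of 64, and the language list are all assumed for demonstration; only the parameter-count comparison it prints reflects the abstract's point that per-language adapters scale linearly with the number of languages while a shared adapter does not.

```python
import numpy as np

def make_adapter(d_model=512, bottleneck=64, seed=0):
    # Bottleneck adapter weights (illustrative sizes, not the paper's config).
    rng = np.random.default_rng(seed)
    return {
        "W_down": rng.normal(0.0, 0.02, (d_model, bottleneck)),
        "W_up": rng.normal(0.0, 0.02, (bottleneck, d_model)),
    }

def adapter_forward(h, params):
    # Residual bottleneck transform on hidden states h (tokens x d_model).
    z = np.maximum(h @ params["W_down"], 0.0)  # ReLU
    return h + z @ params["W_up"]

def num_params(p):
    return sum(w.size for w in p.values())

# Traditional scheme: one independent adapter per language ->
# trainable parameters grow linearly with the number of languages.
langs = ["de", "fr", "hi", "sw", "ne"]
per_lang = {l: make_adapter(seed=i) for i, l in enumerate(langs)}

# Parameter-sharing scheme: a single adapter reused across all languages.
shared = make_adapter()

print(num_params(shared))                             # 65536
print(sum(num_params(p) for p in per_lang.values()))  # 327680, i.e. 5x more
```

With N languages, the per-language scheme holds N copies of the adapter parameters, so the shared scheme's footprint is a factor of N smaller, which is the source of the 20× reduction the abstract reports for its setting.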
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025
Conference Location: Hyderabad, India

