
Transfer Learning Grammar for Multilingual Surface Realisation


Abstract:

Deep learning approaches to surface realisation are often held back by the lack of good quality datasets. These datasets require significant human effort to design and are rarely available for low-resource languages. We investigate the possibility of cross-lingual transfer learning of grammatical features in a multilingual text-to-text transformer. We train several mT5-small transformer models to generate grammatically correct sentences by reordering and inflecting words, first using monolingual data in one language and then in another language. We show that language comprehension and task-specific performance of the models benefit from pretraining on other languages with similar grammar rules, while languages with dissimilar grammar appear to disorient the model from its previous training. The results indicate that a model trained on multiple languages may familiarize itself with their common features and, thus, require less data and processing time for language-specific training. However, the experimental models are limited by their entirely text-to-text approach and insufficient computational power. A complete multilingual realisation model will require a more complex transformer variant and longer training on more data.
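The abstract describes the task as text-to-text: the model receives a scrambled (and, per the paper, uninflected) sequence of words plus a language cue, and must output the grammatically correct sentence. A minimal sketch of how such a training pair might be constructed is shown below; the `realise` task prefix, the language tag, and the shuffling step are illustrative assumptions, not the paper's exact preprocessing (the original also lemmatises the input, which is omitted here).

```python
import random

def make_example(sentence: str, lang_tag: str, seed: int = 0):
    """Build one text-to-text training pair for a surface-realisation task.

    Input side: the sentence's tokens in a randomised order, prefixed with
    a task name and language tag (both hypothetical). Target side: the
    original surface sentence the model should learn to reconstruct.
    """
    tokens = sentence.split()
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = tokens[:]
    rng.shuffle(shuffled)
    source = f"realise {lang_tag}: " + " ".join(shuffled)
    return source, sentence

src, tgt = make_example("the cat sat on the mat", "en", seed=1)
```

Pairs like `(src, tgt)` could then be fed to a seq2seq model such as mT5-small for fine-tuning; in the cross-lingual setting described above, one would first train on pairs from one language and then continue training on another.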
Date of Conference: 20-21 May 2021
Date Added to IEEE Xplore: 02 June 2021
Conference Location: Islamabad, Pakistan
