Investigating Massive Multilingual Pre-Trained Machine Translation Models for Clinical Domain via Transfer Learning
Article Status
Published
Authors/contributors
- Han, Lifeng (Author)
- Erofeev, Gleb (Author)
- Sorokina, Irina (Author)
- Gladkoff, Serge (Author)
- Nenadic, Goran (Author)
Title
Investigating Massive Multilingual Pre-Trained Machine Translation Models for Clinical Domain via Transfer Learning
Abstract
Massively multilingual pre-trained language models (MMPLMs) have been developed in recent years, demonstrating strong capabilities and rich pre-acquired knowledge for downstream tasks. This work investigates whether MMPLMs can be applied to clinical-domain machine translation (MT) for entirely unseen languages via transfer learning. We carry out an experimental investigation using Meta-AI's MMPLMs "wmt21-dense-24-wide-en-X and X-en (WMT21fb)", which were pre-trained on 7 language pairs and 14 translation directions: English to Czech, German, Hausa, Icelandic, Japanese, Russian, and Chinese, and the opposite directions. We fine-tune these MMPLMs towards the English-Spanish language pair, which did not exist at all, implicitly or explicitly, in their original pre-training corpora. We prepare carefully aligned clinical-domain data for this fine-tuning, which differs from the models' original mixed-domain knowledge. Our experimental results show that the fine-tuning is very successful using just 250k well-aligned in-domain EN-ES segments, across three translation sub-task tests: clinical cases, clinical terms, and ontology concepts. The fine-tuned model achieves evaluation scores very close to those of NLLB, another MMPLM from Meta-AI whose pre-training included Spanish as a high-resource language. To the best of our knowledge, this is the first work to successfully apply MMPLMs to clinical-domain transfer-learning NMT for languages totally unseen during pre-training.
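The paper does not include its fine-tuning code; the following is a minimal sketch of the setup the abstract describes, assuming Hugging Face transformers, the public facebook/wmt21-dense-24-wide-en-x checkpoint (the WMT21fb en-to-X model), and a hypothetical clinical_en_es.jsonl parallel file. Because the WMT21fb vocabulary contains no Spanish language code, the sketch reuses an existing target-language tag; that workaround is an assumption, not a detail taken from the paper.

```python
# Hypothetical fine-tuning sketch (not the authors' released script).
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "facebook/wmt21-dense-24-wide-en-x"  # Meta-AI WMT21fb en->X MMPLM
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Spanish is absent from the pre-training vocabulary, so an existing target
# tag stands in for it here (assumption; the paper does not publish this step).
tokenizer.src_lang = "en"
tokenizer.tgt_lang = "de"  # placeholder language tag standing in for Spanish

# Hypothetical 250k-segment clinical EN-ES parallel corpus, one JSON object
# per line with "en" and "es" fields.
raw = load_dataset("json", data_files="clinical_en_es.jsonl")["train"]

def preprocess(batch):
    # M2M100-style tokenizer: text_target tokenizes the target-language side.
    return tokenizer(batch["en"], text_target=batch["es"],
                     max_length=256, truncation=True)

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

# Illustrative hyperparameters only; the 4.7B-parameter model realistically
# needs multi-GPU sharding (e.g., DeepSpeed) rather than a single device.
args = Seq2SeqTrainingArguments(
    output_dir="wmt21fb-en-es-clinical",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,
    learning_rate=5e-5,
    num_train_epochs=1,
    fp16=torch.cuda.is_available(),
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```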
Repository
arXiv
Archive ID
arXiv:2210.06068
Place
Toronto, Canada
Date
2023
Citation Key
han2023
Accessed
14/06/2024, 20:48
Extra
arXiv:2210.06068 [cs]
Citation
Han, L., Erofeev, G., Sorokina, I., Gladkoff, S., & Nenadic, G. (2023). Investigating Massive Multilingual Pre-Trained Machine Translation Models for Clinical Domain via Transfer Learning (arXiv:2210.06068). In Proceedings of the 5th Clinical Natural Language Processing Workshop. Association for Computational Linguistics, Toronto, Canada. https://aclanthology.org/2023.clinicalnlp-1.5