1 resource

  • Lifeng Han, Gleb Erofeev, Irina Sorokina... | Oct 29th, 2023 | preprint

    Massively multilingual pre-trained language models (MMPLMs) have been developed in recent years, demonstrating strong capabilities and the prior knowledge they acquire for downstream tasks. This work investigates whether MMPLMs can be applied to clinical-domain machine translation (MT) for entirely unseen languages via transfer learning. We carry out an experimental investigation using Meta-AI's MMPLMs "wmt21-dense-24-wide-en-X and X-en (WMT21fb)", which were pre-trained on 7 language pairs and 14...

Last update from database: 29/10/2025, 17:15 (UTC)