
  • Lifeng Han, Gleb Erofeev, Irina Sorokina... | Oct 30th, 2023 | preprint

    Massively multilingual pre-trained language models (MMPLMs) have been developed in recent years, demonstrating strong capabilities and the pre-acquired knowledge they bring to downstream tasks. This work investigates whether MMPLMs can be applied to clinical-domain machine translation (MT) for entirely unseen languages via transfer learning. We carry out an experimental investigation using Meta-AI's MMPLMs "wmt21-dense-24-wide-en-X and X-en (WMT21fb)", which were pre-trained on 7 language pairs and 14...

Last update from database: 30/10/2025, 05:22 (UTC)