TULUN: Transparent and Adaptable Low-resource Machine Translation
Merx, Raphael; Suominen, Hanna; Hong, Lois Yinghui; Thieberger, Nick; Cohn, Trevor; Vylomova, Ekaterina
Permanent address of the publication:
https://urn.fi/URN:NBN:fi-fe202601217172
Abstract
Machine translation (MT) systems that support low-resource languages often struggle on specialized domains. While researchers have proposed various techniques for domain adaptation, these approaches typically require model fine-tuning, making them impractical for non-technical users and small organizations. To address this gap, we propose TULUN, a versatile solution for terminology-aware translation that combines neural MT with large language model (LLM)-based post-editing guided by existing glossaries and translation memories. Our open-source web-based platform enables users to easily create, edit, and leverage terminology resources, fostering a collaborative human-machine translation process that respects and incorporates domain expertise while increasing MT accuracy. Evaluations show effectiveness in both real-world and benchmark scenarios: on medical and disaster relief translation tasks for Tetun and Bislama, our system achieves improvements of 16.90-22.41 ChrF++ points over baseline MT systems. Across six low-resource languages on the FLORES dataset, TULUN outperforms both standalone MT and LLM approaches, achieving an average improvement of 2.8 ChrF++ points over NLLB-54B. TULUN is publicly accessible at bislama-trans.rapha.dev.
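
The abstract describes a pipeline in which a baseline MT draft is revised by an LLM that is prompted with matching glossary entries. The minimal Python sketch below illustrates one plausible way such a terminology-aware post-editing step could be wired together; it is not taken from the TULUN codebase, and all names (GlossaryEntry, build_postedit_prompt, llm_complete) are hypothetical. The LLM call is injected as a plain callable so the sketch makes no assumption about a particular provider or API.

# Illustrative sketch only, not the TULUN implementation: an MT draft is
# post-edited by an LLM, guided by glossary entries found in the source sentence.

from dataclasses import dataclass
from typing import Callable

@dataclass
class GlossaryEntry:
    source_term: str   # term in the source language (e.g. English)
    target_term: str   # preferred rendering in the target language (e.g. Tetun)

def matching_entries(source: str, glossary: list[GlossaryEntry]) -> list[GlossaryEntry]:
    """Return glossary entries whose source term occurs in the sentence."""
    lowered = source.lower()
    return [e for e in glossary if e.source_term.lower() in lowered]

def build_postedit_prompt(source: str, mt_draft: str,
                          entries: list[GlossaryEntry], target_lang: str) -> str:
    """Assemble an LLM prompt asking for a terminology-consistent revision."""
    term_lines = "\n".join(f"- '{e.source_term}' -> '{e.target_term}'" for e in entries)
    return (
        f"You are post-editing a machine translation into {target_lang}.\n"
        f"Source sentence: {source}\n"
        f"Draft translation: {mt_draft}\n"
        f"Use these glossary terms where applicable:\n{term_lines}\n"
        "Return only the revised translation."
    )

def postedit(source: str, mt_draft: str, glossary: list[GlossaryEntry],
             target_lang: str, llm_complete: Callable[[str], str]) -> str:
    """Run one terminology-aware post-editing step via an injected LLM callable."""
    entries = matching_entries(source, glossary)
    if not entries:
        return mt_draft  # no glossary terms to enforce; keep the MT draft as-is
    prompt = build_postedit_prompt(source, mt_draft, entries, target_lang)
    return llm_complete(prompt)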