View a PDF of the paper titled OpenWHO: A Document-Level Parallel Corpus for Health Translation in Low-Resource Languages, by Raphaël Merx and 3 other authors
Abstract: In machine translation (MT), health is a high-stakes domain characterised by widespread deployment and domain-specific vocabulary. However, there is a lack of MT evaluation datasets for low-resource languages in this domain. To address this gap, we introduce OpenWHO, a document-level parallel corpus of 2,978 documents and 26,824 sentences from the World Health Organization's e-learning platform. Sourced from expert-authored, professionally translated materials shielded from web-crawling, OpenWHO spans a diverse range of over 20 languages, of which nine are low-resource. Leveraging this new resource, we evaluate modern large language models (LLMs) against traditional MT models. Our findings reveal that LLMs consistently outperform traditional MT models, with Gemini 2.5 Flash achieving a +4.79 ChrF point improvement over NLLB-54B on our low-resource test set. Further, we investigate how LLM context utilisation affects accuracy, finding that the benefits of document-level translation are most pronounced in specialised domains like health. We release the OpenWHO corpus to encourage further research into low-resource MT in the health domain.
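The reported +4.79 improvement is in ChrF points, a character n-gram F-score commonly used for morphologically rich, low-resource languages. As a rough illustration of what the metric measures, here is a minimal, simplified sentence-level ChrF sketch in pure Python (the paper presumably uses a standard implementation such as sacrebleu; this toy version omits details like whitespace handling options and corpus-level aggregation):

```python
from collections import Counter

def char_ngrams(text, n):
    # Character n-grams over the whitespace-stripped string
    chars = text.replace(" ", "")
    return Counter(chars[i:i + n] for i in range(len(chars) - n + 1))

def chrf(hypothesis, reference, max_n=6, beta=2.0):
    """Simplified sentence-level ChrF: average character n-gram
    precision/recall for n = 1..max_n, combined into an F-score
    that weights recall by beta (beta=2 is the standard setting)."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if not hyp or not ref:
            continue
        overlap = sum((hyp & ref).values())
        precisions.append(overlap / sum(hyp.values()))
        recalls.append(overlap / sum(ref.values()))
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0:
        return 0.0
    return 100 * (1 + beta ** 2) * p * r / (beta ** 2 * p + r)
```

A perfect match scores 100, so a +4.79-point gap on a 0–100 scale is a substantial difference between systems.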
Submission history
 From: Raphael Merx
 [v1]
        Fri, 22 Aug 2025 02:53:56 UTC (169 KB)
 [v2]
        Tue, 16 Sep 2025 05:10:52 UTC (169 KB)
 [v3]
        Fri, 19 Sep 2025 03:20:15 UTC (169 KB)
 [v4]
        Tue, 23 Sep 2025 02:28:48 UTC (170 KB)