Magyar Tudományos Művek Tára
Rethinking Translation Memory Augmented Neural Machine Translation
Hao, H.; Huang, G.; Liu, L. ✉; Zhang, Z.; Shi, S.; Wang, R. ✉
English-language conference paper (book chapter), scientific
Published in:
Anna Rogers (ed.): Findings of the Association for Computational Linguistics: ACL 2023. (2023) ISBN: 9781959429623
pp. 2589-2605
Identifiers
MTMT: 34975073
Scopus: 85175487482
This paper rethinks translation memory augmented neural machine translation (TM-augmented NMT) from two perspectives: a probabilistic view of retrieval and the variance-bias decomposition principle. The analysis shows that TM-augmented NMT fits the training data well (i.e., lower bias) but is more sensitive to fluctuations in the training data (i.e., higher variance), which explains a recently reported contradictory phenomenon on the same translation task: TM-augmented NMT substantially outperforms vanilla NMT in the high-resource scenario, whereas it fails in the low-resource scenario. We then propose a simple yet effective TM-augmented NMT model that reduces the variance and thereby resolves the contradiction. Extensive experiments show that the proposed TM-augmented NMT achieves consistent gains over both conventional NMT and existing TM-augmented NMT in two scenarios where low variance is preferable (low-resource and plug-and-play) as well as in the high-resource scenario. © 2023 Association for Computational Linguistics.
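For orientation, the variance-bias decomposition the abstract invokes is, in its standard textbook form, a split of expected prediction error into squared bias, variance, and irreducible noise; the paper's own decomposition for TM-augmented NMT may be formulated differently, so the sketch below is only the generic version. With target y = f(x) + ε, E[ε] = 0, Var(ε) = σ², and a model \hat{f}_D fitted on a random training set D:

% Standard bias-variance decomposition (generic textbook form, not the paper's exact statement)
\mathbb{E}_{D,\varepsilon}\!\left[\big(y - \hat{f}_D(x)\big)^2\right]
  = \underbrace{\big(f(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_D\!\left[\big(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{noise}}

In these terms, the abstract's finding reads as: retrieval augmentation lowers the bias term but enlarges the variance term, and the variance term matters most in the low-resource and plug-and-play scenarios mentioned above.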
Cited publications (1)