MTMT: The Hungarian Scientific Bibliography
Augmenting Large Language Model Translators via Translation Memories
Mu, Y. ✉; Reheman, A. ✉; Cao, Z.; Fan, Y.; Li, B.; Li, Y.; Xiao, T.; Zhang, C.; Zhu, J.
English | Conference paper (Chapter in Book) | Scientific
Published in: Rogers, Anna (ed.): Findings of the Association for Computational Linguistics: ACL 2023 (2023), pp. 10287-10299. ISBN: 9781959429623
Identifiers
MTMT: 34975078
Scopus: 85175230045
Using translation memories (TMs) as prompts is a promising approach to in-context learning of machine translation models. In this work, we take a step towards prompting large language models (LLMs) with TMs and making them better translators. We find that the ability of LLMs to “understand” prompts is indeed helpful for making better use of TMs. Experiments show that the results of a pre-trained LLM translator can be greatly improved by using high-quality TM-based prompts. These results are even comparable to those of the state-of-the-art NMT systems which have access to large-scale in-domain bilingual data and are well tuned on the downstream tasks. © 2023 Association for Computational Linguistics.
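To make the approach described in the abstract concrete, the following is a minimal sketch of TM-based prompting: retrieve the TM entries most similar to the input sentence and format them as in-context examples for an LLM translator. The fuzzy-matching retrieval (character-level similarity via difflib) and the prompt template here are illustrative assumptions, not the paper's exact setup, and the toy TM data is invented for demonstration.

```python
# Sketch of prompting an LLM translator with translation-memory (TM) matches.
# Retrieval method and prompt template are assumptions for illustration only.
from difflib import SequenceMatcher

# Toy translation memory of (source, target) pairs; a real TM would hold
# large-scale in-domain bilingual segments.
TM = [
    ("The weather is nice today.", "Das Wetter ist heute schön."),
    ("The weather was bad yesterday.", "Das Wetter war gestern schlecht."),
    ("He bought a new car.", "Er hat ein neues Auto gekauft."),
]

def retrieve(query: str, k: int = 2) -> list[tuple[str, str]]:
    """Return the k TM entries whose source side best matches the query."""
    scored = sorted(
        TM,
        key=lambda pair: SequenceMatcher(None, query, pair[0]).ratio(),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str) -> str:
    """Format retrieved TM pairs as in-context examples, then append the query."""
    parts = ["Translate from English to German."]
    for src, tgt in retrieve(query):
        parts.append(f"English: {src}\nGerman: {tgt}")
    parts.append(f"English: {query}\nGerman:")
    return "\n\n".join(parts)

if __name__ == "__main__":
    # The resulting prompt would be sent to a pre-trained LLM translator;
    # here we just print it.
    print(build_prompt("The weather is nice this morning."))
```

The key design point the abstract highlights is that higher-quality TM matches yield better prompts, so in practice the retrieval step filters for a minimum similarity threshold rather than always including the top-k entries.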
Citing publications: 1