TY - GEN
T1 - Diverse Machine Translation with Translation Memory
AU - Zhang, Yi
AU - Zhao, Jing
AU - Sun, Shiliang
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - The challenge of diverse machine translation is to ensure both diversity and quality, in which diversity desires distinction between multiple hypotheses at the syntactic or word level, while quality requires hypotheses to be consistent with certain references. Existing work on diverse machine translation, most of which boosts translation diversity but compromises translation quality, either resorts to special search strategies to realize word-level diversity or trains several decoders simultaneously to enable possible syntactic multiplicity. In this work, we propose to improve translation diversity without a severe drop in quality by constructing a novel translation memory based NMT and designing a global diverse beam search strategy. Specifically, the translation memory retrieved from the target-side corpus acts as experts to interact with standard NMT, which not only generates various hypotheses, but also enhances quality to a great degree. Moreover, we exploit a novel diverse beam search to further avoid token reuse across different hypotheses and improve local diversity. Experiments on two benchmarks (JRC-Acquis and WMT) demonstrate that our approaches achieve a compelling improvement in both translation quality and diversity compared with other diverse approaches.
AB - The challenge of diverse machine translation is to ensure both diversity and quality, in which diversity desires distinction between multiple hypotheses at the syntactic or word level, while quality requires hypotheses to be consistent with certain references. Existing work on diverse machine translation, most of which boosts translation diversity but compromises translation quality, either resorts to special search strategies to realize word-level diversity or trains several decoders simultaneously to enable possible syntactic multiplicity. In this work, we propose to improve translation diversity without a severe drop in quality by constructing a novel translation memory based NMT and designing a global diverse beam search strategy. Specifically, the translation memory retrieved from the target-side corpus acts as experts to interact with standard NMT, which not only generates various hypotheses, but also enhances quality to a great degree. Moreover, we exploit a novel diverse beam search to further avoid token reuse across different hypotheses and improve local diversity. Experiments on two benchmarks (JRC-Acquis and WMT) demonstrate that our approaches achieve a compelling improvement in both translation quality and diversity compared with other diverse approaches.
KW - diverse beam search
KW - diverse machine translation
KW - translation memory
UR - https://www.scopus.com/pages/publications/85140764509
U2 - 10.1109/IJCNN55064.2022.9892899
DO - 10.1109/IJCNN55064.2022.9892899
M3 - Conference contribution
AN - SCOPUS:85140764509
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 International Joint Conference on Neural Networks, IJCNN 2022
Y2 - 18 July 2022 through 23 July 2022
ER -