Joint Training for Neural Machine Translation

Cheng, Yong.

Joint Training for Neural Machine Translation [electronic resource] / by Yong Cheng. - 1st ed. 2019. - XIII, 78 p. 23 illus., 9 illus. in color. online resource. - (Springer Theses, Recognizing Outstanding Ph.D. Research, ISSN 2190-5061).

1. Introduction -- 2. Neural Machine Translation -- 3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation -- 4. Semi-supervised Learning for Neural Machine Translation -- 5. Joint Training for Pivot-based Neural Machine Translation -- 6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning -- 7. Related Work -- 8. Conclusion.

This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach that encourages the two complementary directional models to agree on the word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, thereby incorporating those corpora into neural machine translation. It then introduces a joint training algorithm for pivot-based neural machine translation, which can mitigate the data scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models, allowing their parameters to interact.
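The agreement-based idea in the first approach can be sketched as a joint objective: each directional model keeps its own translation loss, and a penalty term pulls their alignment matrices toward agreement. The sketch below is illustrative only, assuming a squared-difference disagreement term between the source-to-target alignment matrix and the transpose of the target-to-source one; the function names, the weight `lam`, and the toy matrices are hypothetical, not the book's actual formulation.

```python
# Hypothetical sketch of an agreement-based joint objective for
# bidirectional NMT. Assumption: disagreement is measured as the
# squared Frobenius distance between the source-to-target alignment
# matrix and the transpose of the target-to-source one.
import numpy as np

def agreement_loss(a_s2t: np.ndarray, a_t2s: np.ndarray) -> float:
    """Penalty for disagreement between the two directional alignments."""
    return float(np.sum((a_s2t - a_t2s.T) ** 2))

def joint_loss(nll_s2t: float, nll_t2s: float,
               a_s2t: np.ndarray, a_t2s: np.ndarray,
               lam: float = 1.0) -> float:
    """Joint objective: both translation losses plus the agreement term."""
    return nll_s2t + nll_t2s + lam * agreement_loss(a_s2t, a_t2s)

# Toy alignments for a 2-word source and 3-word target sentence.
a_s2t = np.array([[0.9, 0.1, 0.0],
                  [0.1, 0.2, 0.7]])   # shape (src, tgt)
a_t2s = a_s2t.T.copy()               # a perfectly agreeing reverse model

print(agreement_loss(a_s2t, a_t2s))        # 0.0: full agreement
print(joint_loss(2.0, 3.0, a_s2t, a_t2s))  # 5.0: only the two NLL terms remain
```

When the two alignments disagree, the penalty grows, so gradient updates on either model are nudged toward alignments the other model can reproduce.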

ISBN 9789813297487

DOI: 10.1007/978-981-32-9748-7


Natural language processing (Computer science).
Logic programming.
Natural Language Processing (NLP).
Logic in AI.

QA76.9.N38

006.35