Abstract
Non-autoregressive neural machine translation (NAT) not only trains quickly but also decodes in parallel, yet this parallel decoding comes at the expense of quality: to gain speed, the model discards dependencies on target-side context and thereby loses the ability to perceive the contextual positions of the translation. In this paper, we improve the model by adding capsule network layers that extract positional information more effectively and comprehensively; that is, we rely on vector neurons to compensate for the limitation of traditional scalar neurons, which can store the positional information of only a single segment. In addition, we apply word-level error correction to the output of the NAT model to refine the generated translation. Experiments show that our model outperforms previous models, achieving BLEU scores of 26.12 on the WMT14 En-De task and 31.93 on the WMT16 Ro-En task, while decoding more than six times faster than the autoregressive model.
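The abstract leaves the capsule layers unspecified, but the mechanism it appeals to, vector neurons aggregated by routing-by-agreement, follows Sabour et al.'s dynamic routing. The snippet below is a minimal NumPy sketch of that routing step under assumed shapes; the function names and toy sizes are illustrative, not the authors' implementation.

```python
# A minimal NumPy sketch of dynamic routing between capsules
# (Sabour et al., 2017). Shapes, names, and the toy sizes below are
# illustrative assumptions; this is not the paper's code.
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Nonlinearity for vector neurons: keeps direction, maps norm into [0, 1).
    norm_sq = np.sum(s * s, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: prediction vectors, shape (in_caps, out_caps, dim).
    in_caps, out_caps, _ = u_hat.shape
    b = np.zeros((in_caps, out_caps))  # routing logits
    for _ in range(num_iters):
        # Coupling coefficients: softmax over output capsules.
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)
        s = np.einsum('io,iod->od', c, u_hat)   # weighted sum of predictions
        v = squash(s)                           # output capsule vectors
        b += np.einsum('iod,od->io', u_hat, v)  # agreement raises the logits
    return v

# Toy usage: route 6 input capsules to 4 output capsules of dimension 8.
u_hat = np.random.default_rng(0).normal(size=(6, 4, 8))
print(dynamic_routing(u_hat).shape)  # (4, 8)
```

The squash nonlinearity preserves each capsule's orientation while bounding its norm, which is what lets a vector neuron encode positional attributes that a single scalar activation cannot.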
Notes
- 2. Source code of this work is available at https://github.com/salesforce/nonauto-nmt.
Cite this paper
Guo, Z., Hou, H., Wu, N., Sun, S. (2020). Word-Level Error Correction in Non-autoregressive Neural Machine Translation. In: Yang, H., Pasupa, K., Leung, A.C.S., Kwok, J.T., Chan, J.H., King, I. (eds) Neural Information Processing. ICONIP 2020. Communications in Computer and Information Science, vol 1332. Springer, Cham. https://doi.org/10.1007/978-3-030-63820-7_83
Print ISBN: 978-3-030-63819-1
Online ISBN: 978-3-030-63820-7