Abstract
The attention mechanism has been shown to improve the quality of neural machine translation by selectively focusing on a subset of source-sentence words at each step of translation. However, local attention is usually computed solely from the linear index distance between words, ignoring the syntactic structure of the sentence. In this paper, we extend local attention with a syntax-distance constraint and propose an attention mechanism based on a new syntactic branch distance, which simultaneously attends to words at similar linear index distances and to syntactically related words. On the English-to-German translation task, experimental results show that our model outperforms a recent baseline by 1.61 BLEU points, demonstrating the effectiveness of the proposed approach.
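The paper's exact scoring formula is not reproduced on this page, so the following is only a minimal sketch of the idea behind distance-biased local attention, assuming a Luong-style Gaussian window penalty. The array `branch_dist` (each source word's syntactic distance, e.g. tree-path length, to the word nearest the predicted focus) and the mixing weight `lam` are hypothetical stand-ins for the paper's syntactic branch distance, not its actual definition.

```python
import numpy as np

def local_attention_weights(scores, p_t, branch_dist, window=10, lam=0.5):
    """Bias raw attention scores toward source positions that are close
    to the predicted focus p_t in BOTH linear index distance and a
    syntactic (tree) distance.

    scores      : (src_len,) raw alignment scores for one decoder step
    p_t         : predicted focus position (float), as in Luong-style
                  local attention
    branch_dist : (src_len,) hypothetical syntactic branch distance of
                  each source word to the focus word
    window      : half-width D of the local window; sigma = D / 2
    lam         : interpolation weight between the two distances
    """
    positions = np.arange(len(scores), dtype=np.float64)
    sigma = window / 2.0
    # Combined distance: linear index distance mixed with tree distance.
    linear = (positions - p_t) ** 2
    combined = (1.0 - lam) * linear + lam * branch_dist ** 2
    # Gaussian penalty applied multiplicatively before normalization,
    # analogous to the local-p attention of Luong et al.
    penalty = np.exp(-combined / (2.0 * sigma ** 2))
    weights = np.exp(scores - scores.max()) * penalty
    return weights / weights.sum()

# Toy usage: 6 source words, predicted focus near position 2.
scores = np.array([0.1, 0.9, 2.0, 0.3, 0.2, 0.1])
branch_dist = np.array([3.0, 1.0, 0.0, 4.0, 2.0, 5.0])  # tree hops (illustrative)
print(local_attention_weights(scores, p_t=2.0, branch_dist=branch_dist))
```

Under such a scheme, a word that is far away in linear order but only a branch or two away in the parse tree receives a smaller penalty than under purely positional local attention, which matches the abstract's description of jointly attending to nearby and syntax-related words.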
Acknowledgements
This work was supported by the National Natural Science Foundation of China (No. 61772146).
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Peng, R., Chen, Z., Hao, T., Fang, Y. (2019). Neural Machine Translation with Attention Based on a New Syntactic Branch Distance. In: Huang, S., Knight, K. (eds) Machine Translation. CCMT 2019. Communications in Computer and Information Science, vol 1104. Springer, Singapore. https://doi.org/10.1007/978-981-15-1721-1_5
DOI: https://doi.org/10.1007/978-981-15-1721-1_5
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-1720-4
Online ISBN: 978-981-15-1721-1
eBook Packages: Computer Science, Computer Science (R0)