Fast and Effective Neural Networks for Translating Natural Language into Denotations

  • Tiago Pimentel
  • Juliano Viana
  • Adriano Veloso
  • Nivio Ziviani
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11147)

Abstract

In this paper we study the semantic parsing problem of mapping natural language utterances into machine-interpretable meaning representations. We consider a text-to-denotation application scenario in which a user interacts with a non-human assistant by entering a question, which is translated into a structured logical query; the result of running this query is then returned to the user as the response. We propose encoder-decoder models that are trained end-to-end on input questions paired with their corresponding structured logical queries. To ensure fast response times, our models do not condition the generation of the target string on previously generated tokens. We evaluate our models on real data obtained from a conversational banking chat service and show that conditionally-independent translation models achieve accuracy comparable to that of sophisticated translation models while offering response times that are one order of magnitude faster.
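
As a rough illustration of the conditionally-independent decoding described in the abstract, the following PyTorch-style sketch predicts every token of the target logical query in parallel from the encoded question, rather than autoregressively. The class name, the per-position output heads, the mean-pooled source encoding, and all hyperparameters are illustrative assumptions, not the authors' actual architecture.

```python
# A minimal sketch of a conditionally-independent ("non-autoregressive")
# encoder-decoder, assuming hypothetical vocabulary sizes and a fixed maximum
# target length. Illustration only, not the paper's exact model.
import torch
import torch.nn as nn


class IndependentDecoderSeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, max_tgt_len, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, d_model)
        # Bidirectional GRU encoder summarises the input question.
        self.encoder = nn.GRU(d_model, d_model, batch_first=True,
                              bidirectional=True)
        # One output head per target position: every token of the logical
        # query is predicted in parallel, independently of the other tokens.
        self.heads = nn.ModuleList(
            [nn.Linear(2 * d_model, tgt_vocab) for _ in range(max_tgt_len)]
        )

    def forward(self, src_ids):                     # src_ids: (batch, src_len)
        enc, _ = self.encoder(self.embed(src_ids))  # (batch, src_len, 2*d)
        summary = enc.mean(dim=1)                   # pooled source encoding
        # Logits for all target positions at once: (batch, max_tgt_len, vocab)
        return torch.stack([head(summary) for head in self.heads], dim=1)


model = IndependentDecoderSeq2Seq(src_vocab=5000, tgt_vocab=300, max_tgt_len=20)
logits = model(torch.randint(0, 5000, (8, 12)))     # toy batch of 8 questions
prediction = logits.argmax(dim=-1)                  # greedy, fully parallel decode
```

Because no target position waits on the previous one, the whole logical query is emitted in a single forward pass, which is what makes this family of models attractive when response latency matters.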

Notes

Acknowledgements

We gratefully acknowledge the partial support of the project Models, Algorithms and Systems for the Web (grant FAPEMIG/PRONEX/MASWeb APQ-01400-14), as well as the authors' individual grants and scholarships from CNPq and Kunumi.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Tiago Pimentel (1, 2)
  • Juliano Viana (2)
  • Adriano Veloso (1)
  • Nivio Ziviani (1, 2)
  1. CS Department, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil
  2. Kunumi, Belo Horizonte, Brazil