
Improved Review Sentiment Analysis with a Syntax-Aware Encoder

  • Jiangfeng Zeng
  • Ming Yang
  • Ke Zhou
  • Xiao Ma
  • Yangtao Wang
  • Xiaodong Xu
  • Zhili Xiao
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11642)

Abstract

Review sentiment analysis has attracted considerable research interest because of the explosive growth in the number of reviews produced in our day-to-day activities. Current work on review sentiment classification typically models each sentence as a sequence of words, training sequence-structured recurrent neural networks (RNNs) end-to-end and optimizing them with stochastic gradient descent (SGD). However, such sequence-structured architectures overlook the syntactic hierarchy among the words in a sentence, and therefore fail to capture the syntactic properties that naturally combine words into phrases. In this paper, we propose to model each sentence of a review with an attention-based dependency tree-LSTM, in which a sentence embedding is obtained from the dependency tree of the sentence together with an attention mechanism over the tree structure. We then feed all the sentence representations into a sequence-structured long short-term memory network (LSTM) and apply an attention mechanism to generate the review embedding for the final sentiment classification. We evaluate our attention-based tree-LSTM model on three public datasets, and the experimental results show that it outperforms state-of-the-art baselines.
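
The abstract describes a two-level architecture: a tree-LSTM over each sentence's dependency tree, followed by a sentence-level LSTM with attention that produces the review embedding. The PyTorch sketch below only illustrates that idea under our own assumptions; it is not the authors' implementation. It uses a standard child-sum tree-LSTM cell (Tai et al.) without the in-tree attention described in the paper, and all names (ChildSumTreeLSTMCell, encode_sentence, ReviewClassifier), dimensions, and the simple softmax attention over sentences are illustrative.

```python
# Minimal sketch of the two-level model described in the abstract:
# (1) a child-sum tree-LSTM composes word embeddings along a sentence's
#     dependency tree into a sentence embedding;
# (2) a sentence-level LSTM with softmax attention pools sentence embeddings
#     into a review embedding for sentiment classification.
# Illustrative only; the paper's in-tree attention is omitted for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChildSumTreeLSTMCell(nn.Module):
    """One composition step of a child-sum tree-LSTM (Tai et al., 2015)."""

    def __init__(self, in_dim, mem_dim):
        super().__init__()
        self.iou = nn.Linear(in_dim, 3 * mem_dim)        # input/output/update gates
        self.iou_h = nn.Linear(mem_dim, 3 * mem_dim, bias=False)
        self.f_x = nn.Linear(in_dim, mem_dim)            # forget gate, one per child
        self.f_h = nn.Linear(mem_dim, mem_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (in_dim,); child_h, child_c: (num_children, mem_dim)
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.iou(x) + self.iou_h(h_sum), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x).unsqueeze(0) + self.f_h(child_h))
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c


def encode_sentence(cell, embeddings, children, root, mem_dim):
    """Recursively encode a dependency tree; children[i] lists the child word
    indices of word i, and root is the index of the head word."""
    def rec(node):
        if children[node]:
            hs, cs = zip(*(rec(ch) for ch in children[node]))
            child_h, child_c = torch.stack(hs), torch.stack(cs)
        else:                                            # leaf: dummy zero child
            child_h = torch.zeros(1, mem_dim)
            child_c = torch.zeros(1, mem_dim)
        return cell(embeddings[node], child_h, child_c)

    h_root, _ = rec(root)
    return h_root                                        # sentence embedding


class ReviewClassifier(nn.Module):
    """Sentence embeddings -> sequence LSTM -> attention pooling -> sentiment."""

    def __init__(self, mem_dim, num_classes):
        super().__init__()
        self.sent_lstm = nn.LSTM(mem_dim, mem_dim, batch_first=True)
        self.attn = nn.Linear(mem_dim, 1)
        self.out = nn.Linear(mem_dim, num_classes)

    def forward(self, sent_embs):                        # (num_sentences, mem_dim)
        h, _ = self.sent_lstm(sent_embs.unsqueeze(0))    # (1, num_sentences, mem_dim)
        alpha = F.softmax(self.attn(h), dim=1)           # attention over sentences
        review_emb = (alpha * h).sum(dim=1)              # (1, mem_dim)
        return self.out(review_emb)                      # unnormalized class scores
```

In use, one would encode each sentence with encode_sentence, using child lists produced by a dependency parser and the head word as the root, stack the resulting sentence embeddings, and pass them to ReviewClassifier to obtain the review-level sentiment scores.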

Keywords

Sentiment analysis · Recurrent neural networks · tree-LSTM · Syntax-aware

Notes

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China under grant No. 61821003 and the National Key Research and Development Program of China under grant No. 2016YFB0800402.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Jiangfeng Zeng (1)
  • Ming Yang (2)
  • Ke Zhou (1)
  • Xiao Ma (3)
  • Yangtao Wang (1)
  • Xiaodong Xu (1)
  • Zhili Xiao (4)
  1. Huazhong University of Science and Technology, Wuhan, China
  2. Wuhan cciisoft Co., Ltd., Wuhan, China
  3. Zhongnan University of Economics and Law, Wuhan, China
  4. Tencent Inc., Shenzhen, China
