Attention Aware Bidirectional Gated Recurrent Unit Based Framework for Sentiment Analysis

  • Zhengxi Tian
  • Wenge Rong
  • Libin Shi
  • Jingshuang Liu
  • Zhang Xiong
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11061)

Abstract

Sentiment analysis is an effective technique widely employed to analyze the sentiment polarity of reviews and comments on the Internet, and many advanced methods have been developed for this task. In this paper, we propose an attention-aware bidirectional GRU (Bi-GRU) framework that classifies sentiment polarity from the perspectives of sentential-sequence modeling and word-feature capture. It is composed of a pre-attention Bi-GRU, which models the sentence so as to incorporate the complex interactions between words, and an attention layer that captures the keywords relevant to sentiment. A post-attention GRU is then added to imitate the function of a decoder, extracting predicted features conditioned on the preceding components. Experiments on commonly used datasets demonstrate the proposed framework's potential for sentiment classification.
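
To make the pipeline concrete, the following is a minimal PyTorch sketch of the architecture as described in the abstract: embedding, pre-attention Bi-GRU, attention layer, post-attention GRU, and classifier. The layer sizes, the additive form of the attention, and names such as AttnBiGRUClassifier are illustrative assumptions, not the authors' exact configuration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttnBiGRUClassifier(nn.Module):
        """Hypothetical sketch of the described framework; hyperparameters
        and the attention form are assumptions, not the paper's setup."""

        def __init__(self, vocab_size, emb_dim=300, hidden=128, num_classes=2):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim)
            # Pre-attention Bi-GRU models word interactions in both directions.
            self.bigru = nn.GRU(emb_dim, hidden, bidirectional=True,
                                batch_first=True)
            # Additive attention scores each time step (one common choice).
            self.attn = nn.Linear(2 * hidden, 1)
            # Post-attention GRU plays the decoder-like role over the
            # attention-reweighted states.
            self.post_gru = nn.GRU(2 * hidden, hidden, batch_first=True)
            self.out = nn.Linear(hidden, num_classes)

        def forward(self, tokens):                    # tokens: (batch, seq_len)
            h, _ = self.bigru(self.emb(tokens))       # (batch, seq_len, 2*hidden)
            weights = F.softmax(self.attn(h), dim=1)  # (batch, seq_len, 1)
            reweighted = weights * h                  # emphasize sentiment keywords
            _, last = self.post_gru(reweighted)       # last: (1, batch, hidden)
            return self.out(last.squeeze(0))          # class logits

Under these assumptions the attention weights rescale each Bi-GRU state before the post-attention GRU summarizes the sequence, so tokens scored as sentiment keywords contribute more to the final prediction.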

Keywords

Sentiment analysis · Bidirectional GRU · Attention

Acknowledgments

This work was partially supported by the National Natural Science Foundation of China (No. 61332018), and the Fundamental Research Funds for the Central Universities.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. School of Computer Science and Engineering, Beihang University, Beijing, China
  2. Sino-French Engineer School, Beihang University, Beijing, China