
Two-Stage Attention Network for Aspect-Level Sentiment Classification

  • Kai Gao
  • Hua Xu
  • Chengliang Gao
  • Xiaomin Sun
  • Junhui Deng
  • Xiaoming Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11304)

Abstract

Currently, most attention-based works adopt a single-stage attention process when generating context representations toward an aspect, and they lack a deliberation process: a generated, aspect-related representation is used directly as the final output without further polishing. In this work, we introduce a deliberation process to model the context and further polish the attention weights, and propose a two-stage attention network for aspect-level sentiment classification. The network uses a two-stage attention model built on an LSTM, where the first-stage attention generates a raw aspect-related representation and the second-stage attention polishes and refines this raw representation through deliberation. Since the deliberation component has global information about what the representation to be generated might be, it has the potential to produce a better aspect-related representation by looking a second time into the hidden states produced by the LSTM. Experimental results on the Laptop dataset of SemEval-2016 Task 5 indicate that our model achieves a state-of-the-art accuracy of 76.56%.
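The following is a minimal sketch of how such a two-stage (deliberation) attention over LSTM hidden states could be wired up, assuming PyTorch; the layer sizes, aspect-embedding lookup, and scoring functions are illustrative placeholders rather than the authors' exact formulation.

```python
# Sketch: first-stage attention builds a raw aspect-related summary r1;
# second-stage attention re-reads the LSTM hidden states conditioned on r1
# (the "deliberation") to produce a polished representation r2.
import torch
import torch.nn as nn

class TwoStageAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # First-stage attention: score each hidden state against the aspect.
        self.attn1 = nn.Linear(hidden_dim + embed_dim, 1)
        # Second-stage attention: re-score hidden states given the aspect
        # and the first-stage summary (global information).
        self.attn2 = nn.Linear(hidden_dim + embed_dim + hidden_dim, 1)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens, aspect):
        # tokens: (batch, seq_len) word ids; aspect: (batch,) aspect word ids
        h, _ = self.lstm(self.embed(tokens))            # (batch, seq_len, hidden)
        a = self.embed(aspect).unsqueeze(1).expand(-1, h.size(1), -1)

        # Stage 1: raw aspect-related representation r1.
        s1 = self.attn1(torch.cat([h, a], dim=-1)).squeeze(-1)
        w1 = torch.softmax(s1, dim=-1)                  # (batch, seq_len)
        r1 = torch.bmm(w1.unsqueeze(1), h).squeeze(1)   # (batch, hidden)

        # Stage 2: deliberation looks at the hidden states a second time,
        # now conditioned on r1, and yields the polished representation r2.
        r1_exp = r1.unsqueeze(1).expand(-1, h.size(1), -1)
        s2 = self.attn2(torch.cat([h, a, r1_exp], dim=-1)).squeeze(-1)
        w2 = torch.softmax(s2, dim=-1)
        r2 = torch.bmm(w2.unsqueeze(1), h).squeeze(1)
        return self.out(r2)                             # sentiment logits
```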

Keywords

Attention mechanism · LSTM · Text representation · Aspect-level sentiment classification

Notes

Acknowledgments

This paper is sponsored by the National Science Foundation of China (61673235, 61772075) and the National Science Foundation of Hebei Province (F2017208012). It is also sponsored by the Key Research Project for Hebei University of Science & Technology (2016ZDYY03) and the Graduate Student Innovation Project of Hebei Province (CXZZSS2017095).


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Kai Gao (1, 2)
  • Hua Xu (1)
  • Chengliang Gao (1, 2)
  • Xiaomin Sun (1)
  • Junhui Deng (1)
  • Xiaoming Zhang (2)
  1. Department of Computer Science and Technology, Tsinghua University, Beijing, China
  2. School of Information Science and Engineering, Hebei University of Science and Technology, Shijiazhuang, China
