Aspect-Level Sentiment Classification with Conv-Attention Mechanism

  • Qian Yi
  • Jie Liu
  • Guixuan Zhang
  • Shuwu Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11304)

Abstract

The aim of aspect-level sentiment classification is to identify the sentiment polarity of a sentence with respect to a target aspect. Existing methods model the context sequence with recurrent networks and employ an attention mechanism to generate aspect-specific representations. In this paper, we introduce a novel mechanism called Conv-Attention, which models the sequential information of context words and generates aspect-specific attention simultaneously via a convolution operation. Based on this mechanism, we design a new framework for aspect-level sentiment classification called the Conv-Attention Network (CAN). Compared to previous attention-based recurrent models, the Conv-Attention Network computes much faster. Extensive experimental results show that our model achieves state-of-the-art performance while saving considerable time in model training and inference.
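The abstract describes a mechanism in which a single convolution over the context both captures local sequential information and drives aspect-specific attention. The paper's exact formulation is not reproduced here; the following is a minimal NumPy sketch of that general idea, with all shapes, names (`conv_attention`, filter tensor `W`), and the scoring function chosen for illustration rather than taken from the paper:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def conv_attention(context, aspect, W, b):
    """Illustrative Conv-Attention step (hypothetical shapes and names).

    context: (n, d) context word embeddings
    aspect:  (d,)   averaged aspect embedding
    W:       (k, d, d) width-k convolution filters
    b:       (d,)   bias
    """
    n, d = context.shape
    k = W.shape[0]
    pad = k // 2
    padded = np.vstack([np.zeros((pad, d)), context, np.zeros((pad, d))])
    # 1-D convolution over the context: each position sees a k-word window,
    # so sequential information is captured without any recurrence.
    feats = np.stack([
        sum(padded[i + j] @ W[j] for j in range(k)) + b
        for i in range(n)
    ])
    # Aspect-specific attention computed directly from the convolved features.
    scores = feats @ aspect
    alpha = softmax(scores)
    # Attention-weighted sum yields the aspect-specific representation.
    return alpha @ feats, alpha
```

Because the convolution at every position is independent of the others, all positions can be computed in parallel, which is the source of the speed advantage over sequential recurrent models that the abstract claims.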

Keywords

Sentiment analysis · Convolutional neural network · Attention

Notes

Acknowledgments

This work has been supported by the National Key R&D Program of China under Grant NO.2017YFB1401000 and the Key Laboratory of Digital Rights Services, which is one of the National Science and Standardization Key Labs for Press and Publication Industry.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Qian Yi (1, 3)
  • Jie Liu (1, 2)
  • Guixuan Zhang (1, 2)
  • Shuwu Zhang (1, 2)
  1. Beijing Engineering Research Center of Digital Content Technology, Institute of Automation, Chinese Academy of Sciences, Beijing, China
  2. Advanced Innovation Center for Future Visual Entertainment, Beijing, China
  3. University of Chinese Academy of Sciences, Beijing, China