DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT

  • Songwen Pei
  • Lulu Wang
  • Tianma Shen
  • Zhong Ning
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11719)

Abstract

With the development of the Internet, text-based data on the web have grown exponentially, and these data carry a large amount of valuable information. As a vital branch of sentiment analysis, aspect sentiment analysis of short texts on social media has attracted the interest of researchers. Aspect sentiment classification is a form of fine-grained textual sentiment classification. Currently, the attention mechanism is mainly combined with RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) networks. Such neural network-based sentiment analysis models not only have a complicated computational structure but also suffer from sequential computational dependence. To address these problems and improve the accuracy of target-based sentiment classification for short texts, we propose a neural network model that combines deep attention with Bidirectional Encoder Representations from Transformers (DA-BERT). The DA-BERT model can fully mine the relationships between target words and emotional words in a sentence, and it requires neither syntactic analysis of sentences nor external knowledge such as a sentiment lexicon. By removing the computational dependencies of the RNN structure, the training speed of the proposed DA-BERT model is greatly improved. Compared with LSTM, TD-LSTM, TC-LSTM, AT-LSTM, ATAE-LSTM, and PAT-LSTM on the SemEval2014 Task 4 dataset, DA-BERT improves aspect sentiment classification accuracy by 13.63% on average with 300-dimensional word vectors.
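The abstract gives only a high-level description of DA-BERT: a BERT encoder whose token representations are related to the target words through an attention layer, with no syntactic parsing or sentiment lexicon. A minimal sketch of that idea in PyTorch follows; the class name DABertSketch, the use of nn.MultiheadAttention as the "deep attention" layer, the mean-pooling of aspect tokens, and the head count are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class DABertSketch(nn.Module):
    """Hypothetical DA-BERT-style classifier: BERT encoding + aspect-query attention."""

    def __init__(self, num_polarities: int = 3, num_heads: int = 8):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Assumed "deep attention": the aspect representation queries all
        # sentence tokens, linking target words to opinion words.
        self.attn = nn.MultiheadAttention(hidden, num_heads, batch_first=True)
        self.classifier = nn.Linear(hidden, num_polarities)

    def forward(self, input_ids, attention_mask, aspect_mask):
        # aspect_mask: 0/1 float tensor marking the aspect-term token positions.
        # Contextual token embeddings from BERT (no RNN, hence no sequential
        # dependence across time steps during training).
        tokens = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # Mean-pool the aspect-term tokens to form the attention query.
        aspect = (tokens * aspect_mask.unsqueeze(-1)).sum(dim=1)
        aspect = aspect / aspect_mask.sum(dim=1, keepdim=True).clamp(min=1.0)
        # Attend over the sentence with the aspect vector; ignore padding.
        ctx, _ = self.attn(aspect.unsqueeze(1), tokens, tokens,
                           key_padding_mask=(attention_mask == 0))
        return self.classifier(ctx.squeeze(1))  # logits over polarity classes
```

On SemEval-2014 Task 4 style data, aspect_mask would mark the token positions of the aspect term (e.g. "food" in "the food was great but the service was slow"), and the three output logits would correspond to the positive, negative, and neutral polarity classes commonly used for that task.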

Keywords

Aspect sentiment classification · BERT · Deep-attention · Multi-attention · Part-of-speech · Sentiment analysis · Short text

Acknowledgements

We would like to thank the anonymous reviewers for their invaluable comments. This work was partially funded by the Shanghai Pujiang Program under Grant 16PJ1407600, the China Post-Doctoral Science Foundation under Grant 2017M610230, the National Natural Science Foundation of China under Grants 61332009 and 61775139, and the Open Project Funding from the State Key Lab of Computer Architecture, ICT, CAS under Grant CARCH201807. Any opinions, findings, and conclusions expressed in this paper are those of the authors and do not necessarily reflect the views of the sponsors.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Songwen Pei (1, 2, 3)
  • Lulu Wang (1)
  • Tianma Shen (1)
  • Zhong Ning (2)
  1. School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai, China
  2. School of Management, Fudan University, Shanghai, China
  3. State Key Laboratory of Computer Architecture, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
