
End-to-End Aspect-Based Sentiment Analysis Based on IDCNN-BLSA Feature Fusion

  • Conference paper
Knowledge and Systems Sciences (KSS 2023)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1927)

Abstract

Existing end-to-end Aspect-Based Sentiment Analysis (ABSA) algorithms rely on a single model for feature extraction, which leads to the loss of important local or global information. To capture both the local and global information of sentences, an end-to-end ABSA method based on feature fusion is proposed. First, the pre-trained model BERT is applied to obtain word vectors; second, Iterated Dilated Convolutional Neural Networks (IDCNN) and a Bi-directional Long Short-Term Memory (BiLSTM) network with a Self-Attention mechanism (BLSA) are adopted to capture the local and global features of sentences, and the resulting local and context-dependency vectors are fused into feature vectors. Finally, a Conditional Random Field (CRF) is applied to predict aspect words and sentiment polarity simultaneously. On the Laptop14 and Restaurant datasets, our model’s F1 scores improve by 0.51% and 3.11%, respectively, over the best model in the comparison experiments, and by 0.74% and 0.78%, respectively, over the best-performing single model in the ablation experiments, in which each key module is removed in turn and the resulting variants are compared with the full model. The experimental results demonstrate the effectiveness of the method for aspect word recognition and its stronger generalization ability.
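The abstract describes the pipeline only at a high level. The PyTorch sketch below illustrates one way the described fusion could be wired, assuming pre-computed BERT token embeddings as input; the dilation schedule, hidden sizes, unified tag count, and fusion by concatenation are illustrative assumptions rather than the configuration reported in the paper, and the CRF decoder is indicated only as a comment.

```python
import torch
import torch.nn as nn


class IDCNNBranch(nn.Module):
    """Iterated dilated 1-D convolutions: the local-feature branch."""

    def __init__(self, dim, dilations=(1, 2, 4), iterations=2):
        super().__init__()
        # Each iteration reuses the same dilation schedule, widening the
        # receptive field while keeping the sequence length unchanged.
        self.blocks = nn.ModuleList([
            nn.Conv1d(dim, dim, kernel_size=3, padding=d, dilation=d)
            for _ in range(iterations) for d in dilations
        ])
        self.act = nn.ReLU()

    def forward(self, x):                # x: (batch, seq, dim)
        h = x.transpose(1, 2)            # Conv1d expects (batch, dim, seq)
        for conv in self.blocks:
            h = self.act(conv(h))
        return h.transpose(1, 2)         # back to (batch, seq, dim)


class BLSABranch(nn.Module):
    """BiLSTM followed by self-attention: the global-dependency branch."""

    def __init__(self, dim, hidden=128, heads=4):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)

    def forward(self, x):
        h, _ = self.lstm(x)              # (batch, seq, 2 * hidden)
        out, _ = self.attn(h, h, h)      # self-attention over the sequence
        return out


class IDCNNBLSATagger(nn.Module):
    """Fuses local and global features and scores unified aspect-sentiment tags."""

    def __init__(self, bert_dim=768, hidden=128, num_tags=7):
        # num_tags = 7 assumes a unified scheme such as {B,I}-{POS,NEG,NEU} + O.
        super().__init__()
        self.local = IDCNNBranch(bert_dim)
        self.global_branch = BLSABranch(bert_dim, hidden)
        self.proj = nn.Linear(bert_dim + 2 * hidden, num_tags)

    def forward(self, bert_embeddings):  # (batch, seq, bert_dim)
        fused = torch.cat(
            [self.local(bert_embeddings), self.global_branch(bert_embeddings)],
            dim=-1,
        )
        # The paper decodes emission scores with a CRF layer; a CRF
        # (e.g. from the pytorch-crf package) would be applied to this output.
        return self.proj(fused)


# Toy usage: 2 sentences, 10 tokens each, 768-dimensional BERT vectors.
emissions = IDCNNBLSATagger()(torch.randn(2, 10, 768))
print(emissions.shape)                   # torch.Size([2, 10, 7])
```

Concatenation keeps the two feature streams intact before tag scoring; the paper's actual fusion and decoding details are given in the full text.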

Acknowledgement

This work is supported by the National Key Research and Development Program project “Research on data-driven comprehensive quality accurate service technology for small medium and micro enterprises” (Grant No. 2019YFB1405303), the Project of Cultivation for Young Top-notch Talents of Beijing Municipal Institutions “Research on the comprehensive quality intelligent service and optimized technology for small medium and micro enterprises” (Grant No. BPHR202203233), and the National Natural Science Foundation of China project “Research on the influence and governance strategy of online review manipulation with the perspective of E-commerce ecosystem” (Grant No. 72174018).

Author information

Corresponding author

Correspondence to Jindong Chen.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Liu, X., Chen, J., Zhang, W. (2023). End-to-End Aspect-Based Sentiment Analysis Based on IDCNN-BLSA Feature Fusion. In: Chen, J., Huynh, V.N., Tang, X., Wu, J. (eds.) Knowledge and Systems Sciences. KSS 2023. Communications in Computer and Information Science, vol. 1927. Springer, Singapore. https://doi.org/10.1007/978-981-99-8318-6_4

  • DOI: https://doi.org/10.1007/978-981-99-8318-6_4

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8317-9

  • Online ISBN: 978-981-99-8318-6

  • eBook Packages: Computer Science, Computer Science (R0)
