Abstract
As a critical component of natural language processing, text sentiment analysis plays an increasingly important role in areas such as recommendation systems, customer-review analysis, and opinion mining. To extract the sentiment embedded in text more accurately, this paper proposes the BTB model, a text sentiment classification model that integrates BERT, TextCNN, and BiLSTM. In the BTB model, the generated word vectors carry richer prior information, and both the salient local features of the text and its contextual information are used efficiently in sentiment analysis. Experimental results show that the BTB model achieves high precision, recall, and F1 scores, and has good generalisation ability.
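The abstract describes a pipeline in which BERT supplies contextual word vectors, a TextCNN branch extracts salient local features, and a BiLSTM branch captures bidirectional context before classification. A minimal PyTorch sketch of such a hybrid is shown below; the layer sizes, kernel sizes, and fusion-by-concatenation design are illustrative assumptions (the paper's exact configuration is not given in the abstract), and a random tensor stands in for BERT's last hidden states.

```python
import torch
import torch.nn as nn

class BTB(nn.Module):
    """Sketch of a BERT-TextCNN-BiLSTM classifier (hyperparameters assumed)."""

    def __init__(self, bert_dim=768, num_classes=2,
                 conv_channels=64, kernel_sizes=(2, 3, 4), lstm_hidden=128):
        super().__init__()
        # Parallel 1-D convolutions capture local n-gram features (TextCNN).
        self.convs = nn.ModuleList(
            nn.Conv1d(bert_dim, conv_channels, k) for k in kernel_sizes)
        # A bidirectional LSTM captures long-range context in both directions.
        self.bilstm = nn.LSTM(bert_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        feat_dim = conv_channels * len(kernel_sizes) + 2 * lstm_hidden
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, bert_embeddings):
        # bert_embeddings: (batch, seq_len, bert_dim), e.g. the last
        # hidden states of a pretrained BERT encoder.
        x = bert_embeddings.transpose(1, 2)        # (batch, dim, seq_len)
        # Max-pool each convolution's output over the sequence dimension.
        cnn_feats = torch.cat(
            [torch.relu(c(x)).max(dim=2).values for c in self.convs], dim=1)
        _, (h_n, _) = self.bilstm(bert_embeddings)
        # Concatenate the final forward and backward hidden states.
        lstm_feats = torch.cat([h_n[-2], h_n[-1]], dim=1)
        return self.classifier(torch.cat([cnn_feats, lstm_feats], dim=1))

# Dummy embeddings standing in for BERT output: batch of 4, 20 tokens.
model = BTB()
logits = model(torch.randn(4, 20, 768))
print(logits.shape)  # torch.Size([4, 2])
```

In practice the dummy tensor would be replaced by the output of a pretrained BERT encoder, and the two branches' concatenated features feed a single linear classification head.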
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Zou, S., Zhang, M., Zong, X., Zhou, H. (2022). Text Sentiment Analysis Based on BERT-TextCNN-BILSTM. In: Liu, Q., Liu, X., Cheng, J., Shen, T., Tian, Y. (eds) Proceedings of the 12th International Conference on Computer Engineering and Networks. CENet 2022. Lecture Notes in Electrical Engineering, vol 961. Springer, Singapore. https://doi.org/10.1007/978-981-19-6901-0_136
Print ISBN: 978-981-19-6900-3
Online ISBN: 978-981-19-6901-0