
A Multi-emotion Classification Method Based on BLSTM-MC in Code-Switching Text

  • Tingwei Wang
  • Xiaohua Yang (Email author)
  • Chunping Ouyang
  • Aodong Guo
  • Yongbin Liu
  • Zhixing Li
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11109)

Abstract

Most previous work on emotion classification uses binary or ternary schemes, so the final result assigns only one emotion to each text. There is little research on the coexistence of multiple emotions, which limits how faithfully a person's true emotions can be recovered. To address these deficiencies, this paper proposes a Bidirectional Long Short-Term Memory Multiple Classifiers (BLSTM-MC) model for the five-emotion classification problem in code-switching text. The model captures contextual relations in the text and fully considers the relationships between the different emotions in a single post; at the same time, an attention mechanism is introduced to weight the importance of different features and to predict all emotions expressed by each post. The model placed third among all submissions to NLPCC 2018 Task 1.
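For illustration, the sketch below shows one plausible shape of such a model: a bidirectional LSTM whose time-step states are pooled by additive attention, feeding an independent sigmoid output per emotion so that a single post can receive several labels at once. This is a minimal PyTorch sketch, not the authors' implementation; the class name, dimensions, and five-emotion label set are assumptions.

# Minimal sketch (assumed architecture, not the paper's code): a BiLSTM
# with additive attention and one sigmoid output per emotion, so that a
# single post may be assigned several emotions at once.
import torch
import torch.nn as nn

class BLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_emotions=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.blstm = nn.LSTM(embed_dim, hidden_dim,
                             batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)            # scores each time step
        self.out = nn.Linear(2 * hidden_dim, num_emotions)  # one unit per emotion

    def forward(self, token_ids):                       # (batch, seq_len)
        h, _ = self.blstm(self.embed(token_ids))        # (batch, seq_len, 2*hidden)
        w = torch.softmax(self.attn(h).squeeze(-1), 1)  # attention weights over steps
        ctx = torch.bmm(w.unsqueeze(1), h).squeeze(1)   # weighted context vector
        return torch.sigmoid(self.out(ctx))             # per-emotion probabilities

model = BLSTMAttention(vocab_size=50000)
probs = model(torch.randint(1, 50000, (4, 30)))  # 4 posts, 30 tokens each
labels = probs > 0.5                             # a post may carry several emotions

Training such a multi-label model would use a binary cross-entropy loss (e.g. nn.BCELoss) summed over the emotion outputs, rather than the single softmax cross-entropy of a one-label classifier.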

Keywords

Multiple emotion classification · Code-switching texts · Attention mechanism · BLSTM multiple classifiers

Notes

Acknowledgements

This research is supported by the National Natural Science Foundation of China (Nos. 61402220 and 61502221), the Philosophy and Social Science Foundation of Hunan Province (No. 16YBA323), the Double First-Class Construction Program of USC (2017SYL16), the Scientific and Technological Research Program of the Chongqing Municipal Education Commission (No. KJ1500438), and the Basic and Frontier Research Project of Chongqing, China (No. cstc2015jcyjA40018).


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Tingwei Wang (1)
  • Xiaohua Yang (1) (Email author)
  • Chunping Ouyang (1)
  • Aodong Guo (2)
  • Yongbin Liu (1)
  • Zhixing Li (3)

  1. School of Computer, University of South China, Hengyang, China
  2. College of Information Engineering, Xinhua University, Hefei, China
  3. Chongqing University of Posts and Telecommunications, Chongqing, China
