Hierarchical Hybrid Attention Networks for Chinese Conversation Topic Classification

  • Yujun Zhou
  • Changliang Li
  • Bo Xu
  • Jiaming Xu
  • Jie Cao
  • Bo Xu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10635)

Abstract

Topic classification is useful for applications such as forensic analysis and cyber-crime investigation. To improve performance on Chinese conversation topic classification, we propose a hierarchical neural network with automatic semantic feature selection, whose hierarchical architecture mirrors the structure of conversations. The model first incorporates speaker information into character- and word-level attention to generate sentence representations, then uses an attention-based BLSTM to construct the conversation representation. Experimental results on three datasets demonstrate that our model outperforms multiple baselines, indicating that the proposed architecture captures the informative and salient features related to the meaning of a conversation. The dataset used in this paper is available at https://github.com/njoe9/H-HANs.
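To make the two-level pooling concrete, here is a minimal NumPy sketch of attention pooling applied hierarchically: word-level hidden states are pooled into sentence vectors, and sentence vectors are pooled into a conversation vector. All dimensions, parameter names (`W`, `b`, `u`), and the random inputs are illustrative assumptions; the paper's actual model additionally uses BLSTM encoders, character-level attention, and speaker embeddings, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, b, u):
    """Pool a sequence of hidden states H (T, d) into one vector.

    Scores each state with a small feed-forward layer and a context
    vector u, normalises with softmax, and returns the weighted sum.
    """
    scores = np.tanh(H @ W + b) @ u   # (T,) unnormalised scores
    alpha = softmax(scores)           # attention weights, sum to 1
    return alpha @ H, alpha           # pooled vector (d,), weights (T,)

d = 8  # illustrative hidden size
W, b, u = rng.normal(size=(d, d)), np.zeros(d), rng.normal(size=d)

# Word level: pool each sentence's word states into a sentence vector.
sentences = [rng.normal(size=(T, d)) for T in (5, 7, 4)]
sent_vecs = np.stack([attention_pool(H, W, b, u)[0] for H in sentences])

# Sentence level: pool sentence vectors into a conversation vector,
# which would then feed a softmax topic classifier.
conv_vec, alpha = attention_pool(sent_vecs, W, b, u)
print(conv_vec.shape)  # (8,)
```

In the full model the two levels would use separate parameters and sit on top of recurrent encoders; sharing `W`, `b`, `u` here only keeps the sketch short.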

Keywords

Hierarchical attention networks · Chinese conversation · Topic classification · Recurrent neural networks

Notes

Acknowledgments

This work is supported by the National Natural Science Foundation (No. 61602479), National High Technology Research and Development Program of China (No. 2015AA015402) and National Key Technology R&D Program of China under No. 2015BAH53F02.

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Yujun Zhou (1, 2, 3)
  • Changliang Li (1)
  • Bo Xu (1)
  • Jiaming Xu (1)
  • Jie Cao (1, 2, 3)
  • Bo Xu (1)
  1. Institute of Automation, Chinese Academy of Sciences, Beijing, People’s Republic of China
  2. University of Chinese Academy of Sciences, Beijing, People’s Republic of China
  3. Jiangsu Jinling Science and Technology Group Co., Ltd., Nanjing, People’s Republic of China