Research on Construction Method of Chinese NT Clause Based on Attention-LSTM

  • Teng Mao
  • Yuyao Zhang
  • Yuru Jiang
  • Yangsen Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11109)


The correct definition and recognition of sentences is a foundation of NLP. Addressing the structural characteristics of Chinese text, the theory of the NT clause was proposed from the perspective of micro topics. Based on this theory, this paper proposes a novel method for constructing NT clauses. First, it presents a neural network model that combines an attention mechanism with an LSTM (Attention-LSTM) to identify the location of the missing Naming, and trains the Attention-LSTM on a manually annotated corpus. Second, during NT clause construction, the trained Attention-LSTM is used to identify the location of the missing Naming, after which the NT clause can be constructed. The accuracy of the experimental result is 81.74% (+4.5%). This work can support text-understanding tasks such as Machine Translation, Information Extraction, and Man-Machine Dialogue.
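The abstract describes using attention over LSTM hidden states to score positions as candidate locations of the missing Naming. As a minimal sketch of that scoring step (not the paper's actual architecture; the scoring vector `w` and the function `attention_positions` are hypothetical simplifications, and the LSTM hidden states are stubbed with random values):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_positions(hidden_states, w):
    """Score each time step of the LSTM hidden states with a learned
    vector w, and return a softmax distribution over positions.
    The argmax is taken as the candidate location of the missing
    Naming (a hypothetical simplification of the paper's model)."""
    scores = hidden_states @ w        # (T,) one score per position
    weights = softmax(scores)         # attention distribution over positions
    return weights, int(np.argmax(weights))

# Toy example: 5 time steps, hidden size 4 (random stand-ins for
# hidden states an LSTM encoder would produce).
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
w = rng.standard_normal(4)
weights, pos = attention_positions(H, w)
```

In the full model, `H` would come from an LSTM encoder trained jointly with the attention parameters on the annotated corpus, so that the distribution peaks at the position where the Naming should be inserted.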


Keywords: NT clause · Attention-LSTM · Text understanding



This work was supported by grants from the National Natural Science Foundation of China (No. 61602044 and No. 61370139) and the Scientific Research Project of the Beijing Educational Committee (No. KM201711232022).



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Teng Mao (1)
  • Yuyao Zhang (1)
  • Yuru Jiang (1) (email author)
  • Yangsen Zhang (1)
  1. Institute of Intelligent Information Processing, Beijing Information Science and Technology University, Beijing, China
