TBCNN for Dependency Trees in Natural Language Processing

Tree-Based Convolutional Neural Networks

Part of the book series: SpringerBriefs in Computer Science ((BRIEFSCOMPUTER))

Abstract

This chapter applies tree-based convolution to the dependency parse trees of natural language sentences, resulting in a new variant, d-TBCNN. Because dependency trees differ from the abstract syntax trees of Chap. 4 and the constituency trees of Chap. 5, new model gadgets must be designed for d-TBCNN. The model is evaluated on two sentence classification tasks (sentiment analysis and question classification) and a sentence matching task. On the classification tasks, d-TBCNN outperforms the previous state of the art; on the matching task, it achieves performance comparable to the previous state-of-the-art model, which has higher matching complexity.
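The tree-based convolution mentioned above slides a window over each node of the dependency tree together with its direct dependents. As a rough illustration only: the sizes, variable names, and the use of one weight matrix per dependency relation are assumptions for this sketch, not the chapter's exact parameterization.

```python
import numpy as np

EMB, CONV = 50, 100  # embedding and feature-map sizes (illustrative)
rng = np.random.default_rng(0)

# One weight matrix for the head word and, hypothetically, one per
# dependency relation for its children, plus a bias term.
W_head = rng.normal(scale=0.1, size=(CONV, EMB))
W_rel = {r: rng.normal(scale=0.1, size=(CONV, EMB))
         for r in ("nsubj", "dobj", "det")}
b = np.zeros(CONV)

def conv_unit(head_vec, children):
    """One convolution window: a head word and its direct dependents.

    `children` is a list of (relation, child_vector) pairs; the child's
    contribution is weighted by the matrix of its dependency relation.
    """
    z = W_head @ head_vec + b
    for rel, vec in children:
        z += W_rel[rel] @ vec
    return np.maximum(z, 0.0)  # ReLU activation

# Toy fragment "the cat sat", with 'sat' as the head word.
vec = {w: rng.normal(size=EMB) for w in ("the", "cat", "sat")}
y = conv_unit(vec["sat"], [("nsubj", vec["cat"]), ("det", vec["the"])])
# Pooling over all windows in the tree would follow to give a fixed-size
# sentence vector.
```

The sketch computes one feature map over a single window; a full model applies it at every tree node and pools.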

Parts of the contents of this chapter were published in [17, 18]. Copyright © 2015, 2016, Association for Computational Linguistics. Implementation code is available through our websites (https://sites.google.com/site/tbcnnsentence/ and https://sites.google.com/site/tbcnninference/).
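For the sentence-matching task, [18] combines the two sentence vectors heuristically rather than through expensive pairwise interaction. A minimal sketch, under the assumption that the combined features are concatenation, element-wise difference, and element-wise product (the function name is illustrative):

```python
import numpy as np

def match_features(h1, h2):
    """Heuristic matching of two sentence vectors: concatenate the two
    vectors with their element-wise difference and product (assumed
    feature layout)."""
    return np.concatenate([h1, h2, h1 - h2, h1 * h2])

# Two toy 4-dimensional sentence vectors.
h1, h2 = np.ones(4), np.full(4, 2.0)
features = match_features(h1, h2)
print(features.shape)  # → (16,)
```

A classifier on top of these features then predicts the relation (e.g., entailment) between the two sentences; this keeps matching complexity linear in the vector dimension.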


Notes

  1. The example is adapted from [12].

  2. http://nlp.stanford.edu/software/lex-parser.shtml.

  3. Strictly speaking, NLI does not aim to "match" two sentences in the information-retrieval sense. However, NLI does model a pair of sentences, and can be viewed as matching them in terms of entailment; we therefore allow ourselves this slight abuse of terminology in our book.

  4. http://nlp.stanford.edu/projects/snli/.

  5. We applied collapsed dependency trees, where prepositions and conjunctions are annotated on the dependency relations but these auxiliary words themselves are removed.
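The collapsing in Note 5 can be illustrated on toy dependency edges. The edge labels (`prep`, `pobj`) and the helper below are assumptions for illustration, not the parser's exact output format:

```python
# Toy dependency edges as (head, relation, dependent) triples for
# "(the cat) sat on (the) mat"; "on" is a preposition to be collapsed.
edges = [("sat", "nsubj", "cat"),
         ("sat", "prep", "on"),
         ("on", "pobj", "mat")]

def collapse(edges, function_words=("on",)):
    """Fold each preposition into its relation label and drop the word."""
    out = []
    for head, rel, dep in edges:
        if rel == "prep" and dep in function_words:
            # Reattach the preposition's object directly to the head,
            # annotating the relation with the preposition itself.
            for h2, r2, d2 in edges:
                if h2 == dep and r2 == "pobj":
                    out.append((head, f"prep_{dep}", d2))
        elif head in function_words:
            continue  # its dependent was already reattached above
        else:
            out.append((head, rel, dep))
    return out

print(collapse(edges))
# → [('sat', 'nsubj', 'cat'), ('sat', 'prep_on', 'mat')]
```

The word "on" disappears from the tree, and its information survives in the collapsed relation `prep_on`, which in turn selects the relation-specific weights during convolution.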

References

  1. Bos, J., Markert, K.: Combining shallow and deep NLP methods for recognizing textual entailment. In: Proceedings of the First PASCAL Challenges Workshop on Recognising Textual Entailment, pp. 65–68 (2005)

  2. Bowman, S.R., Angeli, G., Potts, C., Manning, C.D.: A large annotated corpus for learning natural language inference. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp. 632–642 (2015)

  3. Braud, C., Denis, P.: Comparing word representations for implicit discourse relation classification. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp. 2201–2211 (2015)

  4. Bromley, J., Guyon, I., LeCun, Y., Säckinger, E., Shah, R.: Signature verification using a "Siamese" time delay neural network. In: Advances in Neural Information Processing Systems, pp. 737–744 (1994)

  5. Collobert, R., Weston, J.: A unified architecture for natural language processing: deep neural networks with multitask learning. In: Proceedings of the 25th International Conference on Machine Learning, pp. 160–167 (2008)

  6. Fox, C.: A stop list for general text. In: ACM SIGIR Forum, pp. 19–21 (1989)

  7. Harabagiu, S., Hickl, A.: Methods for using textual entailment in open-domain question answering. In: Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics, pp. 905–912 (2006)

  8. Harabagiu, S., Hickl, A., Lacatusu, F.: Negation, contrast and contradiction in text processing. In: Proceedings of AAAI Conference on Artificial Intelligence, pp. 755–762 (2006)

  9. He, H., Gimpel, K., Lin, J.: Multi-perspective sentence similarity modeling with convolutional neural networks. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp. 17–21 (2015)

  10. Hu, B., Lu, Z., Li, H., Chen, Q.: Convolutional neural network architectures for matching natural language sentences. In: Advances in Neural Information Processing Systems, pp. 2042–2050 (2014)

  11. Irsoy, O., Cardie, C.: Deep recursive neural networks for compositionality in language. In: Advances in Neural Information Processing Systems, pp. 2096–2104 (2014)

  12. Jurafsky, D., Martin, J.: Speech and Language Processing. Pearson Education (2000)

  13. Kim, Y.: Convolutional neural networks for sentence classification. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, pp. 1746–1751 (2014)

  14. Lacatusu, F., Hickl, A., Roberts, K., Shi, Y., Bensley, J., Rink, B., Wang, P., Taylor, L.: LCC's GISTexter at DUC 2006: multi-strategy multi-document summarization. In: Proceedings of DUC 2006 (2006)

  15. MacCartney, B., Grenager, T., de Marneffe, M.C., Cer, D., Manning, C.D.: Learning to recognize features of valid textual entailments. In: Proceedings of the Human Language Technology Conference of the NAACL, pp. 41–48 (2006)

  16. Mikolov, T., Yih, W.T., Zweig, G.: Linguistic regularities in continuous space word representations. In: Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 746–751 (2013)

  17. Mou, L., Peng, H., Li, G., Xu, Y., Zhang, L., Jin, Z.: Discriminative neural sentence modeling by tree-based convolution. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp. 2315–2325 (2015)

  18. Mou, L., Men, R., Li, G., Xu, Y., Zhang, L., Yan, R., Jin, Z.: Natural language inference by tree-based convolution and heuristic matching. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, vol. 2, pp. 130–136 (2016)

  19. Rocktäschel, T., Grefenstette, E., Hermann, K.M., Kočiskỳ, T., Blunsom, P.: Reasoning about entailment with neural attention. In: Proceedings of the International Conference on Learning Representations (2016)

  20. Silva, J., Coheur, L., Mendes, A., Wichert, A.: From symbolic to sub-symbolic information in question classification. Artif. Intell. Rev. 35(2), 137–154 (2011)

  21. Socher, R., Pennington, J., Huang, E., Ng, A., Manning, C.: Semi-supervised recursive autoencoders for predicting sentiment distributions. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing, pp. 151–161 (2011)

  22. Tai, K., Socher, R., Manning, C.D.: Improved semantic representations from tree-structured long short-term memory networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics, pp. 1556–1566 (2015)

  23. Vendrov, I., Kiros, R., Fidler, S., Urtasun, R.: Order-embeddings of images and language. In: Proceedings of International Conference on Learning Representations (2016)

  24. Yan, R., Song, Y., Wu, H.: Learning to respond with deep neural networks for retrieval-based human-computer conversation system. In: Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 55–64 (2016)

Author information


Correspondence to Lili Mou.

Copyright information

© 2018 The Author(s)

About this chapter

Cite this chapter

Mou, L., Jin, Z. (2018). TBCNN for Dependency Trees in Natural Language Processing. In: Tree-Based Convolutional Neural Networks. SpringerBriefs in Computer Science. Springer, Singapore. https://doi.org/10.1007/978-981-13-1870-2_6

  • DOI: https://doi.org/10.1007/978-981-13-1870-2_6

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-1869-6

  • Online ISBN: 978-981-13-1870-2

  • eBook Packages: Computer Science (R0)
