
Aggregating Class Interactions for Hierarchical Attention Relation Extraction

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10635)

Abstract

Distantly supervised relation extraction is a powerful learning method for recognizing relations between entity pairs. However, the wrong-label problem is inevitable in the large-scale training data produced this way. In this work we propose a hierarchical attention neural network to effectively alleviate the impact of noisy instances. Moreover, in the distantly supervised scenario, connections and dependencies widely appear among relation classes, which we call class interactions. Previous end-to-end methods treat relations as independent and therefore fail to make use of these interactions. To better exploit them, we propose a soft target as the training objective to learn class relationships jointly. Experiments show that our model outperforms state-of-the-art methods.
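A minimal sketch, in plain NumPy, of the two ideas summarized above: selective attention over a bag of instances, which down-weights likely noisy sentences, and a soft training target that mixes the hard label with a distribution over related classes so the objective carries class-interaction information. The function names (bag_attention, soft_target), the attention query, and the class-similarity matrix are illustrative assumptions, not the authors' implementation:

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def bag_attention(instance_reprs, relation_query):
    # Selective attention over a bag: instances that match the relation query
    # get high weights, so noisy instances contribute less to the bag vector.
    scores = instance_reprs @ relation_query          # (num_instances,)
    weights = softmax(scores)
    return weights @ instance_reprs                   # (hidden_dim,)

def soft_target(one_hot, class_similarity, alpha=0.9):
    # Mix the hard label with a distribution over classes related to the gold
    # class, so the training objective also encodes class interactions.
    related = softmax(class_similarity @ one_hot)     # similarity column of the gold class
    return alpha * one_hot + (1.0 - alpha) * related

def cross_entropy(pred_probs, target_probs, eps=1e-12):
    return -np.sum(target_probs * np.log(pred_probs + eps))

# Toy bag of 5 instance representations for one entity pair (random stand-ins).
rng = np.random.default_rng(0)
num_instances, hidden_dim, num_classes = 5, 8, 4
instances = rng.normal(size=(num_instances, hidden_dim))
query = rng.normal(size=hidden_dim)                   # attention query for the target relation
classifier = rng.normal(size=(num_classes, hidden_dim))
class_sim = rng.normal(size=(num_classes, num_classes))  # stand-in for learned class similarities

bag_repr = bag_attention(instances, query)
pred = softmax(classifier @ bag_repr)
target = soft_target(np.eye(num_classes)[2], class_sim)
print("soft-target cross-entropy:", round(cross_entropy(pred, target), 4))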

Keywords

Distant supervision · Hierarchical attention · Soft target

Notes

Acknowledgments

This work is supported by the Fundamental Research Funds for the Central Universities (2017RC02) and Beijing Natural Science Foundation (4174098).


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Beijing University of Posts and Telecommunications, Beijing, China
