
Combining Character-Level Representation for Relation Classification

  • Conference paper
  • Artificial Neural Networks and Machine Learning – ICANN 2017 (ICANN 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10614)


Abstract

Word representation models have achieved great success in natural language processing tasks such as relation classification. However, word-level models do not always work well on informal text, and the morphemes of misspelled words may carry important short-distance semantic information. We propose a hybrid model that combines the merits of word-level and character-level representations to learn better representations of informal text. Experiments on the SemEval-2010 Task 8 dataset for relation classification show that our model achieves a competitive result.
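The abstract only sketches the approach; the full architecture is in the paper body, which is not reproduced on this page. As a rough illustration of the general technique (not the authors' exact model), the snippet below concatenates a word-level embedding with a character-level word vector produced by a small convolutional encoder over characters. All layer names, dimensions, and the choice of PyTorch are assumptions made for this sketch.

```python
# Minimal sketch of a hybrid word-level + character-level representation.
# NOT the paper's exact architecture; vocabulary sizes, embedding dimensions,
# and the CNN character encoder are illustrative assumptions.
import torch
import torch.nn as nn

class HybridWordCharEmbedding(nn.Module):
    def __init__(self, word_vocab=10000, char_vocab=100,
                 word_dim=100, char_dim=25, char_filters=50, kernel=3):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # 1-D convolution over each word's characters, followed by max-pooling,
        # yields a fixed-size character-level vector per word.
        self.char_cnn = nn.Conv1d(char_dim, char_filters, kernel, padding=1)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_word_len)
        b, s, c = char_ids.shape
        w = self.word_emb(word_ids)                      # (b, s, word_dim)
        ch = self.char_emb(char_ids.view(b * s, c))      # (b*s, c, char_dim)
        ch = self.char_cnn(ch.transpose(1, 2))           # (b*s, filters, c)
        ch = torch.max(ch, dim=2).values.view(b, s, -1)  # (b, s, filters)
        # Concatenate word-level and character-level vectors per token.
        return torch.cat([w, ch], dim=-1)                # (b, s, word_dim+filters)

# Usage with random ids; real ids would come from a tokenized corpus such as
# the SemEval-2010 Task 8 sentences.
words = torch.randint(0, 10000, (2, 12))
chars = torch.randint(0, 100, (2, 12, 15))
print(HybridWordCharEmbedding()(words, chars).shape)  # torch.Size([2, 12, 150])
```

Concatenation is only one way to merge the two views; gating or attention over the word- and character-level vectors are common alternatives in the literature.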



Acknowledgments

This work was supported by the 111 Project of China under Grant No. B08004, the National Natural Science Foundation of China (Grants 61273217, 61300080, 61671078), and the Ph.D. Programs Foundation of the Ministry of Education of China (20130005110004).

Author information

Corresponding author

Correspondence to Dongyun Liang.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Liang, D., Xu, W., Zhao, Y. (2017). Combining Character-Level Representation for Relation Classification. In: Lintas, A., Rovetta, S., Verschure, P., Villa, A. (eds) Artificial Neural Networks and Machine Learning – ICANN 2017. ICANN 2017. Lecture Notes in Computer Science, vol 10614. Springer, Cham. https://doi.org/10.1007/978-3-319-68612-7_45

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-68612-7_45

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-68611-0

  • Online ISBN: 978-3-319-68612-7

  • eBook Packages: Computer Science; Computer Science (R0)
