Named Entity Recognition Based on BiRHN and CRF

  • DongYang Zhao
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11204)

Abstract

Named entity recognition (NER) is one of the fundamental tasks in natural language processing. Using a bidirectional LSTM, Lample et al. achieved state-of-the-art NER results in 2016. In this paper, we propose a new neural network architecture that combines a bidirectional Recurrent Highway Network (BiRHN for short) with a Conditional Random Field (CRF for short). RHNs mitigate the vanishing-gradient problem by extending the LSTM architecture to allow step-to-step transition depths larger than one. Experiments on several datasets show that our model achieves better F1 scores than the model of Lample et al.
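The RHN transition described above can be sketched as follows. This is a minimal, illustrative numpy implementation of one time step of an RHN cell (Zilly et al., 2016) with the coupled carry gate c = 1 - t; the parameter names and shapes are assumptions chosen for clarity, not the authors' actual implementation. The input is injected only at the first micro-step, and the state is then refined through `depth` highway layers within the single time step.

```python
import numpy as np

def rhn_step(x, s, params, depth=3):
    """One time step of a Recurrent Highway Network cell.

    x: input vector, shape (d_in,)
    s: previous hidden state, shape (d_hid,)
    params: dict with input weights W_h, W_t of shape (d_hid, d_in)
            and per-micro-step recurrent weights R_h, R_t of shape
            (depth, d_hid, d_hid) plus biases b_h, b_t (depth, d_hid).
    """
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for layer in range(depth):
        # The external input feeds only the first micro-step;
        # deeper micro-steps operate on the evolving state alone.
        in_h = params["W_h"] @ x if layer == 0 else 0.0
        in_t = params["W_t"] @ x if layer == 0 else 0.0
        h = np.tanh(in_h + params["R_h"][layer] @ s + params["b_h"][layer])
        t = sigmoid(in_t + params["R_t"][layer] @ s + params["b_t"][layer])
        # Highway update with coupled carry gate: s = h*t + s*(1-t).
        s = h * t + s * (1.0 - t)
    return s

# Tiny usage example with random (illustrative) parameters.
rng = np.random.default_rng(0)
d_in, d_hid, depth = 4, 5, 3
params = {
    "W_h": rng.standard_normal((d_hid, d_in)) * 0.1,
    "W_t": rng.standard_normal((d_hid, d_in)) * 0.1,
    "R_h": rng.standard_normal((depth, d_hid, d_hid)) * 0.1,
    "R_t": rng.standard_normal((depth, d_hid, d_hid)) * 0.1,
    "b_h": np.zeros((depth, d_hid)),
    "b_t": np.zeros((depth, d_hid)),
}
s = rhn_step(np.ones(d_in), np.zeros(d_hid), params, depth)
```

A bidirectional variant (BiRHN) runs one such cell forward and another backward over the sentence and concatenates the two states per token before the CRF layer, in the same way a BiLSTM-CRF does.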

Keywords

BiRHN · CRF · NER

References

  1. Britz, D., Goldie, A., Luong, T., Le, Q.: Massive exploration of neural machine translation architectures. arXiv preprint arXiv:1703.03906 (2017)
  2. Chung, J., Gulcehre, C., Cho, K., Bengio, Y.: Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555 (2014)
  3. Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P.: Natural language processing (almost) from scratch. J. Mach. Learn. Res. 12, 2493–2537 (2011)
  4. Greff, K., Srivastava, R.K., Koutnik, J., Steunebrink, B.R., Schmidhuber, J.: LSTM: a search space odyssey. IEEE Trans. Neural Netw. Learn. Syst. 28(10), 2222–2232 (2017)
  5. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
  6. Huang, Z., Xu, W., Yu, K.: Bidirectional LSTM-CRF models for sequence tagging. arXiv preprint arXiv:1508.01991 (2015)
  7. Lafferty, J.D., McCallum, A., Pereira, F.C.N.: Conditional random fields: probabilistic models for segmenting and labeling sequence data. In: Eighteenth International Conference on Machine Learning, pp. 282–289 (2001)
  8. Lample, G., Ballesteros, M., Subramanian, S., Kawakami, K., Dyer, C.: Neural architectures for named entity recognition. arXiv preprint arXiv:1603.01360 (2016)
  9. Ling, W., et al.: Not all contexts are created equal: better word representations with variable attention. In: Conference on Empirical Methods in Natural Language Processing, pp. 1367–1372 (2015)
  10. Srivastava, R.K., Greff, K., Schmidhuber, J.: Training very deep networks. In: Advances in Neural Information Processing Systems, pp. 2377–2385 (2015)
  11. Zilly, J.G., Srivastava, R.K., Koutnik, J., Schmidhuber, J.: Recurrent highway networks. arXiv preprint arXiv:1607.03474 (2016)
  12. Passos, A., Kumar, V., McCallum, A.: Lexicon infused phrase embeddings for named entity resolution. Computer Science (2014)
  13. Huang, Z., Xu, W., Yu, K.: Bidirectional LSTM-CRF models for sequence tagging. arXiv preprint arXiv:1508.01991 (2015)
  14. Luo, G., Huang, X., Lin, C.Y., et al.: Joint entity recognition and disambiguation. In: Conference on Empirical Methods in Natural Language Processing, pp. 879–888 (2016)
  15. Chiu, J.P.C., Nichols, E.: Named entity recognition with bidirectional LSTM-CNNs. Computer Science (2016)
  16. Lample, G., Ballesteros, M., Subramanian, S., et al.: Neural architectures for named entity recognition, pp. 260–270 (2016)
  17. Gillick, D., Brunk, C., Vinyals, O., et al.: Multilingual language processing from bytes. Computer Science (2016)
  18. Santos, C.N.D., Guimarães, V.: Boosting named entity recognition with neural character embeddings. Computer Science (2015)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. National University of Defense Technology, Changsha, China