Attention-Based Hierarchical Recurrent Neural Network for Phenotype Classification

  • Nan Xu
  • Yanyan Shen
  • Yanmin Zhu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11439)


This paper focuses on labeling the phenotypes of patients in the Intensive Care Unit given their records from admission to discharge. Recent work mainly relies on recurrent neural networks to process such temporal data. However, the prevalent practice of using the last hidden state of the network as the sequence representation falls short on long sequences. Moreover, the memorizing strategy inside the recurrent units does not necessarily identify the key health records for each specific class. In this paper, we propose an attention-based hierarchical recurrent neural network (AHRNN) for phenotype classification. Our intuition is to remember all past records through a hierarchical structure and to make predictions based on the information that is crucial from each label's perspective. To the best of our knowledge, this is the first work to apply attention-based hierarchical neural networks to clinical time series prediction. Experimental results show that our model outperforms state-of-the-art methods in accuracy, time efficiency, and model interpretability.


Keywords: Temporal data · Classification · Attention mechanism
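The core idea described in the abstract — pooling a long record sequence hierarchically, with attention weights selecting the records most relevant to a prediction — can be sketched as follows. This is a minimal illustration, not the paper's actual AHRNN: the encodings are random placeholders standing in for RNN hidden states, and the single query vector `q` is an assumed stand-in for the paper's learned, label-specific attention parameters.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden_states, query):
    """Attention-weighted summary of a block of hidden states.

    hidden_states: (T, d) array, one row per record encoding.
    query: (d,) vector scoring each record's relevance.
    Returns the (d,) context vector and the (T,) attention weights.
    """
    scores = hidden_states @ query        # relevance score per record
    weights = softmax(scores)             # normalized attention weights
    return weights @ hidden_states, weights

# Hierarchical pooling: attend within each segment of records,
# then attend over the resulting segment vectors.
rng = np.random.default_rng(0)
T, d, seg = 12, 4, 3                      # 12 records, dim 4, segments of 3
H = rng.normal(size=(T, d))               # placeholder record encodings
q = rng.normal(size=d)                    # hypothetical query vector
seg_vecs = np.stack(
    [attention_pool(H[i:i + seg], q)[0] for i in range(0, T, seg)]
)
stay_vec, top_weights = attention_pool(seg_vecs, q)
```

Because the attention weights at both levels are explicit probability distributions, inspecting `top_weights` (and the per-segment weights) indicates which parts of the stay drove the prediction, which is the source of the interpretability the abstract claims.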



This research is supported in part by NSFC (No. 61772341, 61472254) and STSCM (No. 18511103002). This work is also supported by the Program for Changjiang Young Scholars in University of China, the Program for China Top Young Talents, the Program for Shanghai Top Young Talents, and Shanghai Engineering Research Center of Digital Education Equipment.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Shanghai Engineering Research Center of Digital Education Equipment, Shanghai, China
  2. Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, China
