
RNN-SURV: A Deep Recurrent Model for Survival Analysis

  • Conference paper

Artificial Neural Networks and Machine Learning – ICANN 2018 (ICANN 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11141)

Abstract

Current medical practice is driven by clinical guidelines designed for the “average” patient. Deep learning is enabling medicine to become personalized to the patient at hand. In this paper, we present a new recurrent neural network model for personalized survival analysis, called rnn-surv. Our model exploits censored data to compute both the risk score and the survival function of each patient. At each time step, the network takes as input the features characterizing the patient and the identifier of the time step, creates an embedding, and outputs the value of the survival function at that time step. Finally, the values of the survival function are linearly combined into a single risk score. Thanks to the model structure and to a training procedure designed to exploit two loss functions, our model achieves a better concordance index (C-index) than state-of-the-art approaches.
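The sketch below illustrates, in PyTorch, the data flow just described: embed the patient features together with a time-step identifier, run a recurrent layer over the K time steps, emit one survival estimate per step, and linearly combine the estimates into a single risk score. It is a minimal sketch under assumptions of ours, not the authors' implementation: the class and parameter names (RNNSurvSketch, embed_dim, hidden_dim, ...), the layer sizes, the use of LSTM cells, and the way the time-step index is appended to the features are all illustrative, and the two training losses mentioned above are omitted.

```python
import torch
import torch.nn as nn


class RNNSurvSketch(nn.Module):
    """Illustrative rnn-surv-style network (names and sizes are assumptions)."""

    def __init__(self, n_features: int, n_timesteps: int,
                 embed_dim: int = 32, hidden_dim: int = 32):
        super().__init__()
        self.n_timesteps = n_timesteps
        # Feedforward embedding of (patient features, time-step identifier).
        self.embed = nn.Sequential(
            nn.Linear(n_features + 1, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, embed_dim), nn.ReLU(),
        )
        # Recurrent layer over the K time steps.
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # One survival-probability estimate per time step.
        self.out = nn.Linear(hidden_dim, 1)
        # Learned weights that linearly combine the K estimates into a risk score.
        self.risk = nn.Linear(n_timesteps, 1, bias=False)

    def forward(self, x: torch.Tensor):
        # x: (batch, n_features) static covariates of each patient.
        batch = x.size(0)
        # Normalised time-step identifier, repeated for every patient.
        t = torch.arange(1, self.n_timesteps + 1, dtype=x.dtype, device=x.device)
        t = t.view(1, -1, 1).expand(batch, -1, -1) / self.n_timesteps
        # Same features at every step, concatenated with the step identifier.
        x_rep = x.unsqueeze(1).expand(-1, self.n_timesteps, -1)
        inp = torch.cat([x_rep, t], dim=-1)            # (batch, K, n_features + 1)
        emb = self.embed(inp)                          # (batch, K, embed_dim)
        h, _ = self.rnn(emb)                           # (batch, K, hidden_dim)
        surv = torch.sigmoid(self.out(h)).squeeze(-1)  # (batch, K) survival curve
        risk = self.risk(surv).squeeze(-1)             # (batch,) scalar risk score
        return surv, risk


# Usage: 4 hypothetical patients with 20 features over 10 time intervals.
model = RNNSurvSketch(n_features=20, n_timesteps=10)
surv, risk = model(torch.randn(4, 20))  # surv: (4, 10), risk: (4,)
```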


Notes

  1. https://www.unos.org/data/.
  2. https://vincentarelbundock.github.io/Rdatasets/datasets.html/.
  3. Implementation by the lifelines package.
  4. https://github.com/CamDavidsonPilon/lifelines/.
  5. https://cran.r-project.org/web/packages/randomForestSRC/.
  6. https://github.com/jaredleekatzman/DeepSurv/.
  7. https://github.com/yanlirock/MTLSA/.



Author information

Correspondence to Eleonora Giunchiglia.


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Giunchiglia, E., Nemchenko, A., van der Schaar, M. (2018). RNN-SURV: A Deep Recurrent Model for Survival Analysis. In: Kůrková, V., Manolopoulos, Y., Hammer, B., Iliadis, L., Maglogiannis, I. (eds) Artificial Neural Networks and Machine Learning – ICANN 2018. ICANN 2018. Lecture Notes in Computer Science, vol 11141. Springer, Cham. https://doi.org/10.1007/978-3-030-01424-7_3


  • DOI: https://doi.org/10.1007/978-3-030-01424-7_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01423-0

  • Online ISBN: 978-3-030-01424-7

  • eBook Packages: Computer Science, Computer Science (R0)
