Adaptive Skip-Train Structured Regression for Temporal Networks

  • Martin Pavlovski
  • Fang Zhou
  • Ivan Stojkovic
  • Ljupco Kocarev
  • Zoran Obradovic
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10535)


A broad range of high-impact applications involve learning a predictive model in a temporal network environment. In weather forecasting, treatment-effectiveness prediction, healthcare outcome prediction, and many other domains, networks are often large while the intervals between consecutive timesteps are brief. Models are therefore required to forecast in a more scalable and efficient way without compromising accuracy. The Gaussian Conditional Random Field (GCRF) is a widely used graphical model for performing structured regression on networks. However, GCRF does not scale to large networks, and because it considers the entire network during learning it cannot capture distinct network substructures (communities). In this study, we present a novel model, Adaptive Skip-Train Structured Ensemble (AST-SE), a sampling-based structured regression ensemble for prediction on temporal networks. AST-SE uses an ensemble scheme in which multiple GCRFs each learn from a sampled subnetwork, and it can automatically skip the entire training process, or some of its phases, at a given timestep. The prediction accuracy and efficiency of AST-SE were assessed and compared against alternatives on synthetic temporal networks and the H3N2 Virus Influenza network. The results provide evidence that (1) AST-SE is ~140 times faster than GCRF because it skips retraining quite frequently; (2) it captures the original network structure more accurately than GCRF while operating solely on partial views of the network; and (3) it outperforms both unweighted and weighted GCRF ensembles, which also operate on subnetworks but require retraining at every timestep. Code and data related to this chapter are available at:
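The skip-train idea described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm: the GCRF parameterization (precision matrix built from a graph Laplacian), the grid-search fitting, the `SkipTrainEnsemble` class, and its `tol` skip threshold are all simplifying assumptions introduced here. Each ensemble member holds a sampled subnetwork and skips retraining at a new timestep whenever its existing parameters still fit the incoming data.

```python
import numpy as np

def gcrf_predict(R, S, alpha, beta):
    """Mean of a Gaussian CRF with precision Q = 2*(alpha*I + beta*L),
    where L is the Laplacian of the similarity matrix S, and b = 2*alpha*R
    (R holds unstructured predictor outputs for each node)."""
    n = len(R)
    L = np.diag(S.sum(axis=1)) - S            # graph Laplacian
    Q = 2.0 * (alpha * np.eye(n) + beta * L)
    return np.linalg.solve(Q, 2.0 * alpha * R)

def fit_gcrf(R, S, y, grid=(0.1, 1.0, 10.0)):
    """Toy stand-in for GCRF learning: grid search over (alpha, beta)."""
    best, best_err = None, np.inf
    for a in grid:
        for b in grid:
            err = np.mean((gcrf_predict(R, S, a, b) - y) ** 2)
            if err < best_err:
                best, best_err = (a, b), err
    return best

class SkipTrainEnsemble:
    """Ensemble of GCRFs, each on a sampled subnetwork; a member's
    retraining at a new timestep is skipped while its error stays
    under `tol` (a hypothetical skip criterion)."""
    def __init__(self, n_members=5, sample_frac=0.6, tol=0.05, seed=0):
        self.n_members, self.sample_frac, self.tol = n_members, sample_frac, tol
        self.rng = np.random.default_rng(seed)
        self.members = []                     # list of (node_indices, (alpha, beta))

    def _sample(self, n):
        k = max(2, int(self.sample_frac * n))
        return np.sort(self.rng.choice(n, size=k, replace=False))

    def update(self, R, S, y):
        """Process one timestep; returns how many members were retrained."""
        n = len(R)
        if not self.members:
            self.members = [(self._sample(n), None) for _ in range(self.n_members)]
        retrained, new_members = 0, []
        for idx, params in self.members:
            Rs, Ss, ys = R[idx], S[np.ix_(idx, idx)], y[idx]
            if params is not None:
                err = np.mean((gcrf_predict(Rs, Ss, *params) - ys) ** 2)
                if err <= self.tol:           # model still fits: skip retraining
                    new_members.append((idx, params))
                    continue
            params = fit_gcrf(Rs, Ss, ys)     # (re)train this member
            retrained += 1
            new_members.append((idx, params))
        self.members = new_members
        return retrained

    def predict(self, R, S):
        """Average member predictions over the nodes each member covers."""
        n = len(R)
        pred, counts = np.zeros(n), np.zeros(n)
        for idx, params in self.members:
            pred[idx] += gcrf_predict(R[idx], S[np.ix_(idx, idx)], *params)
            counts[idx] += 1
        counts[counts == 0] = 1               # never-sampled nodes keep 0
        return pred / counts
```

On a network whose state barely changes between timesteps, the second call to `update` skips every member, which is the source of the speedup the abstract reports.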



This research was supported in part by DARPA grant No. FA9550-12-1-0406 negotiated by AFOSR, the National Science Foundation grants NSF-SES-1447670, NSF-IIS-1636772, Temple University Data Science Targeted Funding Program, NSF grant CNS-1625061, Pennsylvania Department of Health CURE grant and ONR/ONR Global (grant No. N62909-16-1-2222).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Martin Pavlovski (1, 2)
  • Fang Zhou (1)
  • Ivan Stojkovic (1, 3)
  • Ljupco Kocarev (2)
  • Zoran Obradovic (1)
  1. Temple University, Philadelphia, USA
  2. Macedonian Academy of Sciences and Arts, Skopje, Republic of Macedonia
  3. University of Belgrade, Belgrade, Serbia
