Cluster Computing, Volume 10, Issue 4, pp 425–442

Learning-aided predictor integration for system performance prediction


Abstract

The integration of multiple predictors promises higher prediction accuracy than can be obtained with any single predictor. The challenge is how to select the best predictor at any given moment. Traditionally, multiple predictors are run in parallel and the one that generates the best result is selected for prediction. In this paper, we propose a novel approach to predictor integration based on learning from historical predictions. Unlike the traditional approach, it does not require running all the predictors simultaneously. Instead, it uses classification algorithms such as k-Nearest Neighbor (k-NN) and Bayesian classification, together with dimensionality reduction techniques such as Principal Component Analysis (PCA), to forecast the best predictor for the workload under study from its history of predictions; only the forecasted best predictor is then run. Our experimental results show that this approach achieved 20.18% higher best-predictor forecasting accuracy than the cumulative-MSE-based predictor selection approach used in the popular Network Weather Service system. In addition, it outperformed the observed most accurate single predictor in the pool for 44.23% of the performance traces.
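To make the workflow concrete, the following is a minimal sketch, not the authors' implementation, using hypothetical synthetic data: each row of the training set is a window of a historical performance trace, each label is the index of the predictor that achieved the lowest error on that window, PCA reduces the feature dimensionality, and a k-NN classifier forecasts which predictor to run for the next window.

```python
# Minimal sketch of learning-aided predictor selection.
# Assumptions: synthetic trace windows and scikit-learn's PCA / k-NN;
# this is an illustration, not the paper's code.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical training data: each row is a window of a historical
# performance trace (e.g., CPU load samples); each label is the index of
# the predictor (0 .. n_predictors-1) with the lowest MSE on that window.
n_windows, window_len, n_predictors = 500, 32, 4
X_train = rng.normal(size=(n_windows, window_len))
y_train = rng.integers(0, n_predictors, size=n_windows)

# Reduce feature dimensionality with PCA, then learn the mapping from
# window characteristics to the best predictor with k-NN.
pca = PCA(n_components=8)
X_train_reduced = pca.fit_transform(X_train)
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train_reduced, y_train)

# At prediction time, classify the current window and run only the
# forecasted best predictor instead of running all predictors in parallel.
current_window = rng.normal(size=(1, window_len))
best_predictor = int(knn.predict(pca.transform(current_window))[0])
print(f"Run predictor #{best_predictor} for the next forecasting step")
```

In this sketch the Bayesian classifier mentioned in the abstract could be swapped in for k-NN (e.g., a naive Bayes classifier) without changing the overall selection workflow.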

Keywords

System performance · Virtual machine · Virtual machine monitor · k-Nearest Neighbor (k-NN) · Bayesian classification · Principal component analysis (PCA) · Time-series prediction


Copyright information

© Springer Science+Business Media, LLC 2007

Authors and Affiliations

Advanced Computing and Information Systems (ACIS) Laboratory, Department of Electrical and Computer Engineering, University of Florida, Gainesville, USA
