
A Monitoring Infrastructure for the Quality Assessment of Cloud Services

  • Priscila Cedillo (corresponding author)
  • Javier Gonzalez-Huerta
  • Silvia Abrahao
  • Emilio Insfran
Conference paper
Part of the Lecture Notes in Information Systems and Organisation book series (LNISO, volume 17)

Abstract

Service Level Agreements (SLAs) specify the terms under which cloud services must be provided, and assessing the quality of the services being delivered is critical for both clients and service providers. Stakeholders must therefore be able to monitor services delivered as Software as a Service (SaaS) at runtime and to report any non-compliance with the SLA in a comprehensive and flexible manner. In this paper, we present an SLA compliance monitoring infrastructure based on models@run.time, describing its main components and artifacts and the interactions among them. We place particular emphasis on the configuration of the artifacts that enable the monitoring, and we present a prototype that performs it. The feasibility of our proposal is illustrated by means of a case study showing the use of the infrastructure's components and artifacts and the configuration of a specific plan for monitoring services deployed on the Microsoft Azure platform.
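The abstract describes the monitoring idea at a high level: quality attributes measured on a running SaaS are compared against the thresholds agreed in the SLA, and violations are reported. The full text is not reproduced on this page, so the following Python sketch is purely illustrative of that general check; the class names, metric names, and threshold values are hypothetical assumptions, not the authors' infrastructure.

    # Illustrative sketch only: a minimal runtime check of measured service
    # metrics against SLA thresholds. All names and values are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class SlaTerm:
        metric: str               # e.g. "response_time_ms" (hypothetical)
        threshold: float          # limit agreed in the SLA
        lower_is_better: bool = True

    def check_compliance(measurements: dict, terms: list) -> list:
        """Return one report line per violated SLA term."""
        violations = []
        for term in terms:
            value = measurements.get(term.metric)
            if value is None:
                continue  # metric not collected in this monitoring cycle
            ok = (value <= term.threshold if term.lower_is_better
                  else value >= term.threshold)
            if not ok:
                violations.append(f"{term.metric}: measured {value}, "
                                  f"SLA threshold {term.threshold}")
        return violations

    # One hypothetical monitoring cycle
    terms = [SlaTerm("response_time_ms", 200.0),
             SlaTerm("availability_pct", 99.9, lower_is_better=False)]
    for line in check_compliance({"response_time_ms": 250.0,
                                  "availability_pct": 99.95}, terms):
        print("SLA violation:", line)

In the paper's approach, the monitored metrics and thresholds would be derived from models kept alive at runtime (models@run.time) rather than hard-coded as in this sketch.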

Keywords

Model-driven engineering · Models@run.time · Quality assessment · Cloud services · Service level agreements · Software as a service

Acknowledgments

This research has been supported by the Value@Cloud project (TIN2013-46300-R), the Senescyt-Ecuador Scholarship Program, NSERC (the Natural Sciences and Engineering Research Council of Canada), and the Microsoft Azure for Research Award Program.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Priscila Cedillo (1, 2) (corresponding author)
  • Javier Gonzalez-Huerta (3)
  • Silvia Abrahao (1)
  • Emilio Insfran (1)

  1. Department of Computer Systems and Computation, Universitat Politècnica de València, Valencia, Spain
  2. Department of Computer Science, Faculty of Engineering, University of Cuenca, Cuenca, Ecuador
  3. Département d'Informatique, Université du Québec à Montréal, Montreal, Canada
