
Experimenting with Application-Based Benchmarks on Different Cloud Providers via a Multi-cloud Execution and Modeling Framework

  • Athanasia Evangelinou
  • Nunzio Andrea Galante
  • George Kousiouris
  • Gabriele Giammatteo
  • Elton Kevani
  • Christoforos Stampoltas
  • Andreas Menychtas
  • Aliki Kopaneli
  • Kanchanna Ramasamy Balraj
  • Dimosthenis Kyriazis
  • Theodora Varvarigou
  • Peter Stuer
  • Leire Orue-Echevarria Arrieta
  • Gorka Mikel Echevarria Velez
  • Alexander Bergmayr
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 512)

Abstract

Cloud services are emerging as an innovative IT provisioning model that offers benefits over the traditional approach of provisioning infrastructure. However, multi-tenancy, virtualization and resource sharing make it difficult to estimate application performance at design or deployment time. Cloud benchmarks are therefore required to assess the performance of cloud services and to compare cloud offerings. This paper presents a mechanism and a benchmarking process for measuring the performance of various cloud service delivery models, and for describing this information in a machine-understandable format. The suggested framework orchestrates benchmark execution and supports multiple cloud providers. Benchmarking results are demonstrated for three large commercial cloud providers, Amazon EC2, Microsoft Azure and Flexiant, in order to assist cloud users with provisioning decisions. Furthermore, we present approaches for measuring service performance using specialized metrics that rank services according to a weighted combination of cost, performance and workload.
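
To make the ranking step concrete, the following sketch (Python) shows one way such a weighted score could be computed. It is a minimal illustration assuming simple min-max normalization and a fixed linear weighting; all names, weights and input figures are our own placeholders, not the paper's actual metrics or measured results.

    # Minimal sketch of the weighted ranking idea from the abstract: each
    # candidate cloud service receives a score that linearly combines
    # normalized cost, performance and workload-fit terms. All names,
    # weights and figures are illustrative placeholders, not the paper's
    # actual metrics or measurements.

    from dataclasses import dataclass

    @dataclass
    class ServiceMeasurement:
        name: str            # service/offering identifier
        cost: float          # monetary cost, e.g. per hour (lower is better)
        performance: float   # benchmark score (higher is better)
        workload_fit: float  # match between offering and workload, in [0, 1]

    def normalize(values):
        # Min-max normalization to [0, 1]; constant lists map to 1.0.
        lo, hi = min(values), max(values)
        if hi == lo:
            return [1.0] * len(values)
        return [(v - lo) / (hi - lo) for v in values]

    def rank_services(services, w_cost=0.3, w_perf=0.5, w_fit=0.2):
        # Cost is inverted so that cheaper offerings raise the score.
        costs = normalize([s.cost for s in services])
        perfs = normalize([s.performance for s in services])
        fits = normalize([s.workload_fit for s in services])
        scored = [(w_cost * (1 - c) + w_perf * p + w_fit * f, s)
                  for c, p, f, s in zip(costs, perfs, fits, services)]
        return sorted(scored, key=lambda pair: pair[0], reverse=True)

    # Placeholder inputs for illustration only:
    candidates = [
        ServiceMeasurement("provider-a-small", 0.12, 780.0, 0.9),
        ServiceMeasurement("provider-b-small", 0.10, 650.0, 0.8),
        ServiceMeasurement("provider-c-small", 0.08, 540.0, 0.7),
    ]
    for score, svc in rank_services(candidates):
        print(f"{svc.name}: {score:.3f}")

The weights would in practice be chosen per user or per workload; the point of the linear form is that changing the cost/performance trade-off only requires adjusting three numbers.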

Keywords

Benchmarking · Cloud services · Multi-cloud · Performance benchmarking

Notes

Acknowledgements

The research leading to these results is partially supported by the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 317859, in the context of the ARTIST Project.

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Athanasia Evangelinou (1)
  • Nunzio Andrea Galante (2)
  • George Kousiouris (1)
  • Gabriele Giammatteo (2)
  • Elton Kevani (1)
  • Christoforos Stampoltas (1)
  • Andreas Menychtas (1)
  • Aliki Kopaneli (1)
  • Kanchanna Ramasamy Balraj (2)
  • Dimosthenis Kyriazis (1)
  • Theodora Varvarigou (1)
  • Peter Stuer (3)
  • Leire Orue-Echevarria Arrieta (4)
  • Gorka Mikel Echevarria Velez (4)
  • Alexander Bergmayr (5)

  1. Department of Electrical and Computer Engineering, NTUA, Athens, Greece
  2. Research and Development Laboratory, Engineering Ingegneria Informatica S.p.A., Rome, Italy
  3. Spikes Research Department, Spikes, Antwerp, Belgium
  4. ICT-European Software Institute Division, TECNALIA, Zamudio, Spain
  5. Business Informatics Group, TU Vienna, Vienna, Austria
