A Benchmark Model for the Creation of Compute Instance Performance Footprints

  • Markus Ullrich
  • Jörg Lässig
  • Jingtao Sun
  • Martin Gaedke
  • Kento Aida
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11226)


Cloud benchmarking has become a prominent topic in cloud computing research. The idea of attaching performance footprints to compute resources in order to select an appropriate setup for any application is very appealing. Especially in scientific cloud computing, considerable resources can be saved by choosing exactly the right setup instead of needlessly over-provisioned instances. In this paper, we briefly review existing efforts in this area and explain the need for a generic benchmark model that combines the results of previous work to reduce the benchmarking effort for new resources and applications. We propose such a model, which is built on our previously presented resource and application model, and highlight its advantages. We show how the model can be used to store benchmarking data and how that data is linked to applications and resources. We also explain how the data, in combination with an infrastructure-as-code tool, can be utilized to automatically deploy and execute any application and any micro benchmark in the cloud with little manual effort. Finally, we present some of the observations we made while benchmarking compute instances at two major cloud providers.
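The abstract's core idea, storing benchmark results as performance footprints linked to concrete compute resources so that a suitable instance can be selected for an application, can be illustrated with a minimal sketch. The class and field names below (`Resource`, `BenchmarkResult`, `PerformanceFootprint`, `score`) are hypothetical and not taken from the paper's model; they only demonstrate the general linkage between resources and their measured benchmark data.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    """A compute instance type offered by a cloud provider."""
    provider: str
    instance_type: str
    vcpus: int
    memory_gb: float

@dataclass
class BenchmarkResult:
    """One micro-benchmark measurement taken on a resource."""
    benchmark: str  # e.g. a CPU micro benchmark
    metric: str     # e.g. "events/s"
    value: float

@dataclass
class PerformanceFootprint:
    """Links benchmark results to the resource they were measured on,
    so applications can be matched to an appropriate setup."""
    resource: Resource
    results: list = field(default_factory=list)

    def score(self, metric: str) -> float:
        """Average of all recorded values for a given metric."""
        vals = [r.value for r in self.results if r.metric == metric]
        return sum(vals) / len(vals) if vals else 0.0

# Example: two repeated CPU measurements on a (hypothetical) instance type.
fp = PerformanceFootprint(
    Resource("provider-a", "general.large", 2, 8.0),
    [BenchmarkResult("cpu-bench", "events/s", 1250.0),
     BenchmarkResult("cpu-bench", "events/s", 1190.0)],
)
print(fp.score("events/s"))  # 1220.0
```

In practice, such footprints would be produced automatically: an infrastructure-as-code tool provisions the instance, runs the micro benchmarks, and writes the results back into the model, which is the automation workflow the abstract describes.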


Keywords: Cloud computing · Performance footprints · Cloud benchmarking · Compute instances



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Markus Ullrich (1)
  • Jörg Lässig (1)
  • Jingtao Sun (2)
  • Martin Gaedke (3)
  • Kento Aida (2)
  1. University of Applied Sciences Zittau/Görlitz, Görlitz, Germany
  2. National Institute of Informatics, Chiyoda-ku, Japan
  3. Technische Universität Chemnitz, Chemnitz, Germany
