Metrics and Benchmarks for Self-aware Computing Systems

  • Nikolas Herbst (corresponding author)
  • Steffen Becker
  • Samuel Kounev
  • Heiko Koziolek
  • Martina Maggio
  • Aleksandar Milenkoski
  • Evgenia Smirni


In this chapter, we propose a set of metrics, grouped by the MAPE-K paradigm, for quantifying properties of self-aware computing systems. This set of metrics can be seen as a starting point toward benchmarking and comparing self-aware computing systems on a level playing field. We discuss state-of-the-art approaches in the related fields of self-adaptation and self-protection to identify commonalities in metrics for self-aware computing. We illustrate the need for benchmarking self-aware computing systems with the help of an approach that uncovers real-time characteristics of operating systems. The insights gained from this approach can be seen as a way of enhancing self-awareness through an ongoing measurement methodology. At the end of the chapter, we address new challenges in defining reference workloads for benchmarking self-aware computing systems, namely load intensity patterns and burstiness modeling.
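As an illustrative sketch (not taken from the chapter itself), one simple first-order indicator of burstiness in a request arrival trace is the squared coefficient of variation (SCV) of the interarrival times: an SCV near 1 is consistent with Poisson-like arrivals, while an SCV well above 1 suggests a bursty process. The synthetic workloads below (exponential vs. a hyperexponential mixture) are hypothetical examples chosen only to make the contrast visible:

```python
import random

def scv(interarrivals):
    """Squared coefficient of variation of interarrival times.
    SCV ~ 1 for Poisson-like arrivals; SCV >> 1 indicates burstiness."""
    n = len(interarrivals)
    mean = sum(interarrivals) / n
    var = sum((x - mean) ** 2 for x in interarrivals) / n
    return var / (mean ** 2)

random.seed(42)

# Poisson-like arrivals: exponential interarrival times, SCV close to 1.
poisson_like = [random.expovariate(1.0) for _ in range(10000)]

# Bursty arrivals: hyperexponential mixture (many short gaps interleaved
# with a few very long ones), SCV much larger than 1.
bursty = [random.expovariate(10.0) if random.random() < 0.9
          else random.expovariate(0.1)
          for _ in range(10000)]

print(f"SCV (Poisson-like): {scv(poisson_like):.2f}")
print(f"SCV (bursty):       {scv(bursty):.2f}")
```

Note that the SCV ignores temporal correlation; capturing correlated burst periods requires richer descriptors such as the index of dispersion or Markovian arrival processes, which the chapter's discussion of burstiness modeling points toward.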







Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Nikolas Herbst (1), corresponding author
  • Steffen Becker (2)
  • Samuel Kounev (1)
  • Heiko Koziolek (3)
  • Martina Maggio (4)
  • Aleksandar Milenkoski (1)
  • Evgenia Smirni (5)

  1. University of Würzburg, Würzburg, Germany
  2. Technical University Chemnitz, Chemnitz, Germany
  3. ABB Ladenburg, Ladenburg, Germany
  4. Lunds Universitet, Lund, Sweden
  5. College of William and Mary, Williamsburg, USA
