Performance Analysis Strategies for Software Variants and Versions

  • Thomas Thüm
  • André van Hoorn
  • Sven Apel
  • Johannes Bürdek
  • Sinem Getir
  • Robert Heinrich
  • Reiner Jung
  • Matthias Kowal
  • Malte Lochau
  • Ina Schaefer
  • Jürgen Walter
Open Access Chapter

Abstract

This chapter is devoted to the performance analysis of configurable and evolving software. Both configurability and evolution imply a high degree of software variation, that is, a large space of software variants and versions, which challenges state-of-the-art analysis techniques. We give an overview of strategies to cope with software variation, most of which focus either on configuration (variants) or on evolution (versions). Interestingly, we identified several directions in which research on variants and versions can profit from one another.
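The variant space the abstract refers to grows exponentially: a system with n independent Boolean features has 2^n variants, and every evolutionary version multiplies the analysis effort again. A minimal sketch (the feature names are hypothetical, and real product lines add constraints that prune this space):

```python
from itertools import product

def enumerate_variants(features):
    """Enumerate all variants of a configurable system with
    independent Boolean features. With n features this yields
    2**n configurations, which is why brute-force per-variant
    performance analysis does not scale."""
    return [dict(zip(features, choice))
            for choice in product([False, True], repeat=len(features))]

features = ["encryption", "compression", "logging", "caching"]
variants = enumerate_variants(features)
print(len(variants))  # 2**4 = 16 variants to analyse, per version
```

Sampling strategies and family-based analyses, as surveyed in this chapter, aim to avoid enumerating this space exhaustively.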

Copyright information

© The Author(s) 2019

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  • Thomas Thüm (1)
  • André van Hoorn (2)
  • Sven Apel (3)
  • Johannes Bürdek (4)
  • Sinem Getir (5)
  • Robert Heinrich (6)
  • Reiner Jung (7)
  • Matthias Kowal (1)
  • Malte Lochau (4)
  • Ina Schaefer (1)
  • Jürgen Walter (8)

  1. Institute for Software Engineering and Automotive Informatics, TU Braunschweig, Brunswick, Germany
  2. Institute of Software Technology, University of Stuttgart, Stuttgart, Germany
  3. Chair of Software Engineering I, Department of Informatics and Mathematics, University of Passau, Passau, Germany
  4. Fachbereich Elektrotechnik und Informationstechnik, Fachgebiet Echtzeitsysteme, Technische Universität Darmstadt, Darmstadt, Germany
  5. Institut für Informatik, Johann-von-Neumann-Haus, Humboldt-Universität zu Berlin, Berlin, Germany
  6. Institute for Program Structures and Data Organization, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
  7. Software Engineering Group, Department of Computer Science, Kiel University, Kiel, Germany
  8. Chair of Computer Science II, Universität Würzburg, Würzburg, Germany