Quality of Measurement Programs

  • Miroslaw Staron
  • Wilhelm Meding


Controlling the development of large and complex software is usually done quantitatively, with measurement as the foundation for decision making. Large projects typically collect large amounts of measures, yet present only a few key ones for daily project, product, and organization monitoring. The process of collecting, analyzing, and presenting this key information is usually supported by automated measurement systems. Since this process condenses a large amount of information (data) into a small number of indicators (measures with associated decision criteria), the question that usually arises in discussions with managers is whether stakeholders can trust the indicators with respect to the completeness, correctness, and timeliness of the information; in other words, what is the quality of the measurement program? In this chapter, we present what characterizes high-quality measurement programs, namely completeness, correctness, and information quality. We base this on our previous work in the area and describe how to calculate the completeness of a measurement program based on the product and process structure. We then describe a concept that is essential for trust in measurements: information quality. Finally, we present a method for assessing the breadth of measurement programs.
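The abstract defines an indicator as a measure with associated decision criteria, in the spirit of ISO/IEC 15939. As a minimal sketch of that idea, the function below maps an underlying measure to a traffic-light indicator; the threshold values and the open-defects measure are illustrative assumptions, not taken from the chapter.

```python
# Hypothetical sketch: an indicator = a measure plus decision criteria.
# Thresholds and the example measure are assumptions for illustration.

def indicator_status(value, red_threshold, yellow_threshold):
    """Apply decision criteria to a measure, yielding an indicator color."""
    if value >= red_threshold:
        return "red"      # criterion violated: act immediately
    if value >= yellow_threshold:
        return "yellow"   # approaching the limit: monitor closely
    return "green"        # within acceptable range

# Example: number of open defects as the underlying measure.
open_defects = 12
print(indicator_status(open_defects, red_threshold=20, yellow_threshold=10))
```

The point of the sketch is the separation of concerns the chapter relies on: the measurement system automates the data collection, while the decision criteria encode the stakeholder's interpretation, so trust in the indicator requires trust in both.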





Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Miroslaw Staron
    • 1
  • Wilhelm Meding
    • 2
  1. Department of Computer Science and Engineering, University of Gothenburg, Gothenburg, Sweden
  2. Ericsson AB, Gothenburg, Sweden
