Release Readiness Indicator for Mature Agile and Lean Software Development Projects

  • Miroslaw Staron
  • Wilhelm Meding
  • Klas Palm
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 111)


Large companies such as Ericsson increasingly adopt the principles of Agile and Lean software development and build large software products in an iterative manner in order to respond quickly to customer needs. In this paper we present the main indicator that a mature software development organization needs in order to predict the time, in weeks, until a product can be released. In our research project we collaborated closely with a large Agile and Lean software development project at Ericsson in Sweden. This large, mature development project and organization found this main indicator, release readiness, to be so important that it was adopted as a key performance indicator and is used to control the development of the product and to improve organizational performance. The indicator was developed and validated in an action research project at one of the units of Ericsson AB in Sweden, in one of its largest projects.


Keywords: Software Development · International Standard Organization · Action Research Project · Large Software · International Electrotechnical Commission



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Miroslaw Staron¹
  • Wilhelm Meding²
  • Klas Palm²
  1. Software Centre, Computer Science and Engineering, Chalmers / University of Gothenburg, Gothenburg, Sweden
  2. Ericsson Metrics Team, Ericsson Product Development, Ericsson AB, Sweden
