Simple Indicators for Tracking Software Process Improvement Progress

  • Anna Börjesson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4257)


We know from the software process improvement (SPI) literature that new technologies are often acquired but not deployed. Fichman and Kemerer call this phenomenon the assimilation gap. Important prerequisites for SPI success are SPI implementation success and SPI initiative progress. This study presents four simple and practical indicators that help SPI initiatives stay focused on deployment and facilitate initiative progress. The indicators are easy to gather, manage, and evaluate, and they provide an organization with useful information for determining the progress of an SPI initiative. They focus on competence build-up, employee capabilities, process adoption, and management commitment. The results show that simple and practical indicators can be used to track and follow up the progress of SPI initiatives, keeping them focused on deployment and decreasing the assimilation gap.


Keywords: Development Unit · Process Adoption · Simple Indicator · Information System Research · Management Commitment




  1. Abrahamsson, P.: Measuring the Success of Software Process Improvement: The Dimensions. In: EuroSPI 2000, Copenhagen, Denmark (2000a)
  2. Abrahamsson, P.: Is Management Commitment a Necessity After All in Software Process Improvement? In: Euromicro 2000, Maastricht, The Netherlands, pp. 246–253. IEEE Computer Society, Los Alamitos (2000b)
  3. Abrahamsson, P.: Rethinking the Concept of Commitment in Software Process Improvement. Scandinavian Journal of Information Systems 13, 69–98 (2001)
  4. Albrecht, A.J.: Measuring Application Development Productivity. In: Proceedings of the IBM Application Development Symposium, Monterey, California, pp. 83–92 (October 1979)
  5. Albrecht, A.J., Gaffney Jr., J.E.: Software Function, Source Lines of Code, and Development Effort Prediction: A Software Science Validation. IEEE Transactions on Software Engineering SE-9(6), 639–648 (1983)
  6. Argyris, C., Schön, D.: Organizational Learning. Addison-Wesley, Reading (1978)
  7. Attewell, P.: Technology Diffusion and Organizational Learning: The Case of Business Computing. Organization Science 3(1), 1–19 (1992)
  8. Bach, J.: Enough About Process: What We Need are Heroes. IEEE Software 12(2), 96–98 (1995)
  9. Bartunek, J.M., Louis, M.R.: Insider/Outsider Team Research. Qualitative Research Methods, vol. 40. Sage Publications, Thousand Oaks (1996)
  10. Basili, V.R., Caldiera, G., Rombach, H.D.: Goal Question Metric Approach. In: Encyclopedia of Software Engineering, pp. 528–532. John Wiley & Sons, Chichester (1994)
  11. Baskerville, R., Pries-Heje, J.: Grounded action research: a method for understanding IT in practice. Accounting, Management and Information Technologies 9, 1–23 (1999)
  12. Baskerville, R., Wood-Harper, A.T.: A critical perspective on action research as a method for information systems research. Journal of Information Technology 11, 235–246 (1996)
  13. Bollinger, T.B., McGowan, C.: A Critical Look at Software Capability Evaluations. IEEE Software 8(4), 25–41 (1991)
  14. Brown, M., Goldenson, D.: Measurement and Analysis: What Can and Does Go Wrong? In: The 10th International Symposium on Software Metrics (September 14, 2004)
  15. Börjesson, A., Mathiassen, L.: Successful Process Implementation. IEEE Software 21(4), 36–44 (2004)
  16. Börjesson, A., Martinsson, F., Timmerås, M.: Using Agile Improvement Practices in Software Organizations. European Journal of Information Systems (2005) (accepted)
  17. Davison, R.M., Martinsons, M.G., Kock, N.: Principles of canonical action research. Information Systems Journal 14, 65–86 (2004)
  18. Dove, R.: Response Ability: The Language, Structure, and Culture of the Agile Enterprise. Wiley, New York (2001)
  19. Fichman, R.G., Kemerer, C.F.: The Illusory Diffusion of Innovation: An Examination of Assimilation Gaps. Information Systems Research 10(3), 255–275 (1999)
  20. Fayad, M.E., Laitinen, M.: Process Assessment Considered Wasteful. Communications of the ACM 40(11), 125–128 (1997)
  21. Flaherty, M.J.: Programming Process Productivity Measurement System for System/370. IBM Systems Journal 24(2) (1985)
  22. Galliers, R.D.: Choosing an Information Systems Research Approach. In: Galliers, R.D. (ed.) Information Systems Research: Issues, Methods, and Practical Guidelines, pp. 144–162. Blackwell Scientific Publications, Oxford (1992)
  23. Grady, R.B.: Practical Software Metrics for Project Management and Process Improvement. Prentice Hall, Upper Saddle River (1992)
  24. Grady, R.B.: Successful Software Process Improvement. Prentice Hall, Upper Saddle River (1997)
  25. Goethert, W., Siviy, J.: Applications of the Indicator Template for Measurement and Analysis. Technical Note CMU/SEI-2004-TN-024 (2004)
  26. Jones, C.: Sources of Errors in Software Cost Estimating, version 1.0. Software Productivity Research, Burlington, MA (November 24, 1993)
  27. Jones, C.: Assessment and Control of Software Risks. Prentice Hall, Englewood Cliffs (1994)
  28. Haeckel, S.H.: Adaptive Enterprise: Creating and Leading Sense-and-Respond Organizations. Harvard Business School Press, Boston (1999)
  29. Humphrey, W.S.: The IBM Large-System Software Development Process: Objectives and Directions. IBM Systems Journal 24(2) (1985)
  30. Humphrey, W.S.: Managing the Software Process. Addison-Wesley, Reading (1989)
  31. McFeeley, B.: IDEAL: A User's Guide for Software Process Improvement. Handbook CMU/SEI-96-HB-001, Software Engineering Institute, Carnegie Mellon University, Pittsburgh (1996)
  32. Paulk, M.C., Weber, C.V., Curtis, B., Chrissis, M.B.: The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley, Reading (1995)
  33. SEMA: Process Maturity Profile of the Software Community. Software Engineering Institute, Carnegie Mellon University (2002)
  34. Weinberg, G.M.: Quality Software Management, vol. II: First-Order Measurement. Dorset House Publishing, New York (1993)
  35. Yin, R.: Case Study Research. Sage Publications, Newbury Park (1994)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Anna Börjesson
  1. Ericsson AB and IT University of Gothenburg, Gothenburg, Sweden
