A measurement framework for assessing the maturity of requirements engineering process

Abstract

Because requirements engineering (RE) problems are widely acknowledged as having a major impact on the effectiveness of the software development process, Sommerville et al. have developed a requirements maturity model. However, research has shown that the measurement process within Sommerville’s model is ambiguous, and that implementation of the model leads to confusion. Hence, the objective of our research is to propose a new RE maturity measurement framework (REMMF) based on Sommerville’s model and to provide initial validation of REMMF. The main purpose of proposing REMMF is to allow us to measure the maturity of the RE processes being used within organisations more effectively and to assist practitioners in measuring the maturity of their RE processes. In order to evaluate REMMF, two organisations implemented the measurement framework within their IT divisions, provided us with an assessment of their requirements process and gave feedback on the REMMF measurement process. The results show that our measurement framework is clear, easy to use and provides an entry point through which practitioners can effectively judge the strengths and weaknesses of their RE processes. When an organisation knows where it is, it can plan more effectively for improvement.

References

  1. Alexander, I., & Stevens, R. (2002). Writing better requirements. Addison-Wesley.

  2. Beecham, S., & Hall, T. (2003). Expert panel questionnaire: Validating a requirements process improvement model, http://www.homepages.feis.herts.ac.uk/~pppgroup/requirements_cmm.htm, Site visited May 2003.

  3. Beecham, S., Hall, T., & Rainer, A. (2003a). Building a requirements process improvement model. Department of Computer Science, University of Hertfordshire, Technical report No: 378.

  4. Beecham, S., Hall, T., & Rainer, A. (2003b). Software process problems in twelve software companies: An empirical analysis. Empirical Software Engineering, 8, 7–42.

  5. Boehm, B. W. (1987). Improving software productivity. IEEE Computer, 20(9), 43–57.

  6. Briand, L., Wüst, J., & Lounis, H. (2001). Replicated case studies for investigating quality factors in object-oriented designs. Empirical Software Engineering, 6(1), 11–58.

  7. Chatzoglou, P., & Macaulay, L. (1996). Requirements capture and analysis: A survey of current practice. Requirements Engineering Journal, 1, 75–87.

  8. Cooper, D., & Schindler, P. (2001). Business research methods (7th ed.). McGraw-Hill.

  9. Daskalantonakis, M. K. (1994). Achieving higher SEI levels. IEEE Software, 11(4), 17–24.

  10. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.

  11. Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35, 982–1003.

  12. Diaz, M., & Sligo, J. (1997). How software process improvement helped Motorola. IEEE Software, 14(5), 75–81.

  13. El Emam, K., & Madhavji, N. H. (1995). A field study of requirements engineering practices in information systems development. In Second International Symposium on Requirements Engineering (pp. 68–80).

  14. Gorschek, T., Svahnberg, M., & Tejle, K. (2003). Introduction and application of a lightweight requirements engineering process evaluation method. In Proceedings of Requirements Engineering: Foundation for Software Quality (REFSQ’03) (pp. 83–92). Klagenfurt/Velden, Austria.

  15. Hall, T., Beecham, S., & Rainer, A. (2002). Requirements problems in twelve software companies: An empirical analysis. IEE Proceedings—Software, 149(5), 153–160.

  16. Hoffmann, H., & Lehner, F. (2001). Requirements engineering as a success factor in software projects. IEEE Software, 18(4), 58–66.

  17. Humphrey, W. S. (2002). Three process perspectives: Organizations, teams, and people. Annals of Software Engineering, 14, 39–72.

  18. Jobserve.com. UK Wasting Billions on IT Projects, http://www.jobserve.com/news/NewsStory.asp?e=e&SID=SID2598, 21/4/2004.

  19. Kamsties, E., Hörmann, K., & Schlich, M. (1998). Requirements engineering in small and medium enterprises. Requirements Engineering, 3(2), 84–90.

  20. Kauppinen, M., Aaltio, T., & Kujala, S. (2002). Applying the requirements engineering good practice guide for process improvement. In Proceedings of the Seventh European Conference on Software Quality (QC2002) (pp. 45–55).

  21. MacDonell, S., & Shepperd, M. (2003). Using prior-phase effort records for re-estimation during software projects. In 9th International Symposium on Software Metrics (pp. 73–86). 3–5 Sept., Sydney, Australia.

  22. Kauppinen, M., & Kujala, S. (2001). Assessing requirements engineering processes with the REAIMS model: Lessons learned. In Proceedings of the Eleventh Annual International Symposium of the International Council on Systems Engineering (INCOSE2001).

  23. Neill, C. J., & Laplante, P. A. (2003). Requirements engineering: The state of the practice. IEEE Software, 20(6), 40–45.

  24. Ngwenyama, O., & Nielsen, P. A. (2003). Competing values in software process improvement: An assumption analysis of CMM from an organizational culture perspective. IEEE Transactions on Engineering Management, 50(1), 100–112.

  25. Niazi, M. (2004). A framework for assisting the design of effective software process improvement implementation strategies. Ph.D. thesis, University of Technology Sydney.

  26. Niazi, M. (2005a). An empirical study for the improvement of requirements engineering process. In The 17th International Conference on Software Engineering and Knowledge Engineering (pp. 396–399). July 14 to 16, 2005, Taipei, Taiwan, Republic of China.

  27. Niazi, M. (2005b). An instrument for measuring the maturity of requirements engineering process. In The 6th International Conference on Product Focused Software Process Improvement (pp. 574–585). LNCS, Oulu, Finland, June 13–16.

  28. Niazi, M., Cox, K., & Verner, J. (2005a). An empirical study identifying high perceived value requirements engineering practices. In Fourteenth International Conference on Information Systems Development (ISD’2005). Karlstad University, Sweden August 15–17.

  29. Niazi, M., & Shastry, S. (2003). Role of requirements engineering in software development process: An empirical study. In IEEE International Multi-Topic Conference (INMIC03) (pp. 402–407).

  30. Niazi, M., Wilson, D., & Zowghi, D. (2005b). A framework for assisting the design of effective software process improvement implementation strategies. Journal of Systems and Software, 78(2), 204–222.

  31. Niazi, M., Wilson, D., & Zowghi, D. (2005c). A maturity model for the implementation of software process improvement: An empirical study. Journal of Systems and Software, 74(2), 155–172.

  32. Nikula, U., Sajaniemi, J., & Kälviäinen, H. (2000). Management view on current requirements engineering practices in small and medium enterprises. In Fifth Australian Workshop on Requirements Engineering (pp. 81–89).

  33. Nuseibeh, B., & Easterbrook, S. (2000). Requirements engineering: A roadmap. In 22nd International Conference on Software Engineering (pp. 35–46).

  34. Regnell, B., Runeson, P., & Thelin, T. (2000). Are the perspectives really different? Further experimentation on scenario-based reading of requirements. Empirical Software Engineering, 5(4), 331–356.

  35. Sawyer, P., Sommerville, I., & Viller, S. (1997). Requirements process improvement through the phased introduction of good practice. Software Process—Improvement and Practice, 3, 19–34.

  36. SCAMPI. (2001). Standard CMMI® appraisal method for process improvement (SCAMPI), Version 1.1: Method definition document. SEI, CMU/SEI-2001-HB-001.

  37. Siddiqi, J., & Shekaran, M. C. (1996). Requirements engineering: The emerging wisdom. IEEE Software, 13(2), 15–19.

  38. Sommerville, I. (1996). Software engineering (5th ed.). Addison-Wesley.

  39. Sommerville, I., & Ransom, J. (2005). An empirical study of industrial requirements engineering process assessment and improvement. ACM Transactions on Software Engineering and Methodology, 14(1), 85–117.

  40. Sommerville, I., & Sawyer, P. (1997). Requirements engineering—a good practice guide. Wiley.

  41. Sommerville, I., Sawyer, P., & Viller, S. (1998). Improving the requirements process. In Fourth International Workshop on Requirements Engineering: Foundation of Software Quality (pp. 71–84).

  42. Standish-Group. (1995). Chaos—the state of the software industry. Standish group international technical report, pp. 1–11.

  43. Standish-Group. (1999). Chaos: A recipe for success. Standish Group International.

  44. Standish-Group. (2003). Chaos—the state of the software industry.

  45. Verner, J., Cox, K., Bleistein, S., & Cerpa, N. (2005). Requirements engineering and software project success: An industrial survey in Australia and the US. Australian Journal of Information Systems (to appear Sept 2005).

  46. Verner, J., & Evanco, W. M. (2005). In-house software development: What software project management practices lead to success? IEEE Software, 22(1), 86–93.

  47. Wiegers, K. E. (2003). Software requirements (2nd ed.). Redmond, WA: Microsoft Press.

  48. Yin, R. K. (1993). Applications of case study research. Sage Publications.

Author information

Corresponding author

Correspondence to Mahmood Niazi.

Appendices

Appendix A: An example of requirements category assessment

The following example shows how REMMF measures the capability of the ‘describing requirements’ category. The practices listed in Table 3 define the describing requirements category. Practice “DR1” is highlighted as an example.

Table 3 Measuring capability example

Three elements of each RE practice are measured: the approach, the deployment and the results. The objective is to assess the strength of each individual RE practice as well as that of the RE process category.

The first of the three measurement elements is based on the participant’s understanding of the organisation’s approach to the RE practice, i.e. the organisation’s commitment to and management support for the practice, as well as the organisation’s ability to implement it. Table 4 gives an example of how a participant might respond. The RE practice is as follows.

Table 4 Approach

DR1: Define standard templates for describing requirements

The respondent should tick one of the options in the ‘Score’ column. Using their expert understanding and by collecting relevant information from different sources, imagine that the respondent selects Weak (2) (i.e. Management begins to recognize need).
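For orientation, the element scores used throughout this appendix come from a discrete rating scale; the approach/deployment/results breakdown resembles Daskalantonakis’s evaluation method (reference 9), and the sketch below is our reconstruction of that scale. Only the Weak, Fair and Marginally qualified labels are confirmed by the examples in this appendix, so the remaining entries should be read as assumptions.

```python
# Reconstruction of the REMMF element rating scale (assumed, following
# Daskalantonakis, 1994). Only Weak (2), Fair (4) and Marginally
# qualified (6) are confirmed by the worked examples in this appendix.
RATING_SCALE = {
    0: "Poor",
    2: "Weak",
    4: "Fair",
    6: "Marginally qualified",
    8: "Qualified",
    10: "Outstanding",
}
```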

The second element assesses how a practice is deployed in the organisation, i.e. the breadth and consistency of practice implementation across project areas. Table 5 gives an example of how a participant might respond. The RE practice is as follows.

Table 5 Deployment

DR1: Define standard templates for describing requirements

The respondent should tick one of the options in the ‘Score’ column. Using their expert understanding and by collecting relevant information from different sources, imagine the respondent selects Fair (4) (i.e. Less fragmented use).

The last element assesses the breadth and consistency of positive results over time and across project areas (using that particular practice). Table 6 gives an example of how a participant might respond. The RE practice is as follows.

Table 6 Results

DR1: Define standard templates for describing requirements

The respondent should tick one of the options in the ‘Score’ column. Using their expert understanding and by collecting relevant information from different sources, imagine the respondent selects Marginally qualified (6) (i.e. Positive measurable results in most parts of the organisation).

The combined score over the three elements is (2 + 4 + 6)/3 = 4. So we can say that the DR1 practice is not strong (i.e. its score is <7) and can be considered FAIR.

These three measurements are made for every RE practice in a given requirements category. The scores for the practices are then summed and averaged to give an overall score for that ‘requirements process category’.
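To make the scoring procedure concrete, here is a minimal Python sketch (the function names are our own, not part of REMMF) that computes a practice score as the mean of its three element scores and a category score as the mean of its practice scores:

```python
from statistics import mean

def practice_score(approach: float, deployment: float, results: float) -> float:
    """Score for one RE practice: the mean of its three element scores."""
    return mean([approach, deployment, results])

def category_score(practice_scores: list[float]) -> float:
    """Overall score for a requirements process category: the mean of
    the scores of the practices that define the category."""
    return mean(practice_scores)

# Worked example from above: DR1 scored Weak (2) for approach, Fair (4)
# for deployment and Marginally qualified (6) for results.
dr1 = practice_score(approach=2, deployment=4, results=6)
print(dr1)      # 4.0
print(dr1 < 7)  # True: DR1 is not strong, so it is considered FAIR
```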

Appendix B: Assessment summaries of all RE categories

ID Type Practice Organisation A Organisation B
The 3-dimensional scores of requirements documents practices
RD1 Basic Define a standard document structure 5 8
RD2 Basic Explain how to use the document 5 8
RD3 Basic Include a summary of the requirements 0 9
RD4 Basic Make a business case for the system 3 9
RD5 Basic Define specialized terms 5 7
RD6 Basic Make document layout readable 5 9
RD7 Basic Help readers find information 5 8
RD8 Basic Make the document easy to change 3 6
The 3-dimensional overall score of requirements document category 3.8 8
The 3-dimensional scores of requirements elicitation practices
RE1 Basic Assess system feasibility 7 6
RE2 Basic Be sensitive to organisational and political considerations 7 7
RE3 Basic Identify and consult system stakeholders 7 7
RE4 Basic Record requirements sources 5 5
RE5 Basic Define the system’s operating environment 6 8
RE6 Basic Use business concerns to drive requirements elicitation 6 8
RE7 Intermediate Look for domain constraints 6 8
RE8 Intermediate Record requirements rationale 0 7
RE9 Intermediate Collect requirements from multiple viewpoints 0 6
RE10 Intermediate Prototype poorly understood requirements 0 7
RE11 Intermediate Use scenarios to elicit requirements 0 8
RE12 Intermediate Define operational processes 4 6
RE13 Advanced Reuse requirements 0 8
The 3-dimensional overall score of requirements elicitation category 3.6 7
The 3-dimensional scores of requirements analysis and negotiation practices
RA1 Basic Define system boundaries 1 8
RA2 Basic Use checklists for requirements analysis 0 6
RA3 Basic Provide software to support negotiations 0 6
RA4 Basic Plan for conflicts and conflict resolution 0 8
RA5 Basic Prioritise requirements 0 9
RA6 Intermediate Classify requirements using a multi-dimensional approach 0 7
RA7 Intermediate Use interaction matrices to find conflicts and overlaps 0 6
RA8 Advanced Assess requirements risks 0 7
The 3-dimensional overall score of requirements analysis and negotiation category 0 7
The 3-dimensional scores of describing requirements practices
DR1 Basic Define standard templates for describing requirements 1 9
DR2 Basic Use language simply and concisely 3 9
DR3 Basic Use diagrams appropriately 5 7
DR4 Basic Supplement natural language with other descriptions of requirements 5 9
DR5 Intermediate Specify requirements quantitatively 0 7
The 3-dimensional overall score of describing requirements category 2.6 8
The 3-dimensional scores of systems modelling practices
SM1 Basic Develop complementary system models 6 6
SM2 Basic Model the system’s environment 6 7
SM3 Basic Model the system architecture 6 7
SM4 Intermediate Use structured methods for system modelling 2 6
SM5 Intermediate Use a data dictionary 0 8
SM6 Intermediate Document the links between stakeholder requirements and system models 0 7
The 3-dimensional overall score of systems modelling category 3.3 6.8
The 3-dimensional scores of requirements validation practices
RV1 Basic Check that the requirements document meets your standards 1 5
RV2 Basic Organise formal requirements inspections 0 5
RV3 Basic Use multi-disciplinary teams to review requirements 0 5
RV4 Basic Define validation checklists 0 6
RV5 Intermediate Use prototyping to animate requirements 0 5
RV6 Intermediate Write a draft user manual 8 7
RV7 Intermediate Propose requirements test cases 0 7
RV8 Advanced Paraphrase system models 1 6
The 3-dimensional overall score of requirements validation category 1.2 5.7
The 3-dimensional scores of requirements management practices
RM1 Basic Uniquely identify each requirement 0 8
RM2 Basic Define policies for requirements management 0 7
RM3 Basic Define traceability policies 0 6
RM4 Basic Maintain a traceability manual 0 6
RM5 Intermediate Use a database to manage requirements 0 7
RM6 Intermediate Define change management policies 0 8
RM7 Intermediate Identify global system requirements 0 7
RM8 Advanced Identify volatile requirements 0 7
RM9 Advanced Record rejected requirements 0 6
The 3-dimensional overall score of requirements management category 0 6.7
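As a cross-check, a category’s overall score in the table above can be recomputed from its practice scores. A minimal sketch, assuming the reported values are truncated (not rounded) to one decimal place:

```python
from statistics import mean

def category_score(practice_scores: list[int]) -> float:
    """Mean of the practice scores, truncated to one decimal place.
    Truncation is our assumption; it matches most reported values."""
    return int(mean(practice_scores) * 10) / 10

# Organisation A's scores for the requirements document practices
# RD1-RD8, taken from the table above.
org_a_requirements_document = [5, 5, 0, 3, 5, 5, 5, 3]
print(category_score(org_a_requirements_document))  # 3.8, as reported
```

A few reported values (e.g. Organisation A’s describing requirements score of 2.6, where the practice scores average 2.8) deviate slightly from this simple average, so the exact rule is an assumption on our part.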

Cite this article

Niazi, M., Cox, K. & Verner, J. A measurement framework for assessing the maturity of requirements engineering process. Software Qual J 16, 213–235 (2008). https://doi.org/10.1007/s11219-007-9033-4

Keywords

  • Process maturity
  • Process improvement
  • Requirements engineering