
A metric for quantifying the ripple effects among requirements


During software maintenance, it is often costlier to identify and understand the artifacts that need to be changed than to actually apply the change. In addition to identifying the artifacts related to the change itself, one also needs to identify the artifacts that change due to ripple effects. In this paper, we focus on ripple effects and propose a metric for assessing the probability that one requirement is affected by a change in another requirement (i.e., a requirements ripple effect). We focus on the requirements level, since most maintenance tickets (which stem from the customer) are captured in natural language and are therefore more naturally mapped to requirements than to source code. The proposed metric, the requirements ripple effect measure (R2EM), is calculated by considering the conceptual overlap between the involved requirements (through their past co-change), the parts of the code in which they are implemented (i.e., their overlapping implementations), and the underlying dependencies of the source code (i.e., ripple effects between classes). Despite the involvement of source code artifacts in its calculation, R2EM is a requirements-level metric, since the unit of analysis is pairs of software requirements. To validate the proposed metric, we conducted an industrial case study on two enterprise applications of an SME. The study design involved both quantitative and qualitative data, with input from nine practitioners. The results suggest that R2EM identifies ripple effects between requirements at a satisfactory level, and that these effects are mostly caused by overlapping implementations and the source code ripple effects of these implementations.
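The combination of factors described above can be sketched in code. Note that this is only an illustrative sketch: the exact R2EM formula is defined in the paper, and the factor names, the [0, 1] normalization, and the probabilistic-union aggregation used here are assumptions made purely for illustration.

```python
# Illustrative sketch only: the exact R2EM formula is defined in the paper.
# Factor names, [0, 1] normalization, and the aggregation are assumptions.

def r2em(past_co_change: float, impl_overlap: float, code_ripple: float) -> float:
    """Estimate the probability that a change in requirement A ripples to B.

    past_co_change: normalized frequency with which A and B changed together
    impl_overlap:   share of classes that implement both A and B
    code_ripple:    probability of ripple effects between the classes
                    implementing A and those implementing B

    Each factor is treated as an independent propagation channel, so the
    result is the probability that at least one channel carries the change.
    """
    factors = (past_co_change, impl_overlap, code_ripple)
    if not all(0.0 <= f <= 1.0 for f in factors):
        raise ValueError("all factors must be normalized to [0, 1]")
    p_none = 1.0  # probability that no channel propagates the change
    for f in factors:
        p_none *= 1.0 - f
    return 1.0 - p_none

print(round(r2em(0.5, 0.5, 0.0), 4))  # 0.75 under this illustrative aggregation
```

Under this sketch, a pair of requirements with no shared history, no overlapping implementation, and no code-level dependencies scores 0, while any single strong channel drives the score toward 1.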




  1. We cannot claim that this list is exhaustive. Although we identified no other type of dependency in the case study, other types may exist, so we have recorded a corresponding threat to validity.

  2. We note that for REM the frequency of past changes is captured through PCCC, whereas for R2EM it is captured through PCRC.


  4. The way the requirements are selected is discussed in Sect. 4.2.




This work was financially supported by the action “Strengthening Human Resources Research Potential via Doctorate Research” of the Operational Program “Human Resources Development Program, Education and Lifelong Learning, 2014–2020”, implemented by the State Scholarship Foundation (IKY) and co-financed by the European Social Fund and the Greek state (National Strategic Reference Framework (NSRF) 2014–2020).

Author information

Authors and Affiliations


Corresponding author

Correspondence to Apostolos Ampatzoglou.


Appendix: case study material


Case study time plan

[Figure: case study time plan]


The questionnaire is structured identically for all selected requirements: Bill Read (YDATA-BR), Alert Create (YDATA-LC), Statement Create (YDATA-SC), Payment Create (YDATA-PC), Citizen Create (CR-CC), Birth Create (CR-BC), Marriage Update (CR-MU), and Name-giving Create (CR-NC).

[Figures: one questionnaire form per selected requirement]

Focus group questions

  1. Do you think that requirements which deal with the same entity are likely to co-change due to the ripple effect?

     a. CR-CU with CR-CC (high)
     b. CR-MU with CR-MR (medium)
     c. CR-PD with CR-PR (low)

  2. Do you think that requirements which perform the same action are likely to co-change due to the ripple effect?

     a. CR-MC with CR-CC (high)
     b. CR-CU with CR-NU (medium)
     c. CR-BD with CR-MD (low)

  3. Do you think that requirements which deal with the same entity are likely to co-change due to the ripple effect?

     a. YDATA-BU with YDATA-BR (high)
     b. YDATA-SR with YDATA-SD (medium)
     c. YDATA-HU with YDATA-HD (low)

  4. Do you think that requirements which perform the same action on different entities are likely to co-change due to the ripple effect?

     a. YDATA-HR with YDATA-AR (high)
     b. YDATA-UC with YDATA-PC (medium)
     c. YDATA-LR with YDATA-PR (low)

  5. Which of the aforementioned classes are the most important?


Cite this article

Arvanitou, EM., Ampatzoglou, A., Chatzigeorgiou, A. et al. A metric for quantifying the ripple effects among requirements. Software Qual J 30, 853–883 (2022).



Keywords

  • Metrics
  • Change impact analysis
  • Requirements
  • Maintenance