Empirical Software Engineering, Volume 19, Issue 3, pp 419–464

Examination of the software architecture change characterization scheme using three empirical studies

  • Byron J. Williams
  • Jeffrey C. Carver


Software maintenance is one of the most crucial aspects of software development. Software engineering researchers must develop practical solutions to the challenges of maintaining mature software systems. Research that addresses practical means of mitigating the risks involved in changing software, reducing the complexity of mature software systems, and preventing the introduction of avoidable bugs is paramount to today’s software engineering discipline. The Software Architecture Change Characterization Scheme (SACCS) provides software maintainers with a systematic approach to analyzing and characterizing the impact of a change prior to its implementation. SACCS was designed to help novice developers understand change requests, facilitate discussion among developers, and produce higher-quality changes than an ad hoc approach. This paper describes three controlled experiments designed to assess the viability of SACCS and its ability to fulfill these goals. The successive studies build upon each other to enable progressive insights into the viability of the scheme. The results indicate that SACCS: 1) provides insight into the difficulty of a change request by helping novice developers consider the various ways the request may impact the system, 2) facilitates discussion among developers by providing a common tool for change assessment, and 3) is a useful tool for supporting change implementation. The three experiments provide insight into the usefulness of SACCS, motivate additional research questions, and serve as a baseline for further research and development of the approach.


Keywords: Software architecture · Change characterization · Software changes · Software maintenance · Empirical studies



Acknowledgments

The authors thank the Empirical Software Engineering Research group at Mississippi State University and the Fraunhofer Center for Experimental Software Engineering in College Park, MD for feedback on the research and use of the TSAFE program. The authors also thank the study participants. This research was funded by NSF Grant CCF-0438923.


References

  1. Basili V, Caldiera G, Rombach HD (1994) The Goal Question Metric Paradigm. In: Marciniak JJ (ed) Encyclopedia of Software Engineering. Wiley, New York, pp 528–532
  2. Batory D (2005) Feature models, grammars, and propositional formulas. In: Proceedings of the 9th International Conference on Software Product Lines, Rennes, France
  3. Belady LA, Lehman MM (1976) A model of large program development. IBM Syst J 15(1):225–252
  4. Bosch J (2000) Design and Use of Software Architectures. Addison-Wesley
  5. Briand LC, Labiche Y, O'Sullivan L (2003) Impact analysis and change management of UML models. In: Proceedings of the International Conference on Software Maintenance, pp 256–265
  6. Briand LC, Labiche Y, O’Sullivan L, Sówka MM (2006) Automated impact analysis of UML models. J Syst Softw 79(3):339–352
  7. Brooks F (1975) The Mythical Man-Month. Addison-Wesley
  8. Carver J, Jaccheri L, Morasca S, Shull F (2003) Issues in using students in empirical studies in software engineering education. In: Proceedings of the Ninth International Software Metrics Symposium, pp 239–249
  9. Chaumun MA, Kabaili H, Keller RK, Lustman F (2002) A change impact model for changeability assessment in object-oriented software systems. Sci Comput Program 45(2):155–177
  10. Chesley OC, Ren X, Ryder BG (2005) Crisp: a debugging tool for Java programs, pp 401–410
  11. Clements P, Bachmann F, Bass L, Garlan D, Ivers J, Little R, Merson P, Nord R, Stafford J (2010) Documenting Software Architectures: Views and Beyond. Addison-Wesley Professional
  12. Cook S, He J, Harrison R (2001) Dynamic and static views of software evolution. In: Proceedings of the IEEE International Conference on Software Maintenance, pp 592–601
  13. Dennis G (2003) TSAFE: Building a Trusted Computing Base for Air Traffic Control Software. Dissertation, Massachusetts Institute of Technology
  14. Eick SG, Graves TL, Karr AF, Marron JS, Mockus A (2001) Does code decay? Assessing the evidence from change management data. IEEE Trans Softw Eng 27(1):1–12
  15. Ferzund J, Ahsan SN, Wotawa F (2009) Software change classification using hunk metrics. In: Proceedings of the IEEE International Conference on Software Maintenance (ICSM 2009), pp 471–474. doi: 10.1109/icsm.2009.5306274
  16. Fluri B, Gall HC (2006) Classifying change types for qualifying change couplings. In: Proceedings of the 14th IEEE Conference on Program Comprehension, Athens, Greece, pp 35–45
  17. Giroux O, Robillard MP (2006) Detecting increases in feature coupling using regression tests. In: Proceedings of the 14th ACM SIGSOFT International Symposium on Foundations of Software Engineering, Portland, Oregon, USA
  18. Godfrey MW, German DM (2008) The past, present, and future of software evolution. In: Frontiers of Software Maintenance, pp 129–138
  19. Godfrey MW, Tu Q (2000) Evolution in open source software: a case study. In: Proceedings of the International Conference on Software Maintenance, pp 131–142
  20. Graves TL, Mockus A (1998) Inferring change effort from configuration management databases. In: Proceedings of the Fifth International Software Metrics Symposium, pp 267–273
  21. Herraiz I, Robles G, Gonzalez-Barahona JM, Capiluppi A, Ramil JF (2006) Comparison between SLOCs and number of files as size metrics for software evolution analysis. In: Proceedings of the 10th European Conference on Software Maintenance and Reengineering, pp 203–210
  22. Hochstein L, Lindvall M (2005) Combating architectural degeneration: a survey. Inf Softw Technol 47(10):643–656
  23. IEEE standard glossary of software engineering terminology (1990) IEEE Std 610.12-1990
  24. Kang K, Cohen S, Hess J, Nowak W, Peterson S (1990) Feature-oriented domain analysis (FODA) feasibility study. Carnegie Mellon University, Software Engineering Institute
  25. Kim S, Whitehead EJ, Bevan J (2005) Analysis of signature change patterns. In: Proceedings of the 2005 International Workshop on Mining Software Repositories, St. Louis, Missouri. ACM Press. doi: 10.1145/1083142.1083154
  26. Kung D, Gao J, Hsia P, Wen F, Toyoshima Y, Chen C (1994) Change impact identification in object oriented software maintenance. In: Proceedings of the International Conference on Software Maintenance, Victoria, BC, pp 202–211
  27. Lehman MM (1980) Programs, life cycles, and laws of software evolution. Proc IEEE 68(9):1060–1076
  28. Lehman MM (1996) Feedback, evolution and software technology. In: Proceedings of the 10th International Software Process Workshop (Process Support of Software Product Lines), pp 101–103
  29. Lehman MM, Belady L (1985) Program Evolution: Processes of Software Change. Academic Press, London
  30. Lehman MM, Perry DE, Ramil JF (1998a) Implications of evolution metrics on software maintenance. In: Proceedings of the International Conference on Software Maintenance, Bethesda, MD, pp 208–217
  31. Lehman MM, Perry DE, Ramil JF (1998b) On evidence supporting the FEAST hypothesis and the laws of software evolution. In: Proceedings of the Fifth International Software Metrics Symposium, pp 84–88
  32. Li PL, Shaw M, Herbsleb J, Ray B, Santhanam P (2004) Empirical evaluation of defect projection models for widely-deployed production software systems. SIGSOFT Softw Eng Notes 29(6):263–272. doi: 10.1145/1041685.1029930
  33. Lientz B, Swanson B (1980) Software Maintenance Management. Addison-Wesley
  34. Lindvall M, Tesoriero R, Costa P (2002) Avoiding architectural degeneration: an evaluation process for software architecture. In: Proceedings of the Eighth IEEE Symposium on Software Metrics, pp 77–86
  35. Mohagheghi P, Conradi R (2004) An empirical study of software change: origin, acceptance rate, and functionality vs. quality attributes. In: Proceedings of the 2004 International Symposium on Empirical Software Engineering (ISESE ’04), pp 7–16
  36. Nedstam J, Karlsson EA, Host M (2004) The architectural change process. In: Proceedings of the 2004 International Symposium on Empirical Software Engineering (ISESE ’04), pp 27–36
  37. Nurmuliani N, Zowghi D, Williams SP (2004) Using card sorting technique to classify requirements change. In: Proceedings of the 12th IEEE International Requirements Engineering Conference, pp 240–248
  38. Ostrand TJ, Weyuker EJ, Bell RM (2007) Automating algorithms for the identification of fault-prone files. In: Proceedings of the 2007 International Symposium on Software Testing and Analysis, London, United Kingdom
  39. Parnas DL (1994) Software aging. In: Proceedings of the 16th International Conference on Software Engineering, Sorrento, Italy, pp 279–287
  40. Raja U, Barry E (2005) Investigating quality in large-scale Open Source Software. In: Proceedings of the Fifth Workshop on Open Source Software Engineering, St. Louis, Missouri
  41. Ren X, Shah F, Tip F, Ryder BG, Chesley O (2004a) Chianti: a tool for change impact analysis of Java programs. In: Proceedings of the 19th Annual ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications, Vancouver, BC, Canada
  42. Ren X, Shah F, Tip F, Ryder BG, Chesley O (2004b) Chianti: a tool for change impact analysis of Java programs. In: Proceedings of the 19th Annual ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications. ACM Press, Vancouver, BC, Canada, pp 432–448
  43. Sommerville I (2004) Software Engineering, 7th edn. Addison-Wesley
  44. Van Rysselberghe F, Demeyer S (2004) Mining version control systems for FACs (frequently applied changes). In: Proceedings of the 26th International Conference on Software Engineering, Edinburgh, Scotland, pp 48–52
  45. Williams B (2006) A Framework for Assessing the Impact of Software Changes to Software Architecture Using Change Classification. Master's Thesis, Mississippi State University, Starkville, MS
  46. Williams B (2009) Change Decision Support: Extraction and Analysis of Late Architecture Changes Using Change Characterization and Software Metrics. Doctoral Dissertation, Mississippi State University, Starkville, MS
  47. Williams B, Carver J (2007) Characterizing software architecture changes: an initial study. In: Proceedings of the First International Conference on Empirical Software Engineering and Measurement, Madrid, Spain, pp 410–419
  48. Williams B, Carver J (2010) Characterizing software architecture changes: a systematic review. Inf Softw Technol 52(1):31–51

Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Department of Computer Science and Engineering, Mississippi State University, Starkville, USA
  2. Department of Computer Science, University of Alabama, Tuscaloosa, USA
