Empirical Software Engineering

Volume 22, Issue 2, pp 631–669

Do Programmers do Change Impact Analysis in Debugging?

Abstract

“Change Impact Analysis” is the process of determining the consequences of a modification to software. In theory, change impact analysis should be performed during software maintenance to make sure that changes do not introduce new bugs. Many approaches and techniques have been proposed to help programmers perform change impact analysis automatically. However, it remains an open question whether and how programmers actually do change impact analysis. In this paper, we conducted two studies: an in-depth study and a breadth study. For the in-depth study, we recorded videos of nine professional programmers repairing two bugs for two hours. For the breadth study, we surveyed 35 professional programmers using an online system. We found that the programmers in our studies did static change impact analysis before making changes by using IDE navigational functionalities, and that they did dynamic change impact analysis after making changes by running the programs. We also found that they did not use any change impact analysis tools.
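The abstract distinguishes static change impact analysis (reasoning about dependencies before a change is made) from dynamic change impact analysis (running the program after the change). As a minimal illustration only, and not taken from the paper, the Python sketch below shows one common static approach: traversing a caller graph to find every function that transitively depends on a changed function. The call graph and all function names are hypothetical.

```python
# Illustrative sketch (not from the paper): a toy static change impact analysis.
# Given a caller -> callees map extracted from source code, report every
# function that can transitively reach the changed function, i.e. code whose
# behaviour the change might affect.
from collections import defaultdict


def impacted_by(changed_fn, call_graph):
    """Return the set of functions that may be impacted if `changed_fn` is modified."""
    # Invert the call graph: map each function to its callers.
    callers = defaultdict(set)
    for caller, callees in call_graph.items():
        for callee in callees:
            callers[callee].add(caller)

    impacted, worklist = set(), [changed_fn]
    while worklist:
        fn = worklist.pop()
        for caller in callers[fn]:
            if caller not in impacted:
                impacted.add(caller)
                worklist.append(caller)
    return impacted


if __name__ == "__main__":
    # Hypothetical call graph of a small program.
    graph = {
        "main": {"load_config", "run"},
        "run": {"parse", "render"},
        "render": {"format_date"},
    }
    # Prints {'render', 'run', 'main'} (set order may vary).
    print(impacted_by("format_date", graph))
```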

Keywords

Change impact analysis · Program debugging · Empirical software engineering · Software maintenance · Programmer navigation

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Siyuan Jiang (1)
  • Collin McMillan (1)
  • Raul Santelices (2)

  1. Department of Computer Science and Engineering, University of Notre Dame, Notre Dame, USA
  2. Dell SecureWorks, Atlanta, USA