Large-Scale Enterprise Systems: Changes and Impacts

  • Wen Chen
  • Asif Iqbal
  • Akbar Abdrakhmanov
  • Jay Parlar
  • Chris George
  • Mark Lawford
  • Tom Maibaum
  • Alan Wassyng
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 141)

Abstract

Changes to large-scale enterprise systems are critical, and their impacts are hard to identify and quantify. This work focuses on analysing changes and their potential impacts, and in particular on how regression testing following such changes can be minimised. The approach we describe targets systems containing hundreds of thousands of classes and millions of methods, where applying conventional regression testing techniques is extremely difficult and costly. Retesting everything after a change is very expensive and often unnecessary, yet selective retesting is dangerous if the impacts of the change are not understood, and analysing such systems to work out what has changed and what is affected is itself difficult. This paper proposes a change impact analysis that makes efficient, targeted regression testing of enterprise systems possible. Our approach has been applied to a large system comprising 4.6 million methods with 10 million dependencies between them. Using it, maintainers can focus on a smaller, relevant subset of their test suites instead of testing blindly. We include a case study that illustrates the savings that can be attained.
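At its core, the approach summarised above computes which methods a change can reach along dependency edges and then reruns only the tests that exercise those methods. The sketch below illustrates that general style of analysis on a toy reverse-dependency graph; the class name ImpactSketch, the impactedMethods traversal, the selectTests mapping, and the test-coverage map are illustrative assumptions for this page, not the authors' tooling or data.

```java
import java.util.*;

/** Sketch: dependency-graph change impact analysis and targeted test selection. */
public class ImpactSketch {

    // Reverse dependency edges: a method -> the methods that depend on it.
    // (Illustrative structure; the paper builds its graph by static analysis.)
    private final Map<String, Set<String>> reverseDeps = new HashMap<>();

    // Test name -> methods it is known to exercise (assumed coverage mapping).
    private final Map<String, Set<String>> testCoverage = new HashMap<>();

    public void addDependency(String from, String to) {
        // "from" depends on "to", so a change to "to" may impact "from".
        reverseDeps.computeIfAbsent(to, k -> new HashSet<>()).add(from);
    }

    public void addTest(String test, Collection<String> coveredMethods) {
        testCoverage.put(test, new HashSet<>(coveredMethods));
    }

    /** All methods reachable from the changed ones along reverse dependencies. */
    public Set<String> impactedMethods(Collection<String> changedMethods) {
        Set<String> impacted = new HashSet<>(changedMethods);
        Deque<String> work = new ArrayDeque<>(changedMethods);
        while (!work.isEmpty()) {
            String m = work.pop();
            for (String dependent : reverseDeps.getOrDefault(m, Set.of())) {
                if (impacted.add(dependent)) {
                    work.push(dependent);
                }
            }
        }
        return impacted;
    }

    /** Tests that exercise at least one impacted method: the subset to rerun. */
    public Set<String> selectTests(Collection<String> changedMethods) {
        Set<String> impacted = impactedMethods(changedMethods);
        Set<String> selected = new HashSet<>();
        for (Map.Entry<String, Set<String>> e : testCoverage.entrySet()) {
            if (!Collections.disjoint(e.getValue(), impacted)) {
                selected.add(e.getKey());
            }
        }
        return selected;
    }

    public static void main(String[] args) {
        ImpactSketch sketch = new ImpactSketch();
        sketch.addDependency("OrderService.place", "PriceCalc.total");
        sketch.addDependency("InvoiceJob.run", "OrderService.place");
        sketch.addTest("OrderServiceTest", List.of("OrderService.place", "PriceCalc.total"));
        sketch.addTest("ReportTest", List.of("ReportGen.render"));
        // A change to PriceCalc.total selects only OrderServiceTest.
        System.out.println(sketch.selectTests(List.of("PriceCalc.total")));
    }
}
```

On the toy data in main, changing PriceCalc.total selects only OrderServiceTest; this is the kind of smaller, relevant test subset the abstract refers to, here on a graph of three methods rather than 4.6 million.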

Keywords

Large-scale enterprise systems · Impact analysis · Static analysis · Dependency graph

Acknowledgments

The authors are grateful to Ron Mison, Sundaram Viswanathan, and Vinayak Viswanathan of Legacy Systems International, for introducing us to the problem, and also for working with us to ensure that our research produces practical methods and tools. We also thank John Hatcliff for his advice and for pointing us to relevant work, and Wolfram Kahl for his technical advice throughout the project.

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Wen Chen 1
  • Asif Iqbal 1
  • Akbar Abdrakhmanov 1
  • Jay Parlar 1
  • Chris George 1
  • Mark Lawford 1
  • Tom Maibaum 1
  • Alan Wassyng 1
  1. McMaster Centre for Software Certification, McMaster University, Hamilton, Canada
