Journal of Computer Science and Technology, Volume 21, Issue 6, pp 965–972

Code Based Analysis for Object-Oriented Systems

  • Swapan Bhattacharya
  • Ananya Kanjilal
Regular Paper


The basic features of object-oriented software make it difficult to apply traditional testing methods to object-oriented systems. The Control Flow Graph (CFG) is a well-known model used to identify independent paths in procedural software. This paper highlights the problems of constructing a CFG for object-oriented systems and proposes a new model, the Extended Control Flow Graph (ECFG), for code-based analysis of Object-Oriented (OO) software. The ECFG is a layered CFG in which nodes refer to methods rather than statements. A new metric, Extended Cyclomatic Complexity (E-CC), is developed; it is analogous to McCabe's Cyclomatic Complexity (CC) and gives the number of independent execution paths within the OO software. The different ways in which the CFGs of individual methods are connected in an ECFG are presented, and formulas for E-CC in each of these cases are proposed. Finally, we consider an example in Java and, based on its ECFG, apply these cases to arrive at the E-CC of the total system; we also propose a methodology for calculating the basis set, i.e., the set of independent paths for the OO system, which will help in the creation of test cases for code testing.


Keywords: object-oriented testing; extended control flow graph; extended cyclomatic complexity; test paths; graph-based testing
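The abstract notes that E-CC is analogous to McCabe's Cyclomatic Complexity, which for a control flow graph with E edges, N nodes, and P connected components is V(G) = E − N + 2P. A minimal sketch of that baseline computation follows; the class and method names here are illustrative, not taken from the paper, and the paper's E-CC formulas for connected method-level CFGs go beyond this.

```java
// Sketch: McCabe's cyclomatic complexity for a control flow graph,
// computed as V(G) = E - N + 2P. Names are illustrative only.
public class CyclomaticComplexity {
    // edges: number of CFG edges, nodes: number of CFG nodes,
    // components: number of connected components (1 for a single method)
    public static int vOfG(int edges, int nodes, int components) {
        return edges - nodes + 2 * components;
    }

    public static void main(String[] args) {
        // A simple if-then-else CFG: 4 nodes, 4 edges, 1 component.
        // V(G) = 4 - 4 + 2 = 2, i.e., two independent paths, so the
        // basis set for this method contains two test paths.
        System.out.println(vOfG(4, 4, 1));
    }
}
```

V(G) equals the size of the basis set of independent paths, which is why the paper's E-CC, computed over the layered ECFG, can drive test-case creation for the whole OO system in the same way.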





Copyright information

© Springer Science + Business Media, Inc. 2006

Authors and Affiliations

  1. Department of Computer Science and Engineering, Jadavpur University, Kolkata, India
  2. Department of Information Technology, B. P. Poddar Institute of Management and Technology, Kolkata, India
