Object coverage criteria for supporting object-oriented testing

Research article, published in Software Quality Journal

Abstract

Code coverage criteria are widely used as test quality indicators in object-oriented (OO) domains. However, these criteria take a procedural point of view and therefore do not address the specific features of OO programs. In this article, we extend conventional code coverage criteria and introduce a new set of criteria, called “object coverage criteria,” that cope with OO features such as object instantiation, inheritance, polymorphism, and dynamic binding. Unlike previous criteria, the new criteria take into account the actual type of the object under test, as well as the code inherited from parent/ancestor classes that defines the object’s state and behavior. The new criteria have been implemented in a prototype tool, OCov4J, for the Java language. Using this tool in an empirical study on 270 classes (about 50,000 lines of code, excluding blank lines and comments) from several large, widely used open source projects, we found a considerable positive correlation between the object coverage level, as defined by the proposed criteria, and the number of detected OO-specific failures. The proposed criteria are not only easy to use, highly automatable, and cheap to execute, but can also be applied effectively to real-world OO programs.
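To make the intuition behind the criteria concrete, the following minimal Java sketch (hypothetical class names, not drawn from the OCov4J code base) illustrates the kind of gap object coverage is meant to expose: a test can achieve full statement coverage of an inherited method through a parent-class instance, while the same statements are never executed in the context of a subclass object whose overridden method changes the dynamically bound behavior.

    // Hypothetical illustration: statement coverage of Account.withdraw()
    // can reach 100% via a plain Account instance, yet the inherited code
    // is never exercised on a SavingsAccount, whose overridden limit()
    // changes what the dynamically bound call inside withdraw() returns.
    class Account {
        protected double balance = 100.0;

        protected double limit() {             // may be overridden by subclasses
            return 0.0;
        }

        public boolean withdraw(double amount) {
            if (balance - amount < limit()) {  // dynamically bound call
                return false;
            }
            balance -= amount;
            return true;
        }
    }

    class SavingsAccount extends Account {
        @Override
        protected double limit() {             // subclass changes the semantics
            return 50.0;
        }
    }

    public class CoverageGapDemo {
        public static void main(String[] args) {
            Account a = new Account();
            System.out.println(a.withdraw(80.0));  // true: covers every line of withdraw()

            Account s = new SavingsAccount();
            System.out.println(s.withdraw(80.0));  // false: overridden limit() alters the outcome
        }
    }

A suite containing only the first call reports withdraw() as fully covered by conventional statement coverage, whereas a criterion that tracks the dynamic type of the receiver, in the spirit of the proposed object coverage criteria, would flag the inherited code as uncovered for SavingsAccount objects.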


Availability of data and materials

The data that support the findings of this study are openly available in the GitHub repository mu-runner at https://github.com/sbu-test-lab/mu-runner. The coverage extraction tool OCov4J is also available as an open-source project at https://github.com/sbu-test-lab/ocov4j.

Notes

  1. https://github.com/jacoco/jacoco

  2. https://github.com/sbu-test-lab/ocov4j.git

  3. https://asm.ow2.io/

  4. https://github.com/BasLeijdekkers/MetricsReloaded

  5. https://github.com/sbu-test-lab/jtetris

  6. https://commons.apache.org/validator/

  7. https://commons.apache.org/bcel/

  8. http://commons.apache.org/collections/

  9. https://github.com/google/guava

  10. https://github.com/ta4j/ta4j

  11. https://github.com/sbu-test-lab/mu-runner


Author information

Contributions

Mohammad Ghoreshi presented the research idea, implemented it, and performed the related experiments. Hassan Haghighi designed the research and evaluation methodology, introduced the test data, and supervised the entire research process. Both authors discussed and validated the results and contributed to the final manuscript.

Corresponding author

Correspondence to H. Haghighi.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cite this article

Ghoreshi, M., Haghighi, H. Object coverage criteria for supporting object-oriented testing. Software Qual J 31, 1369–1414 (2023). https://doi.org/10.1007/s11219-023-09643-3

