State Coverage: Software Validation Metrics beyond Code Coverage

  • Dries Vanoverberghe
  • Jonathan de Halleux
  • Nikolai Tillmann
  • Frank Piessens
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7147)


Testing is currently still the most important approach to reducing the number of software defects. Software quality metrics help prioritize where additional testing is necessary by measuring the quality of the code. Most approaches for estimating whether a unit of code is sufficiently tested are based on code coverage, which measures which code fragments are exercised by the test suite. Unfortunately, code coverage does not measure to what extent the test suite checks the intended functionality.

We propose state coverage, a metric that measures the ratio of state updates that are read by assertions to the total number of state updates, and we present efficient algorithms to measure it. Like code coverage, state coverage is simple to understand, and we show that it is efficient to measure and easy to aggregate. In a preliminary evaluation on several open-source libraries, state coverage helped to identify multiple unchecked properties and to detect several bugs.
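To make the metric concrete, the following is a minimal sketch of the idea in Python. The class name, method names, and the bookkeeping scheme are our illustration, not the authors' implementation (which instruments .NET code): each field write counts as a state update, and an update counts as checked if an assertion later reads that field before it is overwritten.

```python
class StateTracker:
    """Illustrative tracker: state coverage = checked updates / total updates."""

    def __init__(self):
        self.total_updates = 0     # every recorded field write
        self.checked_updates = 0   # writes later read by an assertion
        self._unchecked = {}       # field -> True if its latest write is unchecked

    def record_write(self, field):
        """Called whenever the code under test updates `field`."""
        self.total_updates += 1
        self._unchecked[field] = True

    def record_assert_read(self, field):
        """Called whenever an assertion in the test suite reads `field`."""
        if self._unchecked.get(field):
            self.checked_updates += 1
            self._unchecked[field] = False

    def coverage(self):
        """Fraction of state updates that some assertion checked."""
        if self.total_updates == 0:
            return 1.0
        return self.checked_updates / self.total_updates


# Example: a deposit operation updates two fields, but the test's
# assertion only inspects the balance, so the history update is unchecked.
tracker = StateTracker()
tracker.record_write("balance")
tracker.record_write("history")
tracker.record_assert_read("balance")
print(tracker.coverage())  # 0.5 -> the 'history' update is never checked
```

A low ratio like the 0.5 above is exactly the signal the paper is after: the test exercises the code (full code coverage) yet leaves part of the resulting state unverified.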


Keywords: state coverage, test adequacy metric, test oracle




Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Dries Vanoverberghe (1)
  • Jonathan de Halleux (2)
  • Nikolai Tillmann (2)
  • Frank Piessens (1)
  1. Katholieke Universiteit Leuven, Leuven, Belgium
  2. Microsoft Research, Redmond, USA
