
Verification Coverage for Combining Test and Proof

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10889)

Abstract

The V&V practices of safety-critical industries (e.g. avionics) currently rely on either unit testing or unit proof to verify that a function satisfies its low-level requirements, as required for compliance with the highest certification levels [26] (e.g. DO-178C level A for avionic software). In this context, the verification engineer must assess sufficient coverage of both the code (structural coverage) and the specification (functional coverage). However, test and proof share no common method for measuring structural coverage. In practice, this prevents the verification engineer from combining test and automatic proof to verify the low-level requirements of a single piece of code and thereby reduce the verification cost. This paper fills the gap between test and proof by introducing a new notion of verification coverage based on mutation coverage. It subsumes functional coverage and structural coverage for both unit testing and unit proof. Consequently, it allows the verification engineer to mix test tools and automatic provers in the verification process so as to reduce its cost: the more automation is used during verification, the fewer resources are spent to verify the program.
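To give the intuition behind this notion (a minimal illustrative sketch, not the paper's formal definition; the function min_of, its contract and the chosen mutation are hypothetical examples), consider a C function whose low-level requirement is expressed as an ACSL contract [7]. A mutant of the code contributes to verification coverage once at least one verification activity kills it: either a unit test whose oracle fails on the mutant, or an automatic proof attempt that can no longer discharge the contract on the mutated body.

/*@ requires \valid_read(a) && \valid_read(b);
  @ assigns \nothing;
  @ ensures \result == ((*a < *b) ? *a : *b);
  @*/
int min_of(const int *a, const int *b) {
  return (*a < *b) ? *a : *b;
}

/* Mutant obtained by relational operator replacement ('<' -> '>').
 * Killed by proof: an automatic prover (e.g. run through the WP plugin
 * of Frama-C [20]) can no longer discharge the 'ensures' clause on the
 * mutated body.
 * Killed by test: any unit test with *a != *b whose oracle checks the
 * same post-condition fails on the mutant.
 * A mutant killed by at least one of the two activities counts towards
 * verification coverage. */
int min_of_mutant(const int *a, const int *b) {
  return (*a > *b) ? *a : *b; /* now computes the maximum */
}

Since either activity may kill a mutant, tested and proved parts of the same code can be measured against one shared coverage objective.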

Keywords

Coverage criteria · Combining test and proof

Notes

Acknowledgment

The authors would like to thank the anonymous reviewers for their helpful comments and feedback. This work was partly supported by project VESSEDIA, which has received funding from the European Union's Horizon 2020 Research and Innovation Program under grant agreement No 731453. It was also partly supported by the European Union's involvement in Midi-Pyrénées through its Development Fund.

References

  1. Agrawal, H., DeMillo, R.A., Hathaway, B., Hsu, W., Hsu, W., Krauser, E.W., Martin, R.J., Mathur, A., Spafford, E.: Design of mutant operators for the C programming language. Technical report SERC-TR-41-P, Purdue University (1999)
  2. Ammann, P., Offutt, J.: Introduction to Software Testing. Cambridge University Press, Cambridge (2008)
  3. Artho, C., Biere, A.: Combined static and dynamic analysis. Electron. Notes Theor. Comput. Sci. 131, 3–14 (2005)
  4. Bardin, S., Delahaye, M., David, R., Kosmatov, N., Papadakis, M., Le Traon, Y., Marion, J.Y.: Sound and quasi-complete detection of infeasible test requirements. In: IEEE 8th International Conference on Software Testing, Verification and Validation, ICST 2015, pp. 1–10 (2015)
  5. Bardin, S., Chebaro, O., Delahaye, M., Kosmatov, N.: An all-in-one toolkit for automated white-box testing. In: Seidl, M., Tillmann, N. (eds.) TAP 2014. LNCS, vol. 8570, pp. 53–60. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-09099-3_4
  6. Bardin, S., Kosmatov, N., Cheynier, F.: Efficient leveraging of symbolic execution to advanced coverage criteria. In: IEEE 7th International Conference on Software Testing, Verification and Validation, ICST 2014, pp. 173–182 (2014)
  7. Baudin, P., Filliâtre, J.C., Marché, C., Monate, B., Moy, Y., Prevosto, V.: ACSL: ANSI/ISO C Specification Language. http://frama-c.com/acsl.html
  8. Bishop, P., Bloomfield, R., Cyra, L.: Combining testing and proof to gain high assurance in software: a case study. In: IEEE International Symposium on Software Reliability Engineering (ISSRE), pp. 248–257 (2013)
  9. Brahmi, A., Delmas, D., Essoussi, M.H., Randimbivololona, F., Atki, A., Marie, T.: Formalise to automate: deployment of a safe and cost-efficient process for avionics software. In: European Congress on Embedded Real Time Software and Systems (ERTSS 2018) (2018)
  10. Brown, D., Delseny, H., Hayhurst, K., Wiels, V.: Guidance for using formal methods in a certification context. In: European Congress on Embedded Real Time Software and Systems (ERTS 2010) (2010)
  11. Chockler, H., Kupferman, O., Vardi, M.Y.: Coverage metrics for formal verification. In: Geist, D., Tronci, E. (eds.) CHARME 2003. LNCS, vol. 2860, pp. 111–125. Springer, Heidelberg (2003). https://doi.org/10.1007/978-3-540-39724-3_11
  12. Dadeau, F., Giorgetti, A., Bouquet, F., Enderlin, I.: Contract-based testing for PHP with Praspel. J. Syst. Softw. 136, 209–222 (2018)
  13. Dadeau, F., Ledru, Y., du Bousquet, L.: Measuring a Java test suite coverage using JML specifications. Electron. Notes Theor. Comput. Sci. 190, 21–32 (2007)
  14. Delahaye, M., Kosmatov, N., Signoles, J.: Common specification language for static and dynamic analysis of C programs. In: Symposium on Applied Computing (SAC 2013), pp. 1230–1235. ACM (2013)
  15. Delamaro, M.E., Deng, L., Durelli, V.H.S., Li, N., Offutt, J.: Experimental evaluation of SDL and One-Op mutation for C. In: IEEE 7th International Conference on Software Testing, Verification and Validation, ICST 2014, pp. 203–212 (2014)
  16. DeMillo, R.A., Lipton, R.J., Sayward, F.G.: Hints on test data selection: help for the practicing programmer. IEEE Comput. 11, 34–41 (1978)
  17. Filliâtre, J.C.: Deductive software verification. Int. J. Softw. Tools Technol. Transf. 13, 397 (2011)
  18. Floyd, R.W.: Assigning meanings to programs. In: Colburn, T.R., Fetzer, J.H., Rankin, T.L. (eds.) Program Verification, pp. 65–81. Springer, Dordrecht (1993). https://doi.org/10.1007/978-94-011-1793-7_4
  19. Hoare, C.A.R.: An axiomatic basis for computer programming. In: Gries, D. (ed.) Programming Methodology, pp. 89–100. Springer, New York (1978). https://doi.org/10.1007/978-1-4612-6315-9_9
  20. Kirchner, F., Kosmatov, N., Prevosto, V., Signoles, J., Yakobowski, B.: Frama-C: a software analysis perspective. Formal Aspects Comput. 27, 573–609 (2015)
  21. Kiss, B., Kosmatov, N., Pariente, D., Puccetti, A.: Combining static and dynamic analyses for vulnerability detection: illustration on Heartbleed. In: Piterman, N. (ed.) HVC 2015. LNCS, vol. 9434, pp. 39–50. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-26287-1_3
  22. Kosmatov, N., Signoles, J.: Runtime assertion checking and its combinations with static and dynamic analyses. In: Seidl, M., Tillmann, N. (eds.) TAP 2014. LNCS, vol. 8570, pp. 165–168. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-09099-3_13
  23. Marcozzi, M., Bardin, S., Delahaye, M., Kosmatov, N., Prevosto, V.: Taming coverage criteria heterogeneity with LTest. In: IEEE 10th International Conference on Software Testing, Verification and Validation, ICST 2017, pp. 500–507 (2017)
  24. Marcozzi, M., Delahaye, M., Bardin, S., Kosmatov, N., Prevosto, V.: Generic and effective specification of structural test objectives. In: IEEE 10th International Conference on Software Testing, Verification and Validation, ICST 2017, pp. 436–441 (2017)
  25. Morell, L.J.: A theory of fault-based testing. IEEE Trans. Softw. Eng. 16, 844–857 (1990)
  26. Moy, Y., Ledinot, E., Delseny, H., Wiels, V., Monate, B.: Testing or formal verification: DO-178C alternatives and industrial experience. IEEE Softw. 30, 50–57 (2013)
  27. Offutt, A.J., Voas, J., Payne, J.: Mutation operators for Ada. Technical report ISSE-TR-96-09 (1996)
  28. Peters, D.K., Parnas, D.L.: Using test oracles generated from program documentation. IEEE Trans. Softw. Eng. 24, 161–173 (1998)
  29. RTCA and EUROCAE: DO-178C, Software Considerations in Airborne Systems and Equipment Certification. RTCA (2011)
  30. RTCA SC-205 and EUROCAE Working Group 71: Formal Methods Supplement to DO-178C and DO-278A. RTCA (2011). https://books.google.fr/books?id=nYhyMwEACAAJ
  31. Signoles, J., Kosmatov, N., Vorobyov, K.: E-ACSL, a runtime verification tool for safety and security of C programs (tool paper). In: International Workshop on Competitions, Usability, Benchmarks, Evaluation, and Standardisation for Runtime Verification Tools (RV-CuBES 2017), pp. 164–173 (2017)
  32. Software and Engineering Standards Committee: IEEE Standard for Software and System Test Documentation. IEEE Std 829-2008, pp. 1–118 (2008)
  33. Williams, N., Marre, B., Mouy, P., Roger, M.: PathCrawler: automatic generation of path tests by combining static and dynamic analysis. In: Dal Cin, M., Kaâniche, M., Pataricza, A. (eds.) EDCC 2005. LNCS, vol. 3463, pp. 281–292. Springer, Heidelberg (2005). https://doi.org/10.1007/11408901_21

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. CEA MiPy, Labège, France
  2. CEA LIST, Software Reliability and Security Laboratory, Gif-sur-Yvette, France
  3. ONERA, Palaiseau, France
