Software Validation via Model Animation

  • Aaron M. Dutle
  • César A. Muñoz
  • Anthony J. Narkawicz
  • Ricky W. Butler
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9154)


This paper explores a new approach to validating software implementations that have been produced from formally verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines whether they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating-point errors do not greatly affect correctness and safety properties.
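The comparison loop described in the abstract can be sketched in a few lines. The sketch below is illustrative only and assumes hypothetical stand-ins for the two evaluators: in the actual setup, `model_output` would invoke the PVSio animation of the formal model and `software_output` would call the software implementation under validation; here both are placeholder kinematic computations so the sketch is self-contained.

```python
import math
import random

def model_output(ground_speed, time):
    # Stand-in for animating the formal model (e.g., via PVSio):
    # distance traveled along a simple kinematic trajectory.
    return ground_speed * time

def software_output(ground_speed, time):
    # Stand-in for the software implementation under validation.
    return ground_speed * time

def validate(num_cases=1000, tolerance=1e-9, seed=42):
    """Evaluate both systems on random inputs; return the cases that
    disagree beyond the given tolerance."""
    rng = random.Random(seed)
    failures = []
    for _ in range(num_cases):
        gs = rng.uniform(0.0, 600.0)   # ground speed (knots), range is illustrative
        t = rng.uniform(0.0, 3600.0)   # elapsed time (seconds)
        expected = model_output(gs, t)
        actual = software_output(gs, t)
        if not math.isclose(expected, actual,
                            rel_tol=tolerance, abs_tol=tolerance):
            failures.append((gs, t, expected, actual))
    return failures
```

An empty result from `validate()` indicates agreement (up to tolerance) on the sampled inputs; any returned tuple is a concrete test point exposing a divergence between model and implementation.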







Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Aaron M. Dutle (1)
  • César A. Muñoz (1)
  • Anthony J. Narkawicz (1)
  • Ricky W. Butler (1)

  1. NASA Langley Research Center, Hampton, USA
