Formal Methods and their Role in the Certification of Critical Systems

  • John Rushby

Abstract

This article describes the rationale for formal methods and considers the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications. It suggests factors for consideration when formal methods are offered in support of certification in a context such as DO-178B (the guidelines for software used on board civil aircraft) [40]. The presentation is intended for those to whom these topics are new. A more technical discussion of formal methods is available as a technical report [42].

References

  [1] Biddle W. Barons of the Sky: From Early Flight to Strategic Warfare, the Story of the American Aerospace Industry. Simon and Schuster, New York, NY, 1991. Paperback edition by Henry Holt, 1993.
  [2] Boeing Commercial Airplane Group. Statistical Summary of Commercial Jet Aircraft Accidents, Worldwide Operations, 1959–1994. Seattle, WA, Mar. 1995. Published annually by Boeing Airplane Safety Engineering (B-210B).
  [3] Brazendale J, and Jeffs A R. Out of control: Failures involving control systems. High Integrity Systems 1, 1 (1994), 67–72.
  [4] Burns R W. Genius at work. IEE Review 39, 5 (Sept. 1993), 187–189.
  [5] Butler R W, and Finelli G B. The infeasibility of experimental quantification of life-critical software reliability. IEEE Transactions on Software Engineering 19, 1 (Jan. 1993), 3–12.
  [6] Clarke E, Grumberg O, and Long D. Verification tools for finite-state concurrent systems. In A Decade of Concurrency (REX Workshop, Mook, The Netherlands, June 1994), J. W. de Bakker, W. P. de Roever, and G. Rozenberg, Eds., vol. 803 of Lecture Notes in Computer Science, Springer-Verlag, pp. 124–175.
  [7] Clarke E M, Grumberg O, Hiraishi H, Jha S, Long D E, McMillan K L, and Ness L A. Verification of the Futurebus+ cache coherence protocol. Formal Methods in System Design 6, 2 (Mar. 1995), 217–232.
  [8] Cooper Jr. H S F. The Evening Star: Venus Observed. Farrar Straus Giroux, New York, NY, 1993.
  [9] Cyrluk D, Rajan S, Shankar N, and Srivas M K. Effective theorem proving for hardware verification. In Theorem Provers in Circuit Design (TPCD '94) (Bad Herrenalb, Germany, Sept. 1994), R. Kumar and T. Kropf, Eds., vol. 910 of Lecture Notes in Computer Science, Springer-Verlag, pp. 203–222.
  [10] Dill D L, Drexler A J, Hu A J, and Yang C H. Protocol verification as a hardware design aid. In 1992 IEEE International Conference on Computer Design: VLSI in Computers and Processors (Cambridge, MA, Oct. 11–14, 1992), IEEE Computer Society, pp. 522–525.
  [11] Dornheim M A. X-31 flight tests to explore combat agility to 70 deg. AOA. Aviation Week and Space Technology (Mar. 11, 1991), 38–41.
  [12] Eckhardt D E, Caglayan A K, Knight J C, Lee L D, McAllister D F, Vouk M A, and Kelly J P J. An experimental evaluation of software redundancy as a strategy for improving reliability. IEEE Transactions on Software Engineering 17, 7 (July 1991), 692–702.
  [13] Federal Aviation Administration. System Design and Analysis, June 21, 1988. Advisory Circular 25.1309-1A.
  [14] Federal Aviation Administration Technical Center. Digital Systems Validation Handbook—Volume III. Atlantic City, NJ. Forthcoming.
  [15] Fielder J H, and Birsch D, Eds. The DC-10 Case: A Case Study in Applied Ethics, Technology, and Society. State University of New York Press, 1992.
  [16] Gerhart S L, and Yelowitz L. Observations of fallibility in modern programming methodologies. IEEE Transactions on Software Engineering SE-2, 3 (Sept. 1976), 195–207.
  [17] Guiho G, and Hennebert C. SACEM software validation. In 12th International Conference on Software Engineering (Nice, France, Mar. 1990), IEEE Computer Society, pp. 186–191.
  [18] Har'El Z, and Kurshan R P. Software for analytical development of communications protocols. AT&T Technical Journal 69, 1 (Jan./Feb. 1990), 45–59.
  [19] Hayes I J, and Jones C B. Specifications are not (necessarily) executable. IEE/BCS Software Engineering Journal 4, 6 (Nov. 1989), 320–338.
  [20] Hecht H. Rare conditions: An important cause of failures. In COMPASS '93 (Proceedings of the Eighth Annual Conference on Computer Assurance) (Gaithersburg, MD, June 1993), IEEE Washington Section, pp. 81–85.
  [21] Jones C B. Systematic Software Development Using VDM, second ed. Prentice Hall International Series in Computer Science. Prentice Hall, Hemel Hempstead, UK, 1990.
  [22] Joyce J. Multi-Level Verification of Microprocessor-Based Systems. PhD thesis, University of Cambridge, Dec. 1989.
  [23] Kasuda R, and Packard D S. Spacecraft fault tolerance: The Magellan experience. In Proceedings of the Annual Rocky Mountain Guidance and Control Conference (Keystone, CO, Feb. 1993), R. D. Culp and G. Bickley, Eds., vol. 81 of Advances in the Astronautical Sciences, American Astronautical Society, pp. 249–267.
  [24] Kelly J C, Sherif J S, and Hops J. An analysis of defect densities found during software inspections. Journal of Systems Software 17 (1992), 111–117.
  [25] Keutzer K. The need for formal verification in hardware design and what formal verification has not done for me lately. In Proceedings of the 1991 International Workshop on the HOL Theorem Proving System and its Applications (Davis, CA, Aug. 1991), P. Windley, Ed., IEEE Computer Society, pp. 77–86.
  [26] Knight J C, and Leveson N G. An experimental evaluation of the assumption of independence in multiversion programming. IEEE Transactions on Software Engineering SE-12, 1 (Jan. 1986), 96–109.
  [27] Lamport L, and Melliar-Smith P M. Synchronizing clocks in the presence of faults. Journal of the ACM 32, 1 (Jan. 1985), 52–78.
  [28] Lamport L, and Merz S. Specifying and verifying fault-tolerant systems. In Formal Techniques in Real-Time and Fault-Tolerant Systems (Lübeck, Germany, Sept. 1994), H. Langmaack, W.-P. de Roever, and J. Vytopil, Eds., vol. 863 of Lecture Notes in Computer Science, Springer-Verlag, pp. 41–76.
  [29] Lamport L, Shostak R, and Pease M. The Byzantine Generals problem. ACM Transactions on Programming Languages and Systems 4, 3 (July 1982), 382–401.
  [30] Leveson N G. Safeware: System Safety and Computers. Addison-Wesley, 1995.
  [31] Leveson N G, and Turner C S. An investigation of the Therac-25 accidents. IEEE Computer 26, 7 (July 1993), 18–41.
  [32] Lincoln P, and Rushby J. A formally verified algorithm for interactive consistency under a hybrid fault model. In Fault Tolerant Computing Symposium 23 (Toulouse, France, June 1993), IEEE Computer Society, pp. 402–411.
  [33] Lloyd E, and Tye W. Systematic Safety: Safety Assessment of Aircraft Systems. Civil Aviation Authority, London, England, 1982. Reprinted 1992.
  [34] Lutz R R. Analyzing software requirements errors in safety-critical embedded systems. In IEEE International Symposium on Requirements Engineering (San Diego, CA, Jan. 1993), pp. 126–133.
  [35] Mackall D A. Development and flight test experiences with a flight-crucial digital control system. NASA Technical Paper 2857, NASA Ames Research Center, Dryden Flight Research Facility, Edwards, CA, 1988.
  [36] Mellor P. CAD: Computer-aided disaster. High Integrity Systems 1, 2 (1994), 101–156.
  [37] Miller S P, and Srivas M. Formal verification of the AAMP5 microprocessor: A case study in the industrial use of formal methods. In WIFT '95: Workshop on Industrial-Strength Formal Specification Techniques (Boca Raton, FL, 1995), IEEE Computer Society, pp. 2–16.
  [38] Owre S, Rushby J, Shankar N, and von Henke F. Formal verification for fault-tolerant architectures: Prolegomena to the design of PVS. IEEE Transactions on Software Engineering 21, 2 (Feb. 1995), 107–125.
  [39] Parnas D L, and Weiss D M. Active design reviews: Principles and practices. In 8th International Conference on Software Engineering (London, UK, Aug. 1985), IEEE Computer Society, pp. 132–136.
  [40] Requirements and Technical Concepts for Aviation. DO-178B: Software Considerations in Airborne Systems and Equipment Certification. Washington, DC, Dec. 1992. This document is known as EUROCAE ED-12B in Europe.
  [41] Rumbaugh J, Blaha M, Premerlani W, Eddy F, and Lorensen W. Object-Oriented Modeling and Design. Prentice Hall, Englewood Cliffs, NJ, 1991.
  [42] Rushby J. Formal methods and the certification of critical systems. Tech. Rep. SRI-CSL-93-7, Computer Science Laboratory, SRI International, Menlo Park, CA, Dec. 1993. Also issued under the title Formal Methods and Digital Systems Validation for Airborne Systems as NASA Contractor Report 4551, December 1993.
  [43] Rushby J, and von Henke F. Formal verification of algorithms for critical systems. IEEE Transactions on Software Engineering 19, 1 (Jan. 1993), 13–23.
  [44] Spivey J M, Ed. The Z Notation: A Reference Manual, second ed. Prentice Hall International Series in Computer Science. Prentice Hall, Hemel Hempstead, UK, 1993.
  [45] UK Ministry of Defence. Interim Defence Standard 00-55: The procurement of safety critical software in defence equipment, Apr. 1991. Part 1, Issue 1: Requirements; Part 2, Issue 1: Guidance.
  [46] Vincenti W G. What Engineers Know and How They Know It: Analytical Studies from Aeronautical History. Johns Hopkins Studies in the History of Technology. The Johns Hopkins University Press, Baltimore, MD, 1990.

Copyright information

© Springer-Verlag London Limited 1997

Authors and Affiliations

  • John Rushby
    1. Computer Science Laboratory, SRI International, Menlo Park, USA
