
Evaluation of software testing metrics for NASA's Mission Control Center


Abstract

Software metrics are used to evaluate the software development process and the quality of the resulting product. We used five metrics during the testing phase of the Shuttle Mission Control Center (MCC) Upgrade (MCCU) at the National Aeronautics and Space Administration's (NASA) Johnson Space Center. All but one metric provided useful information. Based on our experience, we recommend using metrics during the test phase of software development and propose additional candidate metrics for further study.
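The abstract does not enumerate the five metrics, but the paper draws on the software reliability modeling literature (e.g., Musa-style reliability growth models), a common family of testing-phase metrics. The sketch below is a minimal illustration of that general idea, not the paper's actual method: it fits a Goel-Okumoto style mean-value function m(t) = a(1 - e^(-bt)) to cumulative defect counts observed during test, where the weekly counts are invented for demonstration and the model choice is an assumption.

```python
# Hypothetical illustration of a testing-phase reliability metric:
# fit m(t) = a * (1 - exp(-b * t)) to cumulative defect counts and
# estimate how many defects remain. Data below are made up.
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    """Expected cumulative defects discovered by test time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical weekly cumulative defect counts from a test phase.
weeks = np.arange(1, 11, dtype=float)
cum_defects = np.array([12, 21, 28, 34, 38, 41, 43, 45, 46, 47],
                       dtype=float)

# Least-squares fit; p0 is a rough initial guess for (a, b).
(a_hat, b_hat), _ = curve_fit(mean_value, weeks, cum_defects,
                              p0=(50.0, 0.1))

remaining = a_hat - cum_defects[-1]  # estimated latent defects
print(f"estimated total defects a = {a_hat:.1f}")
print(f"detection rate b = {b_hat:.3f} per week")
print(f"estimated defects remaining = {remaining:.1f}")
```

A metric like the estimated number of remaining defects gives test managers a quantitative basis for the release decision, which is the kind of use the paper evaluates.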




Cite this article

Stark, G.E., Durst, R.C. & Pelnik, T.M. Evaluation of software testing metrics for NASA's Mission Control Center. Software Qual J 1, 115–132 (1992). https://doi.org/10.1007/BF01845743
