
An automated software reliability prediction system for safety critical software


Abstract

Software reliability is one of the most important software quality indicators. It is concerned with the probability that the software can execute without any unintended behavior in a given environment. In previous research we developed the Reliability Prediction System (RePS) methodology to predict the reliability of safety critical software, such as that used in the nuclear industry. A RePS relates software engineering measures to software reliability through various models, and RePSs based on Extended Finite State Machine (EFSM) models and on fault data collected through various software engineering measures were found to possess the most satisfactory prediction capability. In this research the EFSM-based RePS methodology is improved and implemented in a tool called the Automated Reliability Prediction System (ARPS). The features of the ARPS tool are introduced through a simple case study. An experiment with human subjects was also conducted to evaluate the usability of the tool, and the results demonstrate that ARPS can indeed help analysts apply the EFSM-based RePS methodology with fewer errors and lower error criticality.
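For illustration, the sketch below shows one way the per-defect execution (E_i), infection (I_i), and propagation (P_i) probabilities produced by a PIE-style analysis (see the acronym list in the Appendix) could be combined into a per-demand reliability estimate from the collected fault data. This is a minimal sketch under an independence assumption between defects; the class and function names and the numerical values are hypothetical and do not reflect the actual ARPS implementation.

```python
from dataclasses import dataclass

@dataclass
class Defect:
    execution: float    # E_i: probability the defective code is executed on a demand
    infection: float    # I_i: probability execution corrupts the program state
    propagation: float  # P_i: probability the corrupted state propagates to an output

def failure_probability_per_demand(defects):
    """Probability that at least one defect causes an observable failure,
    assuming defects trigger independently of one another."""
    p_no_failure = 1.0
    for d in defects:
        p_no_failure *= 1.0 - d.execution * d.infection * d.propagation
    return 1.0 - p_no_failure

# Hypothetical fault data for three defects identified through software engineering measures.
defects = [Defect(0.02, 0.5, 0.9), Defect(0.001, 0.8, 1.0), Defect(0.05, 0.1, 0.3)]
p_fail = failure_probability_per_demand(defects)
print(f"Per-demand failure probability: {p_fail:.6f}")
print(f"Per-demand reliability:         {1 - p_fail:.6f}")
```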




Acknowledgments

This paper was prepared as an account of work sponsored by an agency of the U.S. Government. Neither the U.S. Government nor any agency thereof, nor any of their employees, makes any warranty, expressed or implied, or assumes any legal liability or responsibility for any third party’s use, or the results of such use, of any information, apparatus, product, or process disclosed in this report, or represents that its use by such third party would not infringe privately owned rights. The views expressed in this paper are not necessarily those of the U.S. Nuclear Regulatory Commission. We are grateful to Kevin Smearsoll and Boyuan Li for supporting this research.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Xiang Li.

Additional information

Communicated by: Nachiappan Nagappan

Appendix. List of Acronyms Used in this Paper

A: Solve a test problem using the ARPS tool

ARPS: Automated Reliability Prediction System

BBN: Bayesian Belief Networks

EFSM: Extended Finite State Machine

E_i: Execution probability of the i-th defect

EI: Error index

F: False

H: Null hypothesis

H_A: Alternative hypothesis

HLEFSM: High Level Extended Finite State Machine

IAP: Incorrect/Ambiguous Predicate

I_i: Infection probability of the i-th defect

INP: Information Not Post-processed

IP: Information Post-processed

IV: Internal Variables

LLEFSM: Low Level Extended Finite State Machine

M: Manually solve a test problem

N_E: Number of errors

OP: Operational Profile

P: Set of Predicates; Pressure

P_C: The correct predicate

pdf: Probability density function

P_i: Propagation probability of the i-th defect

PIE: Propagation, Infection and Execution analysis

pos: Position

P_O: The original predicate

Pres: Pressure

prev: Previous

Prob: Probability

Re: Reliability

RePS: Reliability Prediction System

S: Set of States

Sat: Satisfactory

SDD: Software Design Document

SI: State Initialized

SNI: State Not Initialized

SRGM: Software Reliability Growth Model

SRS: Software Requirement Specification

SUS's: Software under study's

T: Set of Transitions; Temperature; True

T: Time

T1: Test #1

T2: Test #2

Temp: Temperature

V1_C: Valve #1 closed

V1_O: Valve #1 opened

V2_C: Valve #2 closed

V2_O: Valve #2 opened

V3_C: Valve #3 closed

V3_O: Valve #3 opened

VCS: Valve Control System

Γ: Output Variables

Σ: Input Variables

Cite this article

Li, X., Mutha, C. & Smidts, C.S. An automated software reliability prediction system for safety critical software. Empir Software Eng 21, 2413–2455 (2016). https://doi.org/10.1007/s10664-015-9412-6
