Software Reliability of Complex Systems Focus for Intelligent Vehicles

  • György Schuster
  • Daniel Tokody
  • Imre János Mezei
Conference paper
Part of the Lecture Notes in Mechanical Engineering book series (LNME)


The use of software has become part of our everyday life over the last few decades. Software is widely used in areas such as national defence, aeronautics and astronautics, medicine and transport. A modern high-end car's engine control unit contains 100 million lines of code. In comparison, the Space Shuttle needed about 400,000, the F-22 fighter jet fewer than 2 million, the Boeing 787 about 14 million, and Facebook more than 60 million lines of code to function. In safety-critical systems, such as those operating in vehicles, even a small error can lead to devastating consequences. There have been several examples in recent years of automotive recalls made necessary by dangerous software, and cases in which such errors presumably caused fatal accidents. Software reliability is defined as the probability of error-free operation of software for a specified period of time in a well-defined environment. The use of software is inevitable: it can be found in every vehicle, controlling almost everything. Software can therefore be considered a critical success factor, and it has a strong effect on the reliability of the whole system. Software systems are becoming more and more complex, and it is a known fact that a more complex system is more likely to contain errors. The most difficult problem is that traditional reliability methods cannot be applied. For example, the fatigue and wear of mechanical parts, or the characteristics of lubrication systems, can be calculated quite well, since we have sufficient prior knowledge of their behaviour. Unfortunately, in the case of software systems this knowledge is missing. This paper deals with the question of software reliability: the first part surveys the problems, and the second part presents some mathematical tools for calculating working probability.
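The abstract's definition of software reliability, the probability of error-free operation over a specified period in a well-defined environment, can be illustrated with the simplest textbook model. The sketch below assumes a constant failure rate λ (the exponential model, R(t) = e^(−λt)); this is a generic illustration, not the specific model derived in the paper:

```python
import math

def reliability(failure_rate: float, t: float) -> float:
    """Probability of error-free operation over [0, t], assuming a
    constant failure rate (exponential model): R(t) = exp(-lambda * t).
    This is an illustrative assumption, not the paper's own model."""
    if failure_rate < 0 or t < 0:
        raise ValueError("failure rate and time must be non-negative")
    return math.exp(-failure_rate * t)

# Example: lambda = 0.001 failures/hour over a 100-hour mission
print(round(reliability(0.001, 100.0), 4))  # -> 0.9048
```

For software, λ is typically not known in advance, which is exactly the missing prior knowledge the abstract points to; it must be estimated from observed failure data, which is what software reliability growth models attempt.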


Keywords: Software reliability · Complex systems · Safety-critical systems · Intelligent vehicles



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • György Schuster (email author)¹
  • Daniel Tokody¹
  • Imre János Mezei¹

  1. Doctoral School on Safety and Security Sciences, Óbuda University, Budapest, Hungary
