Tim A. Majchrzak
Part of the SpringerBriefs in Information Systems book series (BRIEFSINORMAT)


In the following sections, this book's topic is introduced, its objectives and research questions are discussed, and the course of action is sketched. First, the setting for software testing is described. Since the advent of computers, the available computational power and the capacity of memory and storage systems have grown exponentially. At the same time, computer systems have shrunk in size and their prices have plummeted. Yet it is software that makes this vast computational power usable: software enables humans to apply general-purpose computers to distinct tasks in business, science, communications, specialized fields such as healthcare, and entertainment.





Copyright information

© Springer-Verlag Berlin Heidelberg  2012

Authors and Affiliations

  1. Institut für Wirtschaftsinformatik, Westfälische Wilhelms-Universität, Münster, Germany
