Reproducibility of Software Bugs

Basic Concepts and Automatic Classification
  • Flavio Frattini
  • Roberto Pietrantuono
  • Stefano Russo
Chapter

Abstract

Understanding software bugs and their effects is important in several engineering activities, including testing, debugging, and the design of fault containment or tolerance methods. Dealing with hard-to-reproduce failures requires a deep comprehension of the mechanisms leading from bug activation to software failure. This chapter surveys taxonomies and recent studies about bugs from the perspective of their reproducibility, providing insights into the process of bug manifestation and the factors influencing it. These insights are based on the analysis of thousands of bug reports of a widely used open-source software system, namely MySQL Server. Bug reports are automatically classified according to their reproducibility characteristics, yielding figures on the proportion of hard-to-reproduce bugs, their features, and their evolution across releases.
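As a purely illustrative sketch of what automatic classification of bug reports by reproducibility characteristics could look like, the snippet below trains a standard text classifier over free-text report descriptions. The "bohrbug"/"mandelbug" labels, the toy report texts, and the choice of a TF-IDF plus naive Bayes pipeline (via scikit-learn) are assumptions for illustration only; the chapter's actual feature set and classification method are not reproduced here.

```python
# Minimal sketch of automatic bug-report classification, assuming
# scikit-learn is available. Labels and report texts are hypothetical
# placeholders, not the chapter's data or method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical training data: free-text bug reports with reproducibility labels.
reports = [
    "crash occurs every time the query is executed",
    "server hangs intermittently under heavy concurrent load",
    "wrong result returned deterministically for this SELECT",
    "failure appears only after days of uptime, hard to reproduce",
]
labels = ["bohrbug", "mandelbug", "bohrbug", "mandelbug"]

# TF-IDF features feeding a multinomial naive Bayes classifier.
model = make_pipeline(TfidfVectorizer(stop_words="english"), MultinomialNB())
model.fit(reports, labels)

# Classify a new, unseen report.
print(model.predict(["deadlock shows up sporadically when two clients race"]))
```

In a setting like this, each report would be labeled by reading its description and resolution notes, and the trained model would then be applied to the remaining reports to estimate proportions of each bug class per release.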

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Flavio Frattini (1)
  • Roberto Pietrantuono (1)
  • Stefano Russo (1)
  1. Dipartimento di Ingegneria Elettrica e delle Tecnologie dell'Informazione, Università degli Studi di Napoli Federico II, Napoli, Italy
