
On the Assessment of Security and Performance Bugs in Chromium Open-Source Project

  • Joseph Imseis
  • Costain Nachuma
  • Shaikh Arifuzzaman (corresponding author)
  • Zakirul Alam Bhuiyan
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1123)

Abstract

An individual working in software development should have a fundamental understanding of how different types of bugs impact various aspects of a project; this knowledge helps improve the quality of the software being built. The problem, however, is that previous research typically treats all types of bugs as similar when analyzing aspects of software quality (e.g., predicting the time to fix a bug) or concentrates on a particular bug type (e.g., performance bugs) with little comparison to other types. In this paper, we examine how different types of bugs, specifically performance and security bugs, differ from one another. Our study is based on a previous study by Zaman et al. [1] performed on the Firefox project; instead of Firefox, we work with the Chromium project, the open-source base of the Chrome web browser. In our case study, we find that security bugs are fixed faster than performance bugs and that the developers assigned to security bugs typically have more experience than those assigned to performance bugs. Our findings emphasize the importance of considering different types of bugs in software quality research and practice.
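To make the kind of comparison described above concrete, the following minimal sketch (not the authors' actual pipeline; the CSV layout, column names, and file name are assumptions) shows how time-to-fix could be aggregated per bug category from an exported issue-tracker dataset.

```python
# Illustrative sketch only: compare median time-to-fix for security vs.
# performance bugs from a hypothetical bug-tracker CSV export with
# columns: id, label, opened, closed (ISO-8601 timestamps).
import csv
from datetime import datetime
from statistics import median

def load_fix_times(path):
    """Return {label: [days-to-fix, ...]} for security and performance bugs."""
    fix_times = {"security": [], "performance": []}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            label = row["label"].strip().lower()
            if label not in fix_times or not row["closed"]:
                continue  # skip other labels and still-open bugs
            opened = datetime.fromisoformat(row["opened"])
            closed = datetime.fromisoformat(row["closed"])
            fix_times[label].append((closed - opened).days)
    return fix_times

if __name__ == "__main__":
    times = load_fix_times("chromium_bugs.csv")  # hypothetical export file
    for label, days in times.items():
        if days:
            print(f"{label}: n={len(days)}, median fix time = {median(days)} days")
```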

Keywords

Assessment of security and performance bugs · Data mining · Software repository · Chromium project · Open-source project

References

  1. Zaman, S., Adams, B., Hassan, A.E.: Security versus performance bugs: a case study on Firefox. In: Proceedings of the 8th Working Conference on Mining Software Repositories (2011)
  2. Shihab, E., et al.: Predicting re-opened bugs: a case study on the Eclipse project. In: Proceedings of the 17th Working Conference on Reverse Engineering, WCRE 2010, Washington, DC, USA, pp. 249–258 (2010)
  3. Erlikh, L.: Leveraging legacy system dollars for e-business. IT Prof. 2(3), 17–23 (2000)
  4. Jaafar, F., Lozano, A., Gueheneuc, Y.-G., Mens, K.: On the analysis of co-occurrence of anti-patterns and clones. In: 2017 IEEE International Conference on Software Quality, Reliability and Security (QRS) (2017)
  5. Panjer, L.D.: Predicting Eclipse bug lifetimes. In: Proceedings of the 4th International Workshop on Mining Software Repositories, MSR 2007, Washington, DC, USA (2007)
  6. Weiss, C., Premraj, R., Zimmermann, T., Zeller, A.: How long will it take to fix this bug? In: Proceedings of the 4th International Workshop on Mining Software Repositories, MSR 2007, Washington, DC, USA (2007)
  7. Kim, S., Whitehead Jr., E.J.: How long did it take to fix bugs? In: Proceedings of the 3rd International Workshop on Mining Software Repositories. ACM, New York (2006)
  8. Anvik, J., Hiew, L., Murphy, G.C.: Who should fix this bug? In: Proceedings of the 28th International Conference on Software Engineering, ICSE 2006, New York, NY, USA, pp. 361–370 (2006)
  9. Apache Hadoop. https://hadoop.apache.org/. Accessed 30 Apr 2019
  10. Ackerman, M.S., Halverson, C.: Considering an organization's memory. In: Proceedings of the 1998 ACM Conference on Computer Supported Cooperative Work, CSCW 1998, pp. 39–48. ACM, New York (1998)
  11. Draft Standard for IEEE Standard Classification for Software Anomalies. IEEE Unapproved Draft Std P1044/D00003, February 2009
  12. Gegick, M., Rotella, P., Xie, T.: Identifying security bug reports via text mining: an industrial case study. In: Proceedings of the 7th International Workshop on Mining Software Repositories, pp. 11–20, May 2010

Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  • Joseph Imseis (1)
  • Costain Nachuma (1)
  • Shaikh Arifuzzaman (1, 2), corresponding author
  • Zakirul Alam Bhuiyan (3)
  1. Computer Science Department, University of New Orleans, New Orleans, USA
  2. Big Data and Scalable Computing Research (BDSC), UNO, New Orleans, USA
  3. Computer Science Department, Fordham University, Bronx, USA
