Quality Rule Violations in SharePoint Applications: An Empirical Study in Industry

  • Apostolos Ampatzoglou (corresponding author)
  • Paris Avgeriou
  • Thom Koenders
  • Pascal van Alphen
  • Ioannis Stamelos
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10027)

Abstract

In this paper, we focus on source code quality assessment for SharePoint applications. SharePoint is a powerful framework for developing software by combining imperative and declarative programming. In particular, we present an industrial case study, conducted at a software consulting and development company in the Netherlands, that aimed at identifying the most common SharePoint quality rule violations and assessing their severity. The results indicate that the most frequent rule violations occur in the JavaScript part of the applications, and that the most severe ones are related to correctness, security, and deployment. These results can be exploited by both researchers, as future research directions, and practitioners, to inform the quality assurance process.
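
For illustration, the sketch below shows the kind of JavaScript rule violation such an assessment typically flags. This is a minimal, hypothetical example (the renderStatus functions and the flagged issues are our own illustration, not rules or code reported in the study), assuming a page script that reads list data through the SharePoint JavaScript object model (SP.ListItem).

    // Hypothetical SharePoint page script with two commonly flagged issues:
    // (1) an implicit global variable (a correctness violation), and
    // (2) unescaped insertion of list data into the page (a security violation).
    function renderStatus(listItem) {
      status = listItem.get_item('Status');        // missing 'var': leaks a global
      document.getElementById('statusPanel')
              .innerHTML = status;                 // item data rendered as markup: XSS-prone
    }

    // A revised version that a typical rule set would accept:
    function renderStatusSafe(listItem) {
      var status = listItem.get_item('Status');    // locally scoped
      document.getElementById('statusPanel')
              .textContent = status;               // rendered as plain text, not markup
    }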

Keywords

Quality assessment · Defect prediction · SharePoint

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Apostolos Ampatzoglou (1)
  • Paris Avgeriou (1)
  • Thom Koenders (2)
  • Pascal van Alphen (2)
  • Ioannis Stamelos (3)
  1. Department of Computer Science, University of Groningen, Groningen, Netherlands
  2. SharePoint Department, Capgemini Netherlands, Utrecht, Netherlands
  3. Department of Computer Science, Aristotle University, Thessaloniki, Greece
