
Predicting Software Metrics at Design Time

  • Wolfgang Holz
  • Rahul Premraj
  • Thomas Zimmermann
  • Andreas Zeller
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5089)

Abstract

How do problem domains impact software features? We mine software code bases to relate problem domains (characterized by imports) to code features such as complexity, size, or quality. The resulting predictors take the specific imports of a component and predict its size, complexity, and quality metrics. In an experiment involving 89 plug-ins of the ECLIPSE project, we found good prediction accuracy for most metrics. Since the predictors rely only on import relationships, and since these are available at design time, our approach allows for early estimation of crucial software metrics.
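The keywords below point to support vector machines as the learning method. As a minimal sketch of the import-based prediction idea, assuming Python with scikit-learn (the paper itself mined Java/ECLIPSE plug-ins; the package names, training data, and model settings here are hypothetical illustrations, not the authors' pipeline):

```python
# Sketch of the core idea: map a component's imports to a predicted
# code metric (here: size in lines of code). Hypothetical example,
# assuming Python with scikit-learn.
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import SVR

def extract_imports(java_source: str) -> str:
    """Return the imported package names of a Java file as one string."""
    return " ".join(re.findall(r"^import\s+([\w.]+);", java_source, re.MULTILINE))

# Hypothetical training data: the imports of known components plus a
# measured metric for each.
train_sources = [
    "import org.eclipse.swt.widgets.Button;\nimport org.eclipse.swt.SWT;",
    "import java.util.List;\nimport java.util.Map;",
]
train_sizes = [1200.0, 300.0]  # e.g., lines of code

# One binary feature per imported package: "does the component import X?"
vectorizer = CountVectorizer(binary=True, token_pattern=r"[\w.]+")
X_train = vectorizer.fit_transform(extract_imports(s) for s in train_sources)

# Support vector regression from import features to the target metric.
model = SVR(kernel="linear")
model.fit(X_train, train_sizes)

# At design time, only the planned imports of a new component are known.
new_component = "import org.eclipse.swt.widgets.Button;"
X_new = vectorizer.transform([extract_imports(new_component)])
print("predicted size:", model.predict(X_new)[0])
```

Because the only inputs are import relationships, the same feature vector could be assembled from a design document or interface specification before any implementation exists, which is what makes prediction at design time possible.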

Keywords

Support Vector Machine · Output Feature · Design Time · Software Metrics · Code Metrics

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Wolfgang Holz (1)
  • Rahul Premraj (1)
  • Thomas Zimmermann (2)
  • Andreas Zeller (1)

  1. Saarland University, Germany
  2. University of Calgary, Canada
