Improving Case Based Software Effort Estimation Using a Multi-criteria Decision Technique

  • Fadoua Fellir
  • Khalid Nafil
  • Rajaa Touahni
  • Lawrence Chung
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 763)


Producing an accurate effort estimate is essential for effective software project management, yet it remains highly challenging, especially at the early stages of software development, when very little detail about the project is known. To cope with this challenge, we present a novel framework for software effort estimation that, on the one hand, takes an incremental approach using a case-based reasoning (CBR) model and, on the other hand, considers a comprehensive set of requirements models, including functional requirements (FRs), non-functional requirements (NFRs), and domain properties (DPs). Concerning the use of CBR, the framework offers a multi-criteria technique for improving the accuracy of the similarity measures between the current software project and multiple past projects, towards determining and selecting the most similar one. We have tested the proposed framework on 36 student projects, and the results are very encouraging: in most cases, the difference between the estimated effort and the actual effort was lower than 10%.
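The core CBR retrieval step described above (scoring past projects against the current one on multiple criteria and reusing the effort of the most similar case) can be sketched as follows. This is an illustrative sketch only, not the paper's actual algorithm: the attribute names, weights, and project values are hypothetical, and the per-attribute similarity functions are a common generic choice (normalized distance for numeric attributes, exact match for categorical ones).

```python
# Illustrative sketch of multi-criteria case-based effort estimation.
# All attribute names, weights, and case values here are hypothetical.

def similarity(case_a: dict, case_b: dict, weights: dict) -> float:
    """Weighted average of per-attribute similarities, each in [0, 1].

    Numeric attributes contribute 1 - normalized distance;
    categorical attributes contribute 1 on exact match, else 0.
    """
    total, weight_sum = 0.0, 0.0
    for attr, w in weights.items():
        a, b = case_a[attr], case_b[attr]
        if isinstance(a, (int, float)):
            denom = max(abs(a), abs(b), 1e-9)  # avoid division by zero
            sim = 1.0 - abs(a - b) / denom
        else:
            sim = 1.0 if a == b else 0.0
        total += w * sim
        weight_sum += w
    return total / weight_sum

def estimate_effort(new_case: dict, past_cases: list, weights: dict) -> float:
    """Retrieve the most similar past project and reuse its actual effort."""
    best = max(past_cases, key=lambda c: similarity(new_case, c, weights))
    return best["effort"]

# Hypothetical past projects characterized by FR/NFR counts and a domain property.
past = [
    {"num_frs": 20, "num_nfrs": 5, "domain": "web",     "effort": 400.0},
    {"num_frs": 35, "num_nfrs": 8, "domain": "desktop", "effort": 750.0},
]
w = {"num_frs": 0.5, "num_nfrs": 0.3, "domain": 0.2}
new = {"num_frs": 22, "num_nfrs": 6, "domain": "web"}
print(estimate_effort(new, past, w))  # reuses the nearest case's effort
```

In a multi-criteria decision setting such as the paper's, the weights would not be fixed by hand as above but derived from a technique like AHP; the retrieval structure, however, stays the same.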


Keywords: Functional requirements (FRs) · Non-functional requirements (NFRs) · Software effort estimation · Case-based reasoning (CBR) · Multi-criteria decision analysis (MCDA)



Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

  • Fadoua Fellir (1)
  • Khalid Nafil (2)
  • Rajaa Touahni (1)
  • Lawrence Chung (3)
  1. Lastid Laboratory, Faculty of Sciences, Ibn Tofail University, Kenitra, Morocco
  2. ENSIAS, Mohamed V University, Rabat, Morocco
  3. Erik Jonsson School of Engineering and Computer Science, The University of Texas at Dallas, Richardson, USA
