Assisting software engineering students in analyzing their performance in software development

  • Mushtaq Raza
  • João Pascoal Faria
  • Rafael Salazar

Abstract

Collecting product and process measures in software development projects, particularly in education and training environments, is important as a basis for assessing current performance and opportunities for improvement. However, analyzing the collected data manually is challenging because of the expertise required, the lack of benchmarks for comparison, the amount of data to analyze, and the time required to do the analysis. ProcessPAIR is a novel tool for automated performance analysis and improvement recommendation; based on a performance model calibrated from the performance data of many developers, it automatically identifies and ranks potential performance problems and root causes of individual developers. In education and training environments, it increases students’ autonomy and reduces instructors’ effort in grading and feedback. In this article, we present the results of a controlled experiment involving 61 software engineering master’s students, half of whom used ProcessPAIR in a Personal Software Process (PSP) performance analysis assignment, while the other half used a traditional PSP support tool (Process Dashboard) to perform the same assignment. The results show significant benefits in terms of students’ satisfaction (average score of 4.78 on a 1–5 scale for ProcessPAIR users, against 3.81 for Process Dashboard users), quality of the analysis outcomes (average grade of 88.1 on a 0–100 scale for ProcessPAIR users, against 82.5 for Process Dashboard users), and time required to do the analysis (average of 252 min for ProcessPAIR users, against 262 min for Process Dashboard users, but with much room for improvement).
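
To illustrate the general idea of benchmark-based performance analysis described above, the following Python sketch ranks a developer’s performance indicators by how poorly they compare against a benchmark population. This is only a minimal illustration, not ProcessPAIR’s actual model or calibration; all indicator names, thresholds, and data below are hypothetical.

```python
# Hypothetical sketch of benchmark-based performance-problem ranking.
# Indicator names, values, and thresholds are illustrative only; they are
# not ProcessPAIR's actual model or calibration.
from bisect import bisect_left

def percentile_rank(benchmark_sorted, value):
    """Fraction of benchmark observations below the given value."""
    return bisect_left(benchmark_sorted, value) / len(benchmark_sorted)

def rank_problems(benchmarks, student, higher_is_better, threshold=0.33):
    """Flag indicators falling in the worst `threshold` fraction of the
    benchmark population and rank them from worst to least bad."""
    problems = []
    for name, value in student.items():
        data = sorted(benchmarks[name])
        pr = percentile_rank(data, value)
        # For "higher is better" indicators a low percentile is bad;
        # for "lower is better" indicators a high percentile is bad.
        badness = 1 - pr if higher_is_better[name] else pr
        if badness >= 1 - threshold:
            problems.append((name, round(badness, 2)))
    return sorted(problems, key=lambda p: p[1], reverse=True)

# Illustrative benchmark data (one value per developer) and one student's values.
benchmarks = {
    "productivity_loc_per_hour": [10, 15, 20, 25, 30, 35, 40],
    "defect_density_per_kloc":   [5, 10, 15, 20, 30, 45, 60],
}
higher_is_better = {
    "productivity_loc_per_hour": True,
    "defect_density_per_kloc":   False,
}
student = {"productivity_loc_per_hour": 12, "defect_density_per_kloc": 50}

print(rank_problems(benchmarks, student, higher_is_better))
```

In ProcessPAIR itself, the calibrated performance model also relates top-level performance problems to ranked candidate root causes; the percentile-against-benchmark comparison above only sketches the basic idea of calibrating the analysis against data from many developers.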

Keywords

Performance analysis · Software engineering education · Controlled experiment · Personal software process

Notes

Acknowledgements

The authors would like to acknowledge the SEI and Tec de Monterrey for facilitating access to the PSP data used in this research, and AWKUM for its partial initial grant. The authors would also like to acknowledge the students of Tec de Monterrey who participated in the controlled experiment. This work is partially financed by the ERDF – European Regional Development Fund through the Operational Programme for Competitiveness and Internationalisation – COMPETE 2020 Programme within the project POCI-01-0145-FEDER-006961, and by National Funds through the FCT – Fundação para a Ciência e a Tecnologia as part of project UID/EEA/50014/2013 and research grant SFRH/BD/85174/2012.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. INESC TEC / Faculty of Engineering, University of Porto, Porto, Portugal
  2. Tecnológico de Monterrey, Monterrey, Mexico
