On the Need for a New Generation of Code Review Tools

  • Tobias Baum
  • Kurt Schneider
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10027)


Tool support for change-based code review is gaining widespread acceptance in industry. This indicates that the current generation of tools is well aligned with current code review practices. Nevertheless, we believe that further improvements in code review tooling can lead to increased review efficiency and effectiveness. In this paper, we combine results from a qualitative study and results from the literature to substantiate this claim. We derive promising improvement areas and provide an overview of existing research in these areas. A common attribute of these improvements is that they trade flexibility for reviewer support. As flexibility is one of the main characteristics of the current generation of code review tools in Hedberg’s classification of review tool generations, we regard these coming tools as part of a new generation of code review tools.


Keywords: Code reviews · Code inspections and walkthroughs · Tool support


References

  1. Aurum, A., Petersson, H., Wohlin, C.: State-of-the-art: software inspections after 25 years. Softw. Test. Verification Reliab. 12(3), 133–154 (2002)
  2. Bacchelli, A., Bird, C.: Expectations, outcomes, and challenges of modern code review. In: Proceedings of the 2013 International Conference on Software Engineering, pp. 712–721. IEEE Press (2013)
  3. Balachandran, V.: Reducing human effort and improving quality in peer code reviews using automatic static analysis and reviewer recommendation. In: Proceedings of the 2013 International Conference on Software Engineering. IEEE Press (2013)
  4. Barnett, M., Bird, C., Brunet, J., Lahiri, S.K.: Helping developers help themselves: automatic decomposition of code review changesets. In: Proceedings of the 2015 International Conference on Software Engineering. IEEE Press (2015)
  5. Baum, T., Liskin, O., Niklas, K., Schneider, K.: A faceted classification scheme for change-based industrial code review processes. In: 2016 IEEE International Conference on Software Quality, Reliability and Security (QRS). IEEE (2016)
  6. Baum, T., Liskin, O., Niklas, K., Schneider, K.: Factors influencing code review processes in industry. In: Proceedings of the ACM SIGSOFT 24th International Symposium on the Foundations of Software Engineering. ACM (2016)
  7. Baysal, O., Kononenko, O., Holmes, R., Godfrey, M.W.: Investigating technical and non-technical factors influencing modern code review. Empir. Softw. Eng. 21(3), 932–959 (2016). doi:10.1007/s10664-015-9366-8
  8. Biffl, S., Halling, M.: Investigating the influence of inspector capability factors with four inspection techniques on inspection performance. In: Eighth IEEE Symposium on Software Metrics, 2002, Proceedings, pp. 107–117. IEEE (2002)
  9. Buse, R.P., Weimer, W.R.: Automatically documenting program changes. In: Proceedings of the IEEE/ACM International Conference on Automated Software Engineering, pp. 33–42. ACM (2010)
  10. Denger, C., Ciolkowski, M., Lanubile, F.: Investigating the active guidance factor in reading techniques for defect detection. In: International Symposium on Empirical Software Engineering, 2004, Proceedings, pp. 219–228. IEEE (2004)
  11. Dias, M., Bacchelli, A., Gousios, G., Cassou, D., Ducasse, S.: Untangling fine-grained code changes. In: 2015 IEEE 22nd International Conference on Software Analysis, Evolution and Reengineering, pp. 341–350. IEEE (2015)
  12. Dunsmore, A., Roper, M., Wood, M.: The role of comprehension in software inspection. J. Syst. Softw. 52(2), 121–129 (2000)
  13. Dunsmore, A., Roper, M., Wood, M.: Systematic object-oriented inspection – an empirical study. In: Proceedings of the 23rd International Conference on Software Engineering, pp. 135–144. IEEE Computer Society (2001)
  14. Ge, X.: Improving tool support for software developers through refactoring detection. Ph.D. thesis, North Carolina State University (2014)
  15. Gilb, T., Graham, D.: Software Inspection. Addison-Wesley, Wokingham (1993)
  16. Gómez, V.U., Ducasse, S., D’Hondt, T.: Visually characterizing source code changes. Sci. Comput. Program. 98, 376–393 (2015)
  17. Hedberg, H.: Introducing the next generation of software inspection tools. In: Bomarius, F., Iida, H. (eds.) PROFES 2004. LNCS, vol. 3009, pp. 234–247. Springer, Heidelberg (2004). doi:10.1007/978-3-540-24659-6_17
  18. Kawrykow, D., Robillard, M.P.: Non-essential changes in version histories. In: Proceedings of the 33rd International Conference on Software Engineering, pp. 351–360. ACM (2011)
  19. Laitenberger, O., Leszak, M., Stoll, D., El Emam, K.: Quantitative modeling of software reviews in an industrial setting. In: Sixth International Software Metrics Symposium, 1999, Proceedings, pp. 312–322. IEEE (1999)
  20. McNair, A., German, D.M., Weber-Jahnke, J.: Visualizing software architecture evolution using change-sets. In: 14th Working Conference on Reverse Engineering, 2007, WCRE 2007, pp. 130–139. IEEE (2007)
  21. Porter, A., Siy, H., Mockus, A., Votta, L.: Understanding the sources of variation in software inspections. ACM Trans. Softw. Eng. Methodol. (TOSEM) 7(1), 41–79 (1998)
  22. Raz, T., Yaung, A.T.: Factors affecting design inspection effectiveness in software development. Inf. Softw. Technol. 39(4), 297–305 (1997)
  23. Rigby, P.C.: Understanding open source software peer review: review processes, parameters and statistical models, and underlying behaviours and mechanisms. Ph.D. thesis, University of Victoria (2011)
  24. Rigby, P.C., Bird, C.: Convergent contemporary software peer review practices. In: Proceedings of the 2013 9th Joint Meeting on Foundations of Software Engineering, pp. 202–212. ACM (2013)
  25. Roper, M., Wood, M., Miller, J.: An empirical evaluation of defect detection techniques. Inf. Softw. Technol. 39(11), 763–775 (1997)
  26. Tao, Y., Dang, Y., Xie, T., Zhang, D., Kim, S.: How do software engineers understand code changes? An exploratory study in industry. In: Proceedings of the ACM SIGSOFT 20th International Symposium on the Foundations of Software Engineering. ACM (2012)
  27. Tao, Y., Kim, S.: Partitioning composite code changes to facilitate code review. In: 2015 IEEE/ACM 12th Working Conference on Mining Software Repositories (MSR), pp. 180–190. IEEE (2015)
  28. Thangthumachit, S., Hayashi, S., Saeki, M.: Understanding source code differences by separating refactoring effects. In: 2011 18th Asia Pacific Software Engineering Conference (APSEC), pp. 339–347. IEEE (2011)
  29. Thongtanunam, P., Tantithamthavorn, C., Kula, R.G., Yoshida, N., Iida, H., Matsumoto, K.-I.: Who should review my code? A file location-based code-reviewer recommendation approach for modern code review. In: 2015 IEEE 22nd International Conference on Software Analysis, Evolution and Reengineering (SANER) (2015)
  30. Zhang, T., Song, M., Pinedo, J., Kim, M.: Interactive code review for systematic changes. In: Proceedings of the 37th IEEE/ACM International Conference on Software Engineering. IEEE (2015)

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. FG Software Engineering, Leibniz Universität Hannover, Hannover, Germany