Detecting and Addressing Design Smells in Novice Processing Programs

  • Ansgar Fehnker
  • Remco de Man
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1022)

Abstract

Many novice programmers are able to write code that solves a given problem, but they struggle to write code that adheres to basic principles of good application design. Their programs often contain design smells that indicate a lack of understanding of how to structure code. This applies in particular to degree programmes in which programming, and by extension software design, is only a small part of the curriculum.

This paper defines design smells for Processing, a Java-based language for new media and the visual arts. These include language-specific smells that arise from the common structure shared by all Processing programs. The paper also describes how to detect these smells automatically with static analysis. The resulting tool is meant to support teaching staff in providing feedback to novices on program design.

We applied the tool to a large set of student programs, as well as to programs from the Processing community and code examples used in textbooks and by instructors. The latter gave a good sense of the quality of the resources that students use for reference. We found that a surprising number of these resources contain at least one design smell. The paper then describes how to refactor code to avoid these smells. These guidelines are meant to be practical and to fit the concepts and constructs known to first-year students.
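As a hedged illustration (not an example taken from the paper), consider the kind of structure-related smell the abstract alludes to: every Processing sketch shares the same `setup()`/`draw()` skeleton, and novices tend to pile state updates and rendering into one long `draw()` body. The sketch below shows a refactoring in the spirit described, with `draw()` delegating to small, named helpers. Rendering calls are stubbed with a print statement so the example runs as plain Java; in Processing they would be calls such as `background()` and `ellipse()`.

```java
// Hypothetical example of refactoring a "long draw()" smell.
// The class name, fields, and helpers are illustrative, not from the paper.
public class BouncingBall {
    float x = 50, y = 50;              // ball position
    float dx = 2, dy = 3;              // velocity per frame
    final int width = 400, height = 300; // sketch dimensions

    // A smelly sketch would mix movement, collision, and drawing in one
    // long draw() body. Refactored, draw() only sequences named steps.
    void draw() {
        update();
        render();
    }

    void update() {
        x += dx;
        y += dy;
        if (x < 0 || x > width)  dx = -dx;  // bounce off side walls
        if (y < 0 || y > height) dy = -dy;  // bounce off top/bottom
    }

    void render() {
        // Stand-in for Processing's background()/ellipse() calls,
        // so the example compiles and runs as plain Java.
        System.out.printf("ball at (%.1f, %.1f)%n", x, y);
    }

    public static void main(String[] args) {
        BouncingBall sketch = new BouncingBall();
        for (int frame = 0; frame < 3; frame++) {
            sketch.draw();
        }
    }
}
```

Each helper now has a single responsibility and a name a first-year student can read, which matches the paper's aim of keeping refactoring guidelines within the constructs novices already know.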

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Formal Methods and Tools Group, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands
