Detecting and Addressing Design Smells in Novice Processing Programs
Many novice programmers are able to write code that solves a given problem, but they struggle to write code that adheres to basic principles of good application design. Their programs often contain design smells that indicate a lack of understanding of how to structure code. This applies in particular to degree programmes in which programming, and by extension software design, is only a small part of the curriculum.
This paper defines design smells for Processing, a language for new media and visual arts that is based on Java. These include language-specific smells that arise from the common structure that all Processing programs share. The paper also describes a static analysis tool that detects these smells automatically. The tool is meant to support teaching staff in providing feedback to novices on program design.
We applied the tool to a large set of student programs, as well as programs from the Processing community and code examples used by textbooks and instructors. The latter gave a good sense of the quality of the resources that students use for reference. We found that a surprising number of resources contain at least one design smell. The paper then describes how to refactor the code to avoid these smells. These guidelines are meant to be practical and to fit the concepts and constructs that are known to first-year students.
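To illustrate the kind of smell and refactoring the paper is concerned with, the sketch below shows a hypothetical example (not taken from the paper) of duplicated positioning arithmetic with magic numbers inside a Processing-style `draw()` function, and a refactoring that extracts the arithmetic into a named helper. It is written as plain, self-contained Java, with Processing's `rect()` call stubbed out as a no-op:

```java
// Hypothetical illustration of a "duplicated expression" design smell
// in a Processing-style sketch, and its refactoring.
public class SmellExample {
    static final int WIDTH = 400; // assumed sketch width

    // Stub standing in for Processing's rect(x, y, w, h) drawing call.
    static void rect(int x, int y, int w, int h) { }

    static void drawSmelly() {
        // Smell: the same column-to-pixel arithmetic repeated verbatim,
        // with magic numbers that must be changed in three places.
        rect(20 + 0 * (400 / 10), 50, 30, 30);
        rect(20 + 1 * (400 / 10), 50, 30, 30);
        rect(20 + 2 * (400 / 10), 50, 30, 30);
    }

    // Refactored: the arithmetic lives in one named, testable place.
    static int rectX(int column) {
        return 20 + column * (WIDTH / 10);
    }

    static void drawRefactored() {
        for (int col = 0; col < 3; col++) {
            rect(rectX(col), 50, 30, 30);
        }
    }

    public static void main(String[] args) {
        System.out.println(rectX(2)); // column 2 maps to x = 100
    }
}
```

A static analysis along the lines the paper describes could flag the repeated expression in `drawSmelly()`; the refactored version keeps the change within constructs familiar to first-year students (a helper method and a loop).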