Empirical Software Engineering, Volume 17, Issue 3, pp 276–304

Program comprehension of domain-specific and general-purpose languages: comparison using a family of experiments

Abstract

Domain-specific languages (DSLs) are often argued to have a simpler notation than general-purpose languages (GPLs), since the notation is adapted to the specific problem domain. Consequently, the domain relevance of the notation, which shapes the programmer's representation of the problem, is believed to improve programmers' efficiency and accuracy when using DSLs compared with similar solutions, such as application libraries, in GPLs. Most of these beliefs are based on qualitative conclusions drawn by developers. Rather than implementing the same problem in a DSL and in a GPL and comparing the efficiency and accuracy of each approach, developers often compare the implementation of a new program in a DSL to their previous experiences implementing similar programs in GPLs. Such a conclusion may or may not be valid. This paper takes a more skeptical approach to the acceptance of those beliefs by reporting on a family of three empirical studies comparing DSLs and GPLs in different domains. The results of the studies show that developers are more accurate and more efficient in program comprehension when using a DSL than when using a GPL. These results validate some of the long-held beliefs of the DSL community that, until now, were supported only by anecdotal evidence.
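
To make the DSL-versus-GPL-library contrast concrete, the sketch below describes one small directed graph in two ways: as text in the Graphviz DOT language, a well-known DSL for graph drawing, and as the equivalent object built through a library API in a GPL. The pairing of DOT with the Python graphviz package is an assumption chosen here for illustration; it is not taken from the paper's experimental materials.

    # Illustrative sketch only: the DOT/Python pairing is an assumption,
    # not material from the paper's experiments.

    # DSL view: the graph is stated declaratively in domain notation (Graphviz DOT).
    dot_source = """
    digraph pipeline {
        parse -> analyze;
        analyze -> generate;
    }
    """

    # GPL view: the same graph, assembled step by step through a library API.
    from graphviz import Digraph  # third-party 'graphviz' package

    g = Digraph("pipeline")
    g.edge("parse", "analyze")
    g.edge("analyze", "generate")

    # Both encodings denote the same graph; the DSL text mirrors the domain
    # notation, while the GPL version is ordinary imperative code.
    print(g.source)  # prints DOT text roughly equivalent to dot_source

The abstract's claim concerns exactly this kind of pairing: the DSL text stays close to the problem domain, while the GPL route expresses the same model through general-purpose constructs and an application library.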

Keywords

Domain-specific languages · General-purpose languages · Program understanding · Program comprehension · Controlled experiments · Language evaluations

Acknowledgements

Special thanks to Matej Črepinšek and Daniela da Cruz, who helped us prepare the third experiment on graphical user interfaces. Thanks also to our Portuguese partners (Pedro Rangel Henriques, Maria João Varanda Pereira, and Nuno Oliveira) for their constructive comments on the questionnaires, which helped us increase the quality of this work.

Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  • Tomaž Kosar (1)
  • Marjan Mernik (1)
  • Jeffrey C. Carver (2)

  1. Faculty of Electrical Engineering and Computer Science, University of Maribor, Maribor, Slovenia
  2. Department of Computer Science, University of Alabama, Tuscaloosa, USA
