Applying static code analysis for domain-specific languages

Abstract

Code quality control platforms for analysing source code are gaining increasing attention in the developer community. These platforms parse and check source code written in a variety of general-purpose programming languages. The emergence of domain-specific languages enables professionals from different areas to develop and describe solutions to problems in their own disciplines, so source code quality analysis methods and tools can also be applied to software artefacts developed with a domain-specific language. However, to evaluate the quality of domain-specific language code, every software component that the quality platform requires to parse and query the source code must first be developed. This is a time-consuming and error-prone task, for which this paper describes a model-driven interoperability strategy that bridges the gap between the grammar formats of source code quality parsers and domain-specific text languages. The approach has been tested on the most widespread platforms for designing text-based languages and for source code analysis, and evaluated in a number of specific contexts from different domain areas.
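The kind of grammar-format bridging the abstract describes can be illustrated with a small, purely hypothetical sketch. It is not the authors' actual Xtext-to-SonarQube transformation: the input format (a toy grammar as a Python dict) and the emitted builder-style statements are assumptions chosen only to show how one rule notation can be mechanically mapped onto another parser's rule-definition style.

```python
# Illustrative sketch only: maps a toy EBNF-like grammar onto builder-style
# rule statements, in the spirit of bridging two grammar formats. The input
# representation and the emitted "b.rule(...)" syntax are hypothetical.

def to_builder_calls(grammar: dict[str, list[list[str]]]) -> list[str]:
    """Emit one builder-style statement per rule of a toy grammar.

    Each rule name maps to a list of alternatives; each alternative is a
    list of symbol names. A single alternative becomes a plain sequence,
    while several alternatives are joined with an ordered choice, mirroring
    how parser-builder APIs commonly express EBNF alternation.
    """
    statements = []
    for name, alternatives in grammar.items():
        rendered = [", ".join(symbols) for symbols in alternatives]
        if len(rendered) == 1:
            body = rendered[0]
        else:
            body = "b.firstOf(" + ", ".join(
                f"b.sequence({r})" for r in rendered) + ")"
        statements.append(f'b.rule("{name}").is({body});')
    return statements

if __name__ == "__main__":
    # A one-rule toy grammar: greeting ::= HELLO NAME | HI NAME
    toy = {"greeting": [["HELLO", "NAME"], ["HI", "NAME"]]}
    for stmt in to_builder_calls(toy):
        print(stmt)
    # → b.rule("greeting").is(b.firstOf(b.sequence(HELLO, NAME), b.sequence(HI, NAME)));
```

A real transformation, as the paper argues, would be generated rather than hand-written, since it must cover every rule of the source grammar and stay in sync with it.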



Notes

  1. https://www.sonarqube.org
  2. https://www.codacy.com
  3. http://www.squoring.com
  4. http://www.eclipse.org/Xtext
  5. https://github.com
  6. https://www.jetbrains.com/mps
  7. http://www.rascal-mpl.org
  8. http://www.monticore.de
  9. http://www.metaborg.org
  10. http://docs.sonarqube.org/display/DEV/SSLR
  11. http://www.eclipse.org/acceleo
  12. https://github.com/TatyPerson/Xtext2Sonar
  13. https://maven.apache.org
  14. http://tatyperson.github.io/Vary/
  15. http://lilypond.org/
  16. http://sculptorgenerator.org
  17. http://www.tango-controls.org
  18. http://www.eclipse.org/smarthome
  19. https://goo.gl/ju1zwd
  20. https://goo.gl/xsPzgY
  21. https://goo.gl/17pGq9
  22. https://goo.gl/dkZB3j
  23. http://vedilsanalytics.uca.es/sonarqube/



Acknowledgements

This work has been developed in the VISAIGLE project, funded by the Spanish National Research Agency (AEI) with ERDF funds under grant ref. TIN2017-85797-R.

Author information

Correspondence to Iván Ruiz-Rube.

Communicated by Prof. Tony Clark.


Cite this article

Ruiz-Rube, I., Person, T., Dodero, J.M. et al. Applying static code analysis for domain-specific languages. Softw Syst Model 19, 95–110 (2020). https://doi.org/10.1007/s10270-019-00729-w


Keywords

  • Text-based languages
  • Static analysis
  • Model-driven interoperability
  • Xtext
  • SonarQube