Software & Systems Modeling, Volume 10, Issue 4, pp 537–552

Model-driven generative development of measurement software

  • Martin Monperrus
  • Jean-Marc Jézéquel
  • Benoit Baudry
  • Joël Champeau
  • Brigitte Hoeltzener
Regular Paper


Abstract

Metrics offer a practical approach to evaluating properties of domain-specific models. However, it is costly to develop and maintain measurement software for each domain-specific modeling language. In this paper, we present a model-driven, generative approach to measuring models. The approach is completely domain-independent and is operationalized through a prototype that synthesizes a measurement infrastructure for a given domain-specific modeling language. The approach is model-driven from two viewpoints: (1) it measures models of a domain-specific modeling language; (2) it uses models as unique and consistent metric specifications, conforming to a metric specification metamodel that captures all the concepts necessary for the model-driven specification of metrics. The benefit of applying the approach is evaluated in four case studies, which indicate that it significantly eases the measurement activities of model-driven development processes.
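The core idea of the abstract can be illustrated with a minimal sketch: metrics are specified as data (models conforming to a metric specification metamodel), and a single generic engine evaluates any such specification against any model, independently of the domain. All names below (`Element`, `SigmaMetric`, `measure`) are hypothetical illustrations, not the paper's actual metamodel or prototype.

```python
# A minimal, domain-independent sketch of model-driven measurement
# (hypothetical names; assumes models are containment trees of typed elements).

from dataclasses import dataclass, field

@dataclass
class Element:
    """A model element: an instance of a metaclass, with named references."""
    metaclass: str
    refs: dict = field(default_factory=dict)  # reference name -> list of Elements

@dataclass
class SigmaMetric:
    """A metric *specification* as data: count instances of a metaclass."""
    name: str
    counted_metaclass: str

def all_elements(root):
    """Depth-first traversal of the containment tree."""
    yield root
    for children in root.refs.values():
        for child in children:
            yield from all_elements(child)

def measure(metric, root):
    """Generic evaluation: works for any metric spec, any modeling language."""
    return sum(1 for e in all_elements(root)
               if e.metaclass == metric.counted_metaclass)

# Usage: a toy state-machine model and a domain-specific metric over it.
s1 = Element("State")
s2 = Element("State")
t = Element("Transition")
machine = Element("StateMachine", refs={"states": [s1, s2], "transitions": [t]})

number_of_states = SigmaMetric("NumberOfStates", "State")
print(measure(number_of_states, machine))  # prints 2
```

Because `measure` only depends on the metric specification and the generic element structure, the same engine serves every domain-specific modeling language; a generative variant, as in the paper, would emit equivalent specialized measurement code from each specification instead of interpreting it.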







Copyright information

© Springer-Verlag 2010

Authors and Affiliations

  • Martin Monperrus (1)
  • Jean-Marc Jézéquel (2, 3)
  • Benoit Baudry (3)
  • Joël Champeau (4)
  • Brigitte Hoeltzener (4)

  1. Technische Universität Darmstadt, Darmstadt, Germany
  2. University of Rennes, Rennes, France
  3. INRIA, Rennes, France
  4. ENSIETA, Brest, France
