Service Oriented Computing and Applications

Volume 10, Issue 4, pp 391–411

Portability of executable service-oriented processes: metrics and validation

  • Jörg Lenhard
  • Guido Wirtz
Original Research Paper


A key promise of process languages based on open standards, such as the Web Services Business Process Execution Language, is the avoidance of vendor lock-in through the portability of processes among runtime environments. Although various runtimes claim to support this language today, each implements a different subset, thereby hampering portability and locking in users. We aim to improve this situation by enabling the measurement of the portability of executable service-oriented processes. This helps developers assess their implementations and decide whether porting a process to another runtime is worth the effort. In this paper, we define several software quality metrics that quantify the degree of portability of an executable, service-oriented process from different viewpoints. When integrated into a development environment, such metrics can help improve the portability of the outcome. We validate the metrics theoretically, with respect to measurement theory and construct validity, using two validation frameworks. The validation is complemented by an empirical evaluation of the metrics on a large set of processes drawn from several process libraries.
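To make the idea of a "degree of portability" concrete, the following is a minimal sketch, not the paper's actual metric suite: it assumes a hypothetical set of BPEL constructs that are unevenly supported across engines and computes the share of elements in a process definition that are not flagged. The construct names in `NON_PORTABLE` and the formula are illustrative assumptions only.

```python
import xml.etree.ElementTree as ET

# Hypothetical set of BPEL constructs assumed to be unevenly supported
# across engines (illustrative only; real sets would be derived from
# engine conformance benchmarks).
NON_PORTABLE = {"compensate", "compensateScope", "rethrow", "validate"}

def portability(bpel_source: str) -> float:
    """Return a naive degree of portability in [0, 1]: the share of
    elements in the process that are NOT flagged as non-portable."""
    root = ET.fromstring(bpel_source)
    # Strip XML namespaces from tag names: "{ns}receive" -> "receive"
    tags = [el.tag.split("}")[-1] for el in root.iter()]
    flagged = sum(1 for tag in tags if tag in NON_PORTABLE)
    return 1 - flagged / len(tags)

process = """<process xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable">
  <sequence>
    <receive/>
    <validate/>
    <reply/>
  </sequence>
</process>"""

# 5 elements, 1 flagged (validate) -> 1 - 1/5 = 0.8
print(round(portability(process), 2))
```

A metric of this shape is directly computable at design time, which is what allows it to be integrated into a development environment as the abstract suggests.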


Portability · Software quality · Metrics · Process · SOA



Copyright information

© Springer-Verlag London 2016

Authors and Affiliations

  1. Department of Mathematics and Computer Science, Karlstad University, Karlstad, Sweden
  2. Distributed Systems Group, University of Bamberg, Bamberg, Germany
