Lessons Learned from Evaluating Workflow Management Systems

  • Jörg Lenhard
  • Vincenzo Ferme
  • Simon Harrer
  • Matthias Geiger
  • Cesare Pautasso
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10797)


Workflow Management Systems (WfMSs) today act as service composition engines and service-oriented middleware to enable the execution of automated business processes. Automation based on WfMSs promises to enable the model-driven construction of flexible and easily maintainable services with high-performance characteristics. In the past decade, significant effort has been invested into standardizing WfMSs that compose services, with standards such as the Web Services Business Process Execution Language (WS-BPEL) or the Business Process Model and Notation (BPMN). One of the aims of standardization is to enable users of WfMSs to compare different systems and to avoid vendor lock-in. Despite these efforts, there are many expectations concerning portability, performance efficiency, usability, reliability and maintainability of WfMSs that are likely to be unfulfilled. In this work, we synthesize the findings of two research initiatives that deal with WfMSs conformance and performance benchmarking to distill a set of lessons learned and best practices. These findings provide useful advice for practitioners who plan to evaluate and use WfMSs and for WfMS vendors that would like to foster wider adoption of process-centric service composition middleware.


Keywords: Workflow management systems · Standards · Lessons learned · Evaluation research · Benchmarking · Service composition
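The conformance benchmarking discussed above boils down to running a suite of standard-feature tests (e.g. individual BPMN 2.0 or WS-BPEL constructs) against an engine and aggregating pass/fail outcomes into a conformance score. The sketch below is illustrative only: the `FeatureTest` structure, the construct names, and the sample results are assumptions for exposition, not the actual API of the betsy or BenchFlow tools referenced by this work.

```python
from dataclasses import dataclass

@dataclass
class FeatureTest:
    name: str      # a standard construct, e.g. a BPMN 2.0 "exclusiveGateway"
    passed: bool   # did the engine execute the construct as specified?

def conformance_score(results: list[FeatureTest]) -> float:
    """Fraction of tested standard constructs the engine supports (0.0-1.0)."""
    if not results:
        return 0.0
    return sum(r.passed for r in results) / len(results)

# Hypothetical results for one engine under test:
results = [
    FeatureTest("exclusiveGateway", True),
    FeatureTest("parallelGateway", True),
    FeatureTest("eventSubProcess", False),
    FeatureTest("compensation", False),
]
print(conformance_score(results))  # 0.5
```

In practice, such scores are computed per engine and per standard version, which is what makes cross-engine comparison (and the portability claims evaluated in the paper) measurable at all.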



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Jörg Lenhard (1)
  • Vincenzo Ferme (2)
  • Simon Harrer (3)
  • Matthias Geiger (3)
  • Cesare Pautasso (2)

  1. Department of Mathematics and Computer Science, Karlstad University, Karlstad, Sweden
  2. Software Institute, Faculty of Informatics, USI, Lugano, Switzerland
  3. Distributed Systems Group, University of Bamberg, Bamberg, Germany