Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9694)

Abstract

Although Workflow Management Systems (WfMSs) are a key component of workflow technology, research on assessing and comparing their performance remains limited. This work proposes the first micro-benchmark for WfMSs that execute BPMN 2.0 workflows. To this end, we study the performance impact of well-known workflow patterns, expressed in BPMN 2.0, on three open-source WfMSs. We executed all experiments in a reliable environment and produced a set of meaningful metrics. This paper contributes to the area of workflow technology by defining building blocks for more complex BPMN 2.0 WfMS benchmarks. The results revealed bottlenecks in architectural design decisions and resource utilization, as well as limits on the load a WfMS can sustain, especially for complex and parallel structures. Experiments on a mix of workflow patterns indicated no unexpected performance side effects when executing different workflow patterns concurrently, although the duration of the individual workflows comprising the mix increased.

Keywords

Benchmarking · Micro-benchmark · BPMN 2.0 · Workflow Patterns · Workflow Management Systems


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Institute of Architecture of Application Systems (IAAS), University of Stuttgart, Stuttgart, Germany
  2. Faculty of Informatics, University of Lugano (USI), Lugano, Switzerland
  3. Institute of Software Technology (ISTE), University of Stuttgart, Stuttgart, Germany