A Framework for Benchmarking BPMN 2.0 Workflow Management Systems

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9253)

Abstract

The diverse landscape of Workflow Management Systems (WfMSs) makes it challenging for users to compare different solutions and identify the ones most suitable to their requirements. Thus, a comparison framework that defines common ground across many different aspects, such as price, reliability, security, robustness, and performance, is necessary. In this paper we focus on the performance aspect, and we present a framework for the automatic and reliable calculation of performance metrics for BPMN 2.0 WfMSs. We validate the framework by applying it to two open-source WfMSs. The goal is to contribute to the improvement of existing WfMSs by pinpointing performance bottlenecks, and to empower end users to make informed decisions when selecting a WfMS.

Keywords

BPMN 2.0 · Workflow management systems · Benchmarking



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Vincenzo Ferme¹
  • Ana Ivanchikj¹
  • Cesare Pautasso¹

  1. Faculty of Informatics, University of Lugano (USI), Lugano, Switzerland
