Performance Comparison Between BPMN 2.0 Workflow Management Systems Versions

  • Vincenzo Ferme
  • Marigianna Skouradaki
  • Ana Ivanchikj
  • Cesare Pautasso
  • Frank Leymann
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 287)

Abstract

Software has become a rapidly evolving artifact, and Workflow Management Systems (WfMSs) are no exception. Changes to a WfMS may impact its key performance indicators, and its resource consumption may vary among versions. Thus, users considering a WfMS upgrade need to evaluate the extent of such changes for frequently issued workloads. Deriving such information requires running performance experiments with appropriate workloads. In this paper, we propose a novel method for deriving a structurally representative workload from a given business process collection, which we then use to evaluate the performance and resource consumption of four versions of two open-source WfMSs for different numbers of simulated users. In our case study scenario, the results reveal relevant variations in the WfMSs’ performance and resource consumption, indicating a decrease in performance in newer versions.
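The abstract does not detail how the structurally representative workload is derived; the sketch below is only an illustration of one plausible approach, not the authors' method. It assumes each process model in the collection is summarized by counts of its BPMN 2.0 element types, and it greedily selects a small workload mix whose aggregate element-type distribution stays close to that of the whole collection. All model names and element counts are hypothetical.

```python
# Illustrative sketch only: one way to pick a structurally representative
# subset of process models from a collection, assuming each model is
# summarized by counts of its BPMN 2.0 element types.

from collections import Counter


def distribution(counts: Counter) -> dict:
    """Normalize element-type counts into a relative-frequency distribution."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()} if total else {}


def l1_distance(p: dict, q: dict) -> float:
    """L1 distance between two distributions over BPMN element types."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)


def representative_workload(models: dict[str, Counter], size: int) -> list[str]:
    """Greedily pick `size` models whose combined element-type distribution
    best approximates the distribution of the full collection."""
    target = distribution(sum(models.values(), Counter()))
    selected: list[str] = []
    aggregate: Counter = Counter()
    for _ in range(min(size, len(models))):
        best_name, best_dist = None, float("inf")
        for name, counts in models.items():
            if name in selected:
                continue
            d = l1_distance(distribution(aggregate + counts), target)
            if d < best_dist:
                best_name, best_dist = name, d
        selected.append(best_name)
        aggregate += models[best_name]
    return selected


if __name__ == "__main__":
    # Hypothetical collection: element-type counts per process model.
    collection = {
        "loan_approval": Counter({"task": 8, "exclusiveGateway": 2, "userTask": 3}),
        "order_to_cash": Counter({"task": 12, "parallelGateway": 4, "subProcess": 1}),
        "claim_handling": Counter({"task": 6, "exclusiveGateway": 3, "timerEvent": 2}),
        "onboarding": Counter({"userTask": 5, "task": 4, "parallelGateway": 1}),
    }
    print(representative_workload(collection, size=2))
```

Under these assumptions, the selected models would then be instantiated repeatedly for the desired number of simulated users to form the load-test workload.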

Keywords

Performance testing · Performance regression · BPMN · Workflow management systems · Workflow engine


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Vincenzo Ferme (1)
  • Marigianna Skouradaki (2)
  • Ana Ivanchikj (1)
  • Cesare Pautasso (1)
  • Frank Leymann (2)

  1. Faculty of Informatics, USI Lugano, Lugano, Switzerland
  2. Institute of Architecture of Application Systems (IAAS), University of Stuttgart, Stuttgart, Germany