
IT-Centric Process Automation: Study About the Performance of BPMN 2.0 Engines

Chapter

Abstract

Workflow management systems (WfMSs) are widely used in enterprises to design, deploy, execute, monitor, and analyze automated business processes. Current state-of-the-art WfMSs have evolved into platforms delivering complex service-oriented applications that need to satisfy enterprise-grade performance requirements. With the ever-growing number of WfMSs available on the market, companies must choose the product that best fits their requirements and business models. The factors that WfMS vendors use to differentiate their products are mainly related to functionality and to integration with other systems and frameworks. They usually do not differentiate their systems in terms of performance in handling the workload they are subject to, or in terms of hardware resource consumption. A recent trend has seen WfMSs deployed in environments where performance in handling the workload really matters, because they may process millions of workflow instances per day, and where efficiency in resource consumption matters as well, e.g., when they are deployed in the Cloud. Benchmarking is an established practice for comparing alternative products, and it helps drive the continuous improvement of technology by setting a clear target for measuring and assessing performance. For WfMSs in particular, there is not yet a standard, accepted benchmark, even though standard workflow modeling and execution languages such as BPMN 2.0 have recently appeared. In this chapter, we present the challenges of establishing the first standard benchmark for assessing and comparing the performance of WfMSs in a way that complies with the main requirements of a benchmark: portability, scalability, simplicity, vendor neutrality, repeatability, efficiency, representativeness, relevance, accessibility, and affordability. A possible solution is also discussed, together with a use case of micro-benchmarking open-source production WfMSs. The use case demonstrates the relevance of benchmarking the performance of WfMSs by showing substantial differences in performance and resource consumption among the benchmarked WfMSs.
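To make the kind of micro-benchmark workload described above more concrete, the following minimal sketch shows how a load driver might start a fixed number of workflow instances against a BPMN 2.0 engine and report throughput and request latency. The engine URL, the start-instance endpoint, and the process definition key are hypothetical placeholders, not the API of any specific WfMS nor the benchmark environment presented in the chapter.

```python
import time
import statistics
import requests

# Hypothetical placeholders: adapt to the REST API of the engine under test.
START_URL = "http://localhost:8080/engine/process-definitions/invoice/start"
NUM_INSTANCES = 1000

durations = []
wall_clock_start = time.perf_counter()

for _ in range(NUM_INSTANCES):
    t0 = time.perf_counter()
    # Start one workflow instance; the synchronous call returns once the
    # engine has accepted (or completed) the instance.
    response = requests.post(START_URL, json={"variables": {}}, timeout=30)
    response.raise_for_status()
    durations.append(time.perf_counter() - t0)

elapsed = time.perf_counter() - wall_clock_start

print(f"Throughput: {NUM_INSTANCES / elapsed:.1f} instances/s")
print(f"Mean request latency: {statistics.mean(durations) * 1000:.1f} ms")
print(f"95th percentile latency: "
      f"{statistics.quantiles(durations, n=20)[18] * 1000:.1f} ms")
```

This sketch only covers client-side load generation and timing; a complete benchmark run would additionally monitor the hardware resource consumption (e.g., CPU and memory) of the system hosting the engine, since resource efficiency is one of the differentiators the chapter measures.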



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Software Institute, Faculty of Informatics, USI, Lugano, Switzerland
  2. Institute of Architecture of Application Systems (IAAS), University of Stuttgart, Stuttgart, Germany
