
Open Source versus Proprietary Software in Service-Orientation: The Case of BPEL Engines

  • Simon Harrer
  • Jörg Lenhard
  • Guido Wirtz
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8274)

Abstract

It is a long-standing debate whether software developed as open source is generally of higher quality than proprietary software. Although the open source community has grown immensely during the last decade, there is still no clear answer. Service-oriented software and middleware tend to rely on highly complex and interrelated standards and frameworks, so it is questionable whether small and loosely coupled teams, as are typical in open source software development, can compete with major vendors. Here, we focus on a central part of service-oriented software systems, namely process engines for service orchestration, and compare open source and proprietary solutions. Using the Web Services Business Process Execution Language (BPEL), we compare the standard conformance of eight engines and its impact on language expressiveness in terms of workflow pattern support. The results show that, although the top open source engines are on par with their proprietary counterparts, proprietary engines perform better in general.
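
For context on the kind of artifacts such an evaluation deploys, the following is a minimal, illustrative sketch (not taken from the paper) of a BPEL 2.0 process exercising the Exclusive Choice control-flow pattern via if/else. The partner links, operations, and variables are hypothetical, and the WSDL, partnerLink, and variable declarations a conformant engine would require for deployment are omitted.

    <!-- Illustrative sketch only (hypothetical names): a minimal BPEL 2.0
         process using the Exclusive Choice control-flow pattern.
         Deployment artifacts (WSDL, partnerLink and variable declarations)
         are omitted for brevity. -->
    <process name="ExclusiveChoiceSample"
             targetNamespace="http://example.org/bpel/sample"
             xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable">
      <sequence>
        <!-- Instantiate the process upon receiving a request -->
        <receive partnerLink="client" operation="submitOrder"
                 variable="request" createInstance="yes"/>
        <!-- Exclusive Choice: exactly one of the two branches is executed -->
        <if>
          <condition>$request.payload/amount &gt; 1000</condition>
          <invoke partnerLink="approval" operation="manualApproval"
                  inputVariable="request" outputVariable="response"/>
          <else>
            <invoke partnerLink="approval" operation="autoApproval"
                    inputVariable="request" outputVariable="response"/>
          </else>
        </if>
        <!-- Reply to the original caller -->
        <reply partnerLink="client" operation="submitOrder" variable="response"/>
      </sequence>
    </process>

A conformance benchmark of the kind described in the abstract deploys processes like this one on each engine and checks whether the behavior prescribed by the BPEL standard is actually produced.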

Keywords

open source · SOA · BPEL · patterns · conformance testing


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Simon Harrer¹
  • Jörg Lenhard¹
  • Guido Wirtz¹

  1. Distributed Systems Group, University of Bamberg, Germany
