Shadow Testing for Business Process Improvement

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11229)


A fundamental assumption of improvement in Business Process Management (BPM) is that redesigns deliver refined and improved versions of business processes. These improvements can be validated online through sequential experimentation techniques such as AB testing, as we have shown in earlier work. Such approaches carry the inherent risk of exposing customers to an inferior process version during the early stages of the test. This risk can be managed by offline techniques like simulation. However, offline techniques do not validate the improvements, because there is no user interaction with the new versions. In this paper, we propose a middle ground through shadow testing, which avoids the downsides of both simulation and direct execution. In this approach, a new version is deployed and executed alongside the current version, but in such a way that the new version is hidden from customers and process workers. Copies of user requests are partially simulated and partially executed by the new version as if it were running in production. We present an architecture, algorithm, and implementation of the approach, which isolates new versions from production, facilitates fair comparison, and manages the overhead of running shadow tests. We demonstrate the efficacy of our technique by evaluating the executions of synthetic and realistic process redesigns.
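The core routing idea from the abstract can be sketched in a few lines: every customer request is served by the production version, while a copy is replayed against the hidden shadow version and only logged for later comparison. This is a minimal illustrative sketch, not the paper's actual implementation; the class and method names (`ShadowRouter`, `execute`) are assumptions.

```python
import copy


class ShadowRouter:
    """Illustrative sketch of shadow testing: the production version
    serves every request, while a copy of each request is replayed
    against a hidden shadow version whose results are only logged.
    Names and interfaces here are assumptions, not the paper's API."""

    def __init__(self, production, shadow):
        self.production = production
        self.shadow = shadow
        self.log = []  # paired outcomes for offline, fair comparison

    def handle(self, request):
        # The customer only ever sees the production result.
        prod_result = self.production.execute(request)

        # A deep copy of the request is replayed against the shadow
        # version; failures there must never affect the customer.
        try:
            shadow_result = self.shadow.execute(copy.deepcopy(request))
        except Exception as exc:
            shadow_result = ("shadow-error", str(exc))

        self.log.append((request, prod_result, shadow_result))
        return prod_result
```

A real deployment would additionally throttle how many requests are mirrored (to manage overhead) and simulate the parts of the shadow version that must not cause side effects, as the paper's architecture describes.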


Keywords: Shadow testing · Business process management · DevOps · Live testing



The work of Claudio Di Ciccio and Jan Mendling has received funding from the EU H2020 programme under MSCA-RISE agreement 645751 (RISE_BPM).



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Data61, CSIRO, Sydney, Australia
  2. University of New South Wales, Sydney, Australia
  3. Vienna University of Economics and Business, Vienna, Austria
