Shadow Testing for Business Process Improvement
A fundamental assumption of improvement in Business Process Management (BPM) is that redesigns deliver refined and improved versions of business processes. These improvements can be validated online through sequential experiment techniques such as AB testing, as we have shown in earlier work. However, such approaches carry the inherent risk of exposing customers to an inferior process version during the early stages of the test. This risk can be managed by offline techniques like simulation, but offline techniques cannot validate the improvements because there is no user interaction with the new versions. In this paper, we propose a middle ground through shadow testing, which avoids the downsides of both simulation and direct execution. In this approach, a new version is deployed and executed alongside the current version, but in such a way that the new version is hidden from customers and process workers. Copies of user requests are partially simulated and partially executed by the new version as if it were running in production. We present an architecture, algorithm, and implementation of the approach, which isolates new versions from production, facilitates fair comparison, and manages the overhead of running shadow tests. We demonstrate the efficacy of our technique by evaluating the executions of synthetic and realistic process redesigns.
Keywords: Shadow testing · Business process management · DevOps · Live testing
The work of Claudio Di Ciccio and Jan Mendling has received funding from the EU H2020 programme under MSCA-RISE agreement 645751 (RISE_BPM).
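The core routing idea described in the abstract — serve the customer from the current version while a hidden copy of each request exercises the new version — can be sketched as follows. This is a minimal illustration, not the paper's implementation; all names (`ProcessEngine`, `handle_request`) are hypothetical stand-ins for a BPMS engine and its request dispatcher.

```python
import threading

class ProcessEngine:
    """Toy stand-in for a process engine executing one process version."""
    def __init__(self, version):
        self.version = version
        self.metrics = []  # outcomes recorded for offline comparison

    def execute(self, request):
        outcome = f"{self.version}:{request}"
        self.metrics.append(outcome)
        return outcome

def handle_request(request, production, shadow):
    """Serve the customer from production; run a hidden copy on the shadow.

    The shadow execution runs concurrently and its result is discarded,
    so customers and process workers never observe the new version; only
    its recorded metrics are kept for comparison.
    """
    t = threading.Thread(target=shadow.execute, args=(request,))
    t.start()
    result = production.execute(request)  # only this reaches the customer
    t.join()  # sketch only; a real deployment would fire-and-forget
    return result

prod = ProcessEngine("v1")
shad = ProcessEngine("v2")
print(handle_request("order-42", prod, shad))  # -> v1:order-42
```

Because both versions observe identical copies of the same request stream, their recorded metrics can be compared fairly, without the customer exposure risk that AB testing incurs.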