Open to Change: A Theory for Iterative Test-Driven Modelling

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11080)


We introduce open tests to support iterative test-driven process modelling. Open tests generalise the trace-based tests of Zugal et al. to achieve modularity: whereas a trace-based test passes if a model exhibits a particular trace, an open test passes if a model exhibits a particular trace up to abstraction from additional activities not relevant for the test. This generalisation aligns open tests better with iterative test-driven development: open tests may survive the addition of activities and rules to the model in cases where trace-based tests do not. To reduce the overhead of re-running tests, we establish sufficient conditions for a model update to preserve test outcomes. We introduce open tests in an abstract setting that applies to any process notation with trace semantics, and give our main preservation result in this setting. Finally, we instantiate the general theory for the DCR Graph process notation, obtaining a method for iterative test-driven DCR process modelling.
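The distinction between trace-based and open tests can be sketched informally as follows. This is an illustrative Python sketch, not the paper's formal definition: the function names, the example activities, and the representation of a test's scope as a plain set are all assumptions made for the example. The key idea is that an open test compares traces only after projecting away activities outside the test's scope, so it survives the addition of a new activity such as "log".

```python
# Illustrative sketch (hypothetical names): a trace-based test demands an
# exact trace, whereas an open test checks the trace only up to abstraction
# from activities outside the test's scope.

def trace_test(trace, expected):
    """Trace-based test: passes only on an exact trace match."""
    return list(trace) == list(expected)

def open_test(trace, expected, scope):
    """Open test: passes if the trace, projected onto the activities
    in `scope`, matches the expected trace."""
    projected = [activity for activity in trace if activity in scope]
    return projected == list(expected)

# A model trace after an iterative update added a new activity "log":
trace = ["apply", "log", "approve", "payout"]
expected = ["apply", "approve", "payout"]

print(trace_test(trace, expected))                      # False: exact match fails
print(open_test(trace, expected, scope=set(expected)))  # True: passes up to abstraction
```

Under this reading, adding activities to the model breaks the trace-based test but not the open test, which is the modularity property the abstract describes.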


Keywords: Test-driven modelling · Abstraction · Declarative · DCR graphs



We are grateful to the reviewers for their help not only to improve the presentation but also to identify interesting areas of future work.


  1. Baeten, J.C.M., van Glabbeek, R.J.: Another look at abstraction in process algebra. In: Ottmann, T. (ed.) ICALP 1987. LNCS, vol. 267, pp. 84–94. Springer, Heidelberg (1987)
  2. Basin, D.A., Debois, S., Hildebrandt, T.T.: In the nick of time: proactive prevention of obligation violations. In: Computer Security Foundations, pp. 120–134 (2016)
  3. Beck, K.: Extreme Programming Explained: Embrace Change. Addison-Wesley Professional, Boston (2000)
  4. Beck, K.: Test-Driven Development: By Example (2003)
  5. Bekendtgørelse af lov om social service [Consolidation Act on Social Services]. Børne- og Socialministeriet, August 2017
  6. Bushnell, D.M.: Research Conducted at the Institute for Computer Applications in Science and Engineering for the Period October 1, 1999 through March 31, 2000. Technical report NASA/CR-2000-210105, NAS 1.26:210105, NASA (2000)
  7. Clarke, E.M., Grumberg, O., Long, D.E.: Model checking and abstraction. ACM Trans. Program. Lang. Syst. 16(5), 1512–1542 (1994)
  8. Cockburn, A.: Agile Software Development, vol. 177. Addison-Wesley, Boston (2002)
  9. Cousot, P., Cousot, R.: Systematic design of program analysis frameworks. In: Proceedings of the 6th ACM SIGACT-SIGPLAN Symposium on Principles of Programming Languages, pp. 269–282. ACM (1979)
  10. Debois, S., Hildebrandt, T.: The DCR workbench: declarative choreographies for collaborative processes. In: Behavioural Types: from Theory to Tools, pp. 99–124. River Publishers, Gistrup (2017)
  11. Debois, S., Hildebrandt, T., Slaats, T.: Hierarchical declarative modelling with refinement and sub-processes. In: Sadiq, S., Soffer, P., Völzer, H. (eds.) BPM 2014. LNCS, vol. 8659, pp. 18–33. Springer, Cham (2014)
  12. Debois, S., Hildebrandt, T.T., Slaats, T.: Replication, refinement & reachability: complexity in dynamic condition-response graphs. Acta Informatica, 1–32 (2017)
  13. Ernst, M.D.: Static and dynamic analysis: synergy and duality. In: ICSE Workshop on Dynamic Analysis, pp. 24–27 (2003)
  14. Hildebrandt, T., Mukkamala, R.R.: Declarative event-based workflow as distributed dynamic condition response graphs. In: Post-proceedings of PLACES 2010. EPTCS, vol. 69, pp. 59–73 (2010)
  15. Hildebrandt, T., Mukkamala, R.R., Slaats, T., Zanitti, F.: Contracts for cross-organizational workflows as timed dynamic condition response graphs. J. Log. Algebr. Program. 82(5–7), 164–185 (2013)
  16. Hull, R., et al.: Introducing the guard-stage-milestone approach for specifying business entity lifecycles. In: Bravetti, M., Bultan, T. (eds.) WS-FM 2010. LNCS, vol. 6551, pp. 1–24. Springer, Heidelberg (2011)
  17. Janzen, D., Saiedian, H.: Test-driven development concepts, taxonomy, and future direction. Computer 38(9), 43–50 (2005)
  18. Marquard, M., Shahzad, M., Slaats, T.: Web-based modelling and collaborative simulation of declarative processes. In: Motahari-Nezhad, H.R., Recker, J., Weidlich, M. (eds.) BPM 2015. LNCS, vol. 9253, pp. 209–225. Springer, Cham (2015)
  19. Mei, H., Hao, D., Zhang, L., Zhang, L., Zhou, J., Rothermel, G.: A static approach to prioritizing JUnit test cases. IEEE Trans. Softw. Eng. 38(6), 1258–1275 (2012)
  20. Object Management Group: Case Management Model and Notation. Technical report formal/2014-05-05, Object Management Group, version 1.0, May 2014
  21. Object Management Group BPMN Technical Committee: Business Process Model and Notation, Version 2.0 (2013)
  22. Pesic, M., van der Aalst, W.M.P.: A declarative approach for flexible business processes management. In: Eder, J., Dustdar, S. (eds.) BPM 2006. LNCS, vol. 4103, pp. 169–180. Springer, Heidelberg (2006)
  23. Pesic, M., Schonenberg, H., van der Aalst, W.M.P.: DECLARE: full support for loosely-structured processes. In: Proceedings of the 11th IEEE International Enterprise Distributed Object Computing Conference, pp. 287–300. IEEE (2007)
  24. Pnueli, A.: The temporal logic of programs. In: 18th Annual Symposium on Foundations of Computer Science, pp. 46–57 (1977)
  25. Schwaber, K., Beedle, M.: Agile Software Development with Scrum, vol. 1. Prentice Hall, Upper Saddle River (2002)
  26. Slaats, T.: Flexible Process Notations for Cross-organizational Case Management Systems. Ph.D. thesis, IT University of Copenhagen, January 2015
  27. Zhang, L., Zhou, J., Hao, D., Zhang, L., Mei, H.: Prioritizing JUnit test cases in absence of coverage information. In: Software Maintenance, pp. 19–28. IEEE (2009)
  28. Zugal, S., Pinggera, J., Weber, B.: The impact of testcases on the maintainability of declarative process models. In: Halpin, T., et al. (eds.) BPMDS/EMMSAD 2011. LNBIP, vol. 81, pp. 163–177. Springer, Heidelberg (2011)
  29. Zugal, S., Pinggera, J., Weber, B.: Creating declarative process models using test driven modeling suite. In: Nurcan, S. (ed.) CAiSE Forum 2011. LNBIP, vol. 107, pp. 16–32. Springer, Heidelberg (2012)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. University of Copenhagen, Copenhagen, Denmark
  2. IT University of Copenhagen, Copenhagen, Denmark
