Static Testing Using Different Types of CRUD Matrices

  • Miroslav Bures
  • Tomas Cerny
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 424)

Abstract

Static testing enables early detection of defects during a software project's development, reducing the costs and risks of the development process. Various types of static tests can be performed. In this paper, we propose extensions to contemporary static testing techniques based on CRUD matrices. In particular, we consider cross-verification between various types of CRUD matrices produced by different parties at different stages of a project, which extends the verification of CRUD matrix consistency. In our evaluation, the proposed techniques led to significantly more consistent Data Cycle Test cases. Moreover, our results indicate a positive impact on lowering the number of defects that would otherwise remain undetected until the system test.
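The cross-verification idea can be illustrated with a minimal sketch (not the authors' implementation; the matrix representation and all names below are illustrative assumptions): each CRUD matrix maps a (process, data entity) pair to the set of claimed operations C, R, U, D, and two matrices produced by different parties are compared cell by cell to surface inconsistencies for review.

```python
def crud_inconsistencies(matrix_a, matrix_b):
    """Return the cells where two CRUD matrices disagree.

    Each matrix maps a (process, entity) pair to a set of CRUD
    operations, e.g. {"C", "R"}. A missing cell counts as an
    empty set, so coverage gaps are also reported.
    """
    diffs = {}
    for cell in set(matrix_a) | set(matrix_b):
        ops_a = matrix_a.get(cell, set())
        ops_b = matrix_b.get(cell, set())
        if ops_a != ops_b:
            diffs[cell] = (ops_a, ops_b)
    return diffs


# Hypothetical example: an analyst's matrix derived from process models
# versus a developer's matrix derived from code inspection.
analyst = {
    ("CreateOrder", "Order"): {"C"},
    ("ViewOrder", "Order"): {"R"},
}
developer = {
    ("CreateOrder", "Order"): {"C", "U"},  # code also updates Order
    ("ViewOrder", "Order"): {"R"},
}

print(crud_inconsistencies(analyst, developer))
# The disagreement on ("CreateOrder", "Order") would be flagged
# for clarification between the two parties.
```

Each reported cell is a candidate defect or documentation gap to be resolved before test design, which is how such cross-checks can improve the consistency of subsequently derived Data Cycle Test cases.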

Keywords

Test Designer · Test Basis · System Under Test · Data Entity · Business Process Model Notation
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer Nature Singapore Pte Ltd. 2017

Authors and Affiliations

  1. Department of Computer Science, FEE, Czech Technical University in Prague, Prague, Czech Republic
  2. Department of Computer Science, Baylor University, Waco, USA