Testing the consistency of business data objects using extended static testing of CRUD matrices

  • Miroslav Bures
  • Tomas Cerny
  • Karel Frajtak
  • Bestoun S. Ahmed

Abstract

Static testing detects software defects in the early phases of the software development lifecycle, which lowers the total cost caused by defects and reduces the risk of the software development project. Various types of static testing have been introduced and are used in software projects. In this paper, we focus on static testing related to data consistency in a software system. In particular, we propose extensions to contemporary static testing techniques based on CRUD matrices, employing cross-verifications between different types of CRUD matrices created by different parties at various stages of the software project. In the experiments we performed, the proposed static testing technique significantly improved the consistency of Data Cycle Test cases. Along with this trend, we observed a growing potential of the test cases to detect data consistency defects in the system under test when the proposed technique is employed.
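To make the cross-verification idea concrete, the following minimal Python sketch is illustrative only and is not taken from the paper: it assumes a CRUD matrix can be represented as a mapping from (process, entity) pairs to the set of C/R/U/D operations, compares two matrices built independently by different parties, and flags entities whose aggregated life cycle lacks Create, Read, or Delete. All names, the matrix representation, and the exact checks are assumptions made for this example.

```python
# Illustrative sketch of CRUD-matrix cross-verification (not the paper's
# actual notation). A matrix maps a (process, data_entity) pair to the set
# of operations {'C', 'R', 'U', 'D'} the process performs on the entity.

CrudMatrix = dict[tuple[str, str], set[str]]

def cross_verify(analyst: CrudMatrix, developer: CrudMatrix) -> list[str]:
    """Report cells where two independently built CRUD matrices disagree."""
    findings = []
    for cell in sorted(analyst.keys() | developer.keys()):
        a_ops = analyst.get(cell, set())
        d_ops = developer.get(cell, set())
        if a_ops != d_ops:
            findings.append(
                f"{cell}: analyst={sorted(a_ops)} vs developer={sorted(d_ops)}"
            )
    return findings

def lifecycle_gaps(matrix: CrudMatrix) -> list[str]:
    """Flag entities whose aggregated operations miss Create, Read, or
    Delete, i.e. potential data-consistency defects in the data life cycle."""
    by_entity: dict[str, set[str]] = {}
    for (_, entity), ops in matrix.items():
        by_entity.setdefault(entity, set()).update(ops)
    return [
        f"{entity}: missing {sorted({'C', 'R', 'D'} - ops)}"
        for entity, ops in sorted(by_entity.items())
        if not {'C', 'R', 'D'} <= ops
    ]

if __name__ == "__main__":
    # Hypothetical matrices from two parties for a small ordering system.
    analyst = {("CreateOrder", "Order"): {"C"}, ("ViewOrder", "Order"): {"R"}}
    developer = {("CreateOrder", "Order"): {"C", "U"},
                 ("ViewOrder", "Order"): {"R"}}
    print(cross_verify(analyst, developer))  # disagreement on CreateOrder/Order
    print(lifecycle_gaps(developer))         # Order is never deleted -> gap
```

In this toy run, the cross-check surfaces a disagreement about whether CreateOrder also updates Order, and the life-cycle check reports that no process ever deletes Order, the kind of inconsistency the proposed technique is intended to expose before Data Cycle Test cases are derived.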

Keywords

Static testing · Data consistency testing · Data Cycle Test · CRUD matrix


Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Department of Computer Science, FEE, Czech Technical University in Prague, Prague, Czech Republic
  2. Department of Computer Science, ECS, Baylor University, Waco, USA
  3. College of Engineering, Salahaddin University-Erbil, Kurdistan, Iraq
