Validating Converted Java Code via Symbolic Execution

Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 269)


The testing approach described here has grown out of migration projects aimed at converting procedural COBOL or PL/1 programs into object-oriented Java code. The code conversion itself is now automated, but not completely: the human reengineer still has to adjust the automatically generated code, which can introduce errors, and subtle errors may also lurk in the automated transformation itself. The converted code must therefore be tested to prove that it is functionally equivalent to the original. Until now, converted programs have been tested manually and their results compared, a very labor-intensive approach. Moreover, it shows only which results differ, not where the code differs; tracing differences in the results back to differences in the code can be extremely difficult. Such regression testing drives up the cost of migration, leading users to disregard this alternative: if they have to spend so much on testing a conversion, they might as well redevelop the software. This paper describes how converted code can be validated at much lower cost by executing it symbolically and comparing the execution paths. The theory behind this approach is that no matter how statements are statically reordered, dynamically they must still execute in the same sequence to produce the same result.
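The core idea can be sketched in a few lines of Java. The example below is a minimal illustration, not the paper's actual tool: all class and method names (`TraceCompare`, `originalVersion`, `convertedVersion`, `step1`, `step2`) are invented for this sketch. Each version of a computation logs its semantically relevant operations as it runs; if the logs are equal, the two versions executed the same dynamic sequence even though their statements are statically arranged differently.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of trace-based equivalence checking (names invented,
// not from the paper). Both versions append each significant operation to a
// trace; equal traces indicate the same dynamic execution path.
public class TraceCompare {

    // "Original" program: statements inline, in source order.
    static List<String> originalVersion(int x) {
        List<String> trace = new ArrayList<>();
        int doubled = x * 2;
        trace.add("double=" + doubled);
        int result = doubled + 1;
        trace.add("add1=" + result);
        return trace;
    }

    // "Converted" program: the same steps statically reordered into
    // extracted methods, as an automated restructuring might produce.
    static List<String> convertedVersion(int x) {
        List<String> trace = new ArrayList<>();
        step2(step1(x, trace), trace);
        return trace;
    }

    static int step1(int x, List<String> trace) {
        int doubled = x * 2;
        trace.add("double=" + doubled);
        return doubled;
    }

    static int step2(int doubled, List<String> trace) {
        int result = doubled + 1;
        trace.add("add1=" + result);
        return result;
    }

    public static void main(String[] args) {
        // Despite the static reordering, the dynamic operation sequences match.
        boolean equivalent = originalVersion(5).equals(convertedVersion(5));
        System.out.println(equivalent ? "paths match" : "paths diverge");
    }
}
```

A mismatch in the traces points directly at the first diverging operation, which is what makes this cheaper to diagnose than comparing final results alone.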


Keywords: Code conversion · Object-oriented migration · Functional equivalence · Source code animation · Symbolic execution · Dynamic comparison · Verification paths



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Technical University of Dresden, Dresden, Germany
  2. Department of Computer Science, VU Amsterdam, Amsterdam, The Netherlands
