WYSINWYX: What You See Is Not What You eXecute

  • G. Balakrishnan
  • T. Reps
  • D. Melski
  • T. Teitelbaum
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4171)


What You See Is Not What You eXecute: computers do not execute source-code programs; they execute machine-code programs that are generated from source code. Not only can the WYSINWYX phenomenon create a mismatch between what a programmer intends and what is actually executed by the processor, it can also cause analyses performed on source code to fail to detect certain bugs and vulnerabilities. This issue arises regardless of whether one's favorite approach to assuring that programs behave as desired is based on theorem proving, model checking, or abstract interpretation.



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • G. Balakrishnan (1)
  • T. Reps (1, 2)
  • D. Melski (2)
  • T. Teitelbaum (2)
  1. Comp. Sci. Dept., University of Wisconsin, USA
  2. GrammaTech, Inc., USA