
Journal in Computer Virology, Volume 2, Issue 4, pp 275–289

Using a virtual security testbed for digital forensic reconstruction

  • André Årnes
  • Paul Haas
  • Giovanni Vigna
  • Richard A. Kemmerer
Original Paper

Abstract

This paper presents ViSe, a virtual security testbed, and demonstrates how it can be used to study computer attacks and suspect tools efficiently as part of a computer crime reconstruction. Based on a hypothesis about the security incident in question, ViSe is configured with the appropriate operating systems, services, and exploits. Attacks are formulated as event chains and replayed on the testbed, and the effects of each event are analyzed to support or refute the hypothesis. The purpose of the approach is to facilitate reconstruction experiments in digital forensics. Two examples demonstrate the approach: an overview example based on the Trojan defense and a detailed example of a multi-step attack. Although a reconstruction can neither prove a hypothesis with absolute certainty nor exclude the correctness of other hypotheses, a standardized environment such as ViSe, combined with event reconstruction and testing, can lend credibility to an investigation and be a great asset in court.
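To make the event-chain idea concrete, the sketch below models a hypothesis as an ordered chain of events, each annotated with the traces it is expected to leave on the testbed, and checks the evidence collected from a replay against those expectations. This is a minimal illustration under assumed conventions, not code from the paper; all class, field, and artifact names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """One step in a hypothesized attack chain (names are illustrative)."""
    name: str
    expected_effects: set[str]  # artifacts this event should leave behind

@dataclass
class Hypothesis:
    """An attack hypothesis expressed as an ordered chain of events."""
    description: str
    chain: list[Event] = field(default_factory=list)

def evaluate(hypothesis: Hypothesis, observed: set[str]) -> dict[str, bool]:
    """For each event, check whether all of its expected effects appear
    in the evidence collected from the testbed replay."""
    return {e.name: e.expected_effects <= observed for e in hypothesis.chain}

if __name__ == "__main__":
    # Toy Trojan-defense hypothesis: a trojan, not the user, downloaded the files.
    h = Hypothesis(
        "Trojan downloaded contraband without user interaction",
        [
            Event("trojan_installed",
                  {"trojan_binary_on_disk", "autostart_registry_key"}),
            Event("trojan_downloads_files",
                  {"downloaded_files", "trojan_network_log"}),
        ],
    )
    # Hypothetical evidence gathered after replaying the chain in the testbed.
    observed = {"trojan_binary_on_disk", "autostart_registry_key", "downloaded_files"}
    for event, supported in evaluate(h, observed).items():
        print(f"{event}: {'supported' if supported else 'not supported'} by replay evidence")
```

In this toy run the first event is supported but the second is not, because the replay produced no network log; in the paper's terms, that discrepancy would weaken this particular hypothesis or prompt its refinement.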

Keywords

Intrusion Detection System, Crime Scene, Event Reconstruction, Digital Evidence, Reconstruction Experiment

Copyright information

© Springer-Verlag France 2006

Authors and Affiliations

  • André Årnes (1)
  • Paul Haas (2)
  • Giovanni Vigna (2)
  • Richard A. Kemmerer (2)
  1. Centre for Quantifiable Quality of Service in Communication Systems, Norwegian University of Science and Technology, Trondheim, Norway
  2. Department of Computer Science, University of California, Santa Barbara, Santa Barbara, USA
