Empirical Study to Fingerprint Public Malware Analysis Services

  • Álvaro Botas
  • Ricardo J. Rodríguez
  • Vicente Matellán
  • Juan F. García
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 649)


The evolution of malicious software (malware) analysis tools has provided controlled, isolated, and virtual environments in which to analyze malware samples. Several services on the Internet offer users automatic systems for analyzing malware samples, such as VirusTotal, Jotti, or ClamAV, to name a few. Unfortunately, malware currently incorporates techniques to recognize execution in a virtual or sandboxed environment. When an analysis environment is detected, malware behaves like a benign application or even shows no activity at all. In this work, we present an empirical study and characterization of automatic public malware analysis services. In particular, we consider 26 different services. We also show a set of features that allow these services to be easily fingerprinted as analysis environments. Finally, we propose a method to mitigate such fingerprinting.
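To make the fingerprinting idea concrete, the sketch below shows the kind of cheap environment checks a sample can perform at runtime. This is an illustrative assumption, not the paper's measured feature set: the name lists, the one-vCPU threshold, and the MAC-prefix heuristic are stand-ins for the features the study characterizes.

```python
import getpass
import os
import platform
import uuid

# Usernames/hostnames often reported for automated analysis VMs
# (illustrative values, not the paper's feature set).
SUSPICIOUS_NAMES = {"sandbox", "malware", "virus", "sample", "analyst"}

# Publicly registered OUI prefixes of common virtualization vendors
# (VMware, VirtualBox, Xen).
VM_MAC_PREFIXES = ("00:05:69", "00:0c:29", "00:50:56", "08:00:27", "00:16:3e")

def environment_features() -> dict:
    """Collect a few features a sample could inspect at runtime."""
    mac = uuid.getnode()
    mac_str = ":".join(f"{(mac >> s) & 0xff:02x}" for s in range(40, -8, -8))
    return {
        "user": getpass.getuser().lower(),
        "host": platform.node().lower(),
        "cpus": os.cpu_count() or 1,
        "mac": mac_str,
    }

def looks_like_sandbox(f: dict) -> bool:
    """Return True if any cheap heuristic fires."""
    if any(n in f["user"] or n in f["host"] for n in SUSPICIOUS_NAMES):
        return True
    if f["cpus"] <= 1:  # analysis VMs are often provisioned with a single vCPU
        return True
    return f["mac"].startswith(VM_MAC_PREFIXES)

if __name__ == "__main__":
    print(looks_like_sandbox(environment_features()))
```

A mitigation along the lines the paper proposes would randomize or mask exactly these observable features (usernames, hostnames, vCPU counts, MAC prefixes) so that no stable fingerprint survives across analysis runs.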



The research of A. Botas, V. Matellán, and J.F. García was supported by INCIBE according to the rule 19 of the Digital Confidence Plan and by the University of León under contract X43. The research of R.J. Rodríguez was supported in part by the Spanish MINECO project CyCriSec (TIN2014-58457-R), by University of Zaragoza and Centro Universitario de la Defensa project UZCUD2016-TEC-06, and by “Ayudas para estancias de Investigadores visitantes en el CEI Triangular-E3” (hosted by University of León).



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Álvaro Botas (1)
  • Ricardo J. Rodríguez (2)
  • Vicente Matellán (1)
  • Juan F. García (1)
  1. Instituto de Ciencias Aplicadas a la Ciberseguridad, Universidad de León, León, Spain
  2. Centro Universitario de la Defensa, Academia General Militar, Zaragoza, Spain
