Journal of Hardware and Systems Security, Volume 2, Issue 4, pp 333–344

Detecting Hardware Trojans Inserted by Untrusted Foundry Using Physical Inspection and Advanced Image Processing

  • Nidish Vashistha
  • M. Tanjidur Rahman
  • Haoting Shen
  • Damon L. Woodard
  • Navid Asadizanjani
  • Mark Tehranipoor


Hardware Trojans are malicious modifications made to the design of integrated circuits (ICs) at different stages of the design and fabrication process. Different approaches have been developed to detect Trojans, namely non-destructive and destructive testing. However, none of the previously developed methods can detect all types of Trojans, as each suffers from disadvantages such as low detection speed, low accuracy, low confidence level, or poor coverage of Trojan types. The majority of hardware Trojans implemented in an IC leave a footprint at the active layer. In this paper, we propose a new technique based on rapid backside SEM imaging and advanced computer vision algorithms to detect any subtle change at the active region of transistors that may indicate the existence of a hardware Trojan. Here, we are concerned only with the untrusted foundry problem, where it is assumed that the defender has access to a golden layout/image of the IC. This is a common threat model for organizations that fully design their ICs but must rely on an untrusted foundry for fabrication. An SEM image from a backside-thinned golden IC is compared with a low-quality SEM image of an IC under authentication (IUA). We apply image processing to both the golden IC and IUA images to remove noise. We have developed a computer vision-based framework that detects hardware Trojans based on structural similarity. The results demonstrate that our technique is effective at detecting Trojans and significantly faster than full-chip reverse engineering. A major advantage of our technique is that it does not rely on the functionality of the circuit; rather, it examines the real physical structure to detect malicious changes made by the untrusted foundry.
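The pipeline the abstract describes, denoising both SEM images and then scoring their structural similarity, can be sketched roughly as follows. This is a hedged illustration only, using a simple 3x3 mean filter and a single global SSIM score in the formulation of Wang et al. (2004) on toy 8-bit data; all function names, constants, and images below are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the paper's actual framework): denoise a "golden"
# SEM image and an image under authentication (IUA), then compare them with a
# global structural-similarity (SSIM) score; a low score flags a candidate
# Trojan region. Toy data and all names here are assumptions.

def mean_filter(img):
    """3x3 mean filter as a simple stand-in for SEM noise removal."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def ssim(a, b, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Global SSIM between two equal-sized grayscale images (Wang et al. 2004)."""
    xs = [p for row in a for p in row]
    ys = [p for row in b for p in row]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((p - mx) ** 2 for p in xs) / n
    vy = sum((p - my) ** 2 for p in ys) / n
    cov = sum((p - mx) * (q - my) for p, q in zip(xs, ys)) / n
    return (((2 * mx * my + c1) * (2 * cov + c2))
            / ((mx * mx + my * my + c1) * (vx + vy + c2)))

# Toy example: an IUA whose active region differs from the golden image
# scores lower than an unmodified copy.
golden = [[30, 30, 200, 200]] * 4
clean_iua = [row[:] for row in golden]
trojan_iua = [[200, 200, 30, 30] if i == 2 else row[:]
              for i, row in enumerate(golden)]

g = mean_filter(golden)
score_clean = ssim(g, mean_filter(clean_iua))
score_trojan = ssim(g, mean_filter(trojan_iua))
print(score_trojan < score_clean)
```

In practice one would compute SSIM over sliding windows rather than globally, so that the score localizes the modified transistors instead of merely flagging the whole image.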


Keywords: Hardware trust · Trojan detection · Image processing · Reverse engineering · Scanning electron microscopy



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Florida Institute for Cyber Security (FICS) Research, Electrical and Computer Engineering Department, University of Florida, Gainesville, USA
