
Multi-Metrics Approach for Security, Privacy and Dependability in Embedded Systems

Published in Wireless Personal Communications

Abstract

Embedded systems have become highly interconnected devices and are the key elements of the Internet of Things. Their main function is to capture, store, manipulate and access data of a sensitive nature. Moreover, being connected to the Internet exposes them to all kinds of attacks, which can have serious consequences. Traditionally, security, privacy and dependability (SPD) have been set aside during the design process and included only as add-on features. This paper provides a methodology, together with a Multi-Metrics approach, to evaluate the SPD level of a system both at design time and during operation. Its main advantages are simplicity, since a single process is applied throughout the whole system evaluation, and scalability, since simple and complex systems are evaluated in the same way. The applicability of the presented methodology is demonstrated by the evaluation of a smart vehicle use case.
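The full text is behind the subscription wall, so the exact Multi-Metrics computation is defined in the paper itself. Purely as an illustration of the kind of evaluation the abstract describes, the sketch below assumes per-component SPD metrics on a 0–100 scale, criticality weights, and a weighted root-mean-square aggregation into a system-level SPD triple; the component names, the scale and the aggregation rule are assumptions for illustration, not the authors' definitions.

```python
# Illustrative sketch only: one possible way to combine per-component
# security/privacy/dependability (SPD) metrics into a system-level triple.
# Scale (0-100), weights and the RMS aggregation are assumptions, not the
# paper's Multi-Metrics definition.

from dataclasses import dataclass
from math import sqrt


@dataclass
class ComponentSPD:
    name: str
    security: float       # assumed 0 (worst) .. 100 (best)
    privacy: float
    dependability: float
    weight: float          # assumed criticality weight of the component


def aggregate_spd(components: list[ComponentSPD]) -> tuple[float, float, float]:
    """Aggregate component metrics into a system-level (S, P, D) triple
    using a weighted root-mean-square; other aggregation rules are possible."""
    total_weight = sum(c.weight for c in components)

    def rms(pairs):
        # pairs: iterable of (value, weight)
        return sqrt(sum(w * v * v for v, w in pairs) / total_weight)

    s = rms((c.security, c.weight) for c in components)
    p = rms((c.privacy, c.weight) for c in components)
    d = rms((c.dependability, c.weight) for c in components)
    return s, p, d


if __name__ == "__main__":
    # Hypothetical smart-vehicle subsystems, loosely echoing the use case
    # mentioned in the abstract; all numbers are invented.
    vehicle = [
        ComponentSPD("telematics unit", security=70, privacy=60, dependability=80, weight=3),
        ComponentSPD("CAN gateway",     security=55, privacy=75, dependability=90, weight=5),
        ComponentSPD("infotainment",    security=40, privacy=50, dependability=70, weight=1),
    ]
    s, p, d = aggregate_spd(vehicle)
    print(f"system SPD level: ({s:.1f}, {p:.1f}, {d:.1f})")
```

Running the script prints a single (S, P, D) triple for the hypothetical subsystems; evaluating a simple component and a composed system with the same function is meant to mirror the single-process, scalable evaluation the abstract claims.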



Acknowledgments

The authors would like to thank their colleagues from the ARTEMIS project nSHIELD for the basics of the methodology and for the ongoing discussions on its applicability. This work is financed in part by the JU ECSEL and the Research Council of Norway.

Author information


Corresponding author

Correspondence to Iñaki Garitano.


About this article


Cite this article

Garitano, I., Fayyad, S. & Noll, J. Multi-Metrics Approach for Security, Privacy and Dependability in Embedded Systems. Wireless Pers Commun 81, 1359–1376 (2015). https://doi.org/10.1007/s11277-015-2478-z

