Formation of Awareness

Chapter
Part of the Advances in Information Security book series (ADIS, volume 62)

Abstract

Having discussed the importance and key features of CSA, both in general and in comparison with the better-known Kinetic Situational Awareness, we now proceed to explore how and from where CSA emerges. The formation of Cyber Situational Awareness is a complex process that proceeds through a number of distinct phases and produces several distinct outputs. Humans in widely different roles drive this process using diverse procedures and computerized tools. This chapter explores how situational awareness forms within the different phases of the cyber defense process and describes the different roles involved in the lifecycle of situational awareness. The chapter presents an overview of the overall cyber defense process and then identifies several distinct facets of situational awareness in the context of cyber defense. An overview of the state of the art is followed by a detailed description of a comprehensive framework for Cyber Situational Awareness developed by the authors of this chapter. We highlight the significance of five key functions within CSA: learning from attacks, prioritization, metrics, continuous diagnostics and mitigation, and automation.


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

George Mason University, Fairfax, USA