Abstract
Organisations, governments and users are increasingly vulnerable to cyber-attacks, whose effects may range from undermining the foundations of the modern information society to causing catastrophic failure of nation-wide critical infrastructure. As a result, a great deal of effort has been devoted to mitigating the risks stemming from cyber-attacks, an effort that has established the field of cyber-security. To defend against cyber-attacks, defenders must foresee threat actors’ actions sufficiently far in advance to implement protective measures. This interdependency places anticipatory practice at the heart of cyber-security. Despite the importance of these activities, little work exists that examines the details of anticipatory practices in cyber-security.
This chapter explores how a forward-looking stance, and the use of that stance to effect change in the present, are a key concept and goal of cyber-security. The literature on anticipation in cyber-security is reviewed and its importance illustrated through a detailed case study. In this study, we draw upon empirical accounts of five cyber-threat analysts’ day-to-day practices to explore how futures are envisioned and used in the attempt to protect cyber-space.
We find that anticipation of the future is, under different names, a well-considered stance in cyber-security that attracts attention from practitioners and theorists alike. The practices we uncovered proved to be highly anticipatory in nature. Defenders take an “attack attitude”, aiming to envision possible future attack behaviours in order to inform their defensive responses. Analysts engage in external and internal knowledge-acquisition activities to obtain knowledge about the attack and defence space and to anticipate the future. Yet obtaining this knowledge presents a major challenge due to the ambiguity and sheer volume of the information available. We conclude that improving cyber-defenders’ anticipatory capabilities may enhance the overall sense-making process and improve decision-making.
Acknowledgments
This work was funded through an EPSRC Industrial CASE Award (number OUCL/2013/JMA).
Copyright information
© 2017 Springer International Publishing AG
Cite this entry
Ahrend, J.M., Jirotka, M. (2017). Anticipation in Cyber-Security. In: Poli, R. (Ed.), Handbook of Anticipation. Springer, Cham. https://doi.org/10.1007/978-3-319-31737-3_26-1
Print ISBN: 978-3-319-31737-3
Online ISBN: 978-3-319-31737-3
eBook Packages: Springer Reference Religion and Philosophy; Reference Module Humanities and Social Sciences