Abstract
For critical infrastructures (CI), technological solutions have so far been the preferred choice. Yet the human component of CI can be the primary cause of events that degrade the resilient performance of a CI system. This chapter introduces a systemic approach that contextualizes cascading dynamics in the vulnerability of both technological and human elements. It then describes the evolution of critical infrastructure and its management, viewed as root causes of cascading effects, and explains the role of the human factor in that process. The chapter highlights why investment in the technological resilience of cyber assets cannot succeed without the integration of the human component. Indeed, consensus is growing among security experts that the weakest link in the security chain is the human being, whether as user, customer, administrator, or manager. Technological progress must be matched, step by step, by improvements in users' and operators' skills and routines, strengthening their ability to improvise and their resilience.
Notes
- 1.
An interesting comparison can be found in “Cataloging the World’s Cyberforces”, The Wall Street Journal, <http://graphics.wsj.com/world-catalogue-cyberwar-tools/>
- 2.
European Commission, DG Home Affairs, <http://ec.europa.eu/dgs/home-affairs/e-library/glossary/index_c_en.htm>
- 3.
See for example, The Editorial Board, “Public Works, Private benefits”, The New York Times, June 9, 2017, https://nyti.ms/2t0Ko2k
- 4.
Consequences of this state of affairs, however, are beginning to be noticed; see, for example, O. Jones, “Why Britain's Trains Don't Run on Time: Capitalism”, The New York Times, The Opinion Pages, April 4, 2017, <https://nyti.ms/2nS4B99>.
- 5.
Making the reactor's core overheat is another story, of course. However, as Stuxnet demonstrated in 2010–2011, it is absolutely possible. What makes the difference, in this case, is whether the attackers have the skills and resources to develop something at the same level as Stuxnet.
- 6.
Several (albeit not all) of these hackers are reportedly from Eastern Europe and Russia.
- 7.
And provided that such users are smart enough to adopt at least basic protections against script-kiddies and average hackers as well.
- 8.
S. Erlanger, D. Bilevsky & S. Chan, “Britain's Health Service, Targeted in Cyberattack, Ignored Warnings for Months”, The New York Times, May 13, 2017, p. A9.
- 9.
It is also worth noting that updates increasingly burden a computer's performance, so that at some point it becomes inevitable to move up to a more powerful (and expensive) machine. That is a clear marketing strategy on the part of the computing industry; but for computers that are usually operated for bookkeeping or other minor chores, spending more is not an option, and so they are left “behind” as long as possible, which makes them extremely vulnerable to such types of attacks.
© 2019 Springer International Publishing AG, part of Springer Nature
Cite this chapter
Giacomello, G., Pescaroli, G. (2019). Managing Human Factors. In: Kott, A., Linkov, I. (eds) Cyber Resilience of Systems and Networks. Risk, Systems and Decisions. Springer, Cham. https://doi.org/10.1007/978-3-319-77492-3_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-77491-6
Online ISBN: 978-3-319-77492-3
eBook Packages: Engineering (R0)