Abstract
We know from Chap. 4 that contemporaneous risk factors have an exponential effect on the magnitude of the likelihood component of risk. Furthermore, we learned in Chap. 7 that the presence of multiple risk factors introduces uncertainty in security risk management. The magnitude of uncertainty can be quantified if the individual risk factors are variables with finite variance. Intuitively, this source of uncertainty is eminently reasonable: identifying, monitoring and addressing one risk factor is less difficult than managing two, which in turn is less onerous than three, and so on.
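The abstract's claim can be illustrated numerically: for independent risk factors with finite variance, the variance of their combined effect is the sum of the individual variances, so the magnitude of uncertainty grows with each factor added. A minimal sketch (the factor variances below are hypothetical illustrations, not values from the text, and independence is assumed):

```python
import math

# Hypothetical variances for three independent risk factors.
factor_variances = [0.4, 0.9, 0.25]

# For independent random variables, variances add, so the standard
# deviation (the "magnitude of uncertainty") grows as factors accumulate.
running_variance = 0.0
for i, var in enumerate(factor_variances, start=1):
    running_variance += var
    print(f"{i} factor(s): variance = {running_variance:.2f}, "
          f"std dev = {math.sqrt(running_variance):.2f}")
```

The point is qualitative: more contemporaneous risk factors means a wider spread of outcomes, hence more uncertainty to manage.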
Notes
1. The italicized version of information is intended as a bit of foreshadowing. The reason for the italics will become apparent in the discussion on entropy.
2. Piantadosi, S. T., Zipf's word frequency in natural language: A critical review and future directions. Psychon Bull Rev. 2014 Oct; 21(5): 1112–1130.
3. F. Reif, op. cit.
4. Claude Shannon, 1916–2001, an American mathematician, electrical engineer and cryptographer.
5. C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, July and October 1948.
6. Information entropy should not be confused with the entropy of statistical mechanics. The two concepts have a relationship, which can be informally summarized as "There is no such thing as a free lunch." See J. R. Pierce, An Introduction to Information Theory: Symbols, Signals and Noise, "Information Theory and Physics" (Chap. 10), Dover, Second Edition, 1980.
7. Thermodynamic entropy, often designated as S, is proportional to the logarithm of the number of states Ω available to a system: S = k ln(Ω), where k is Boltzmann's constant and ln is the natural logarithm.
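Informally, thermodynamic entropy and Shannon's information entropy share the same logarithmic form (a parallel, not an identity; see note 6). A side-by-side sketch, taking the special case of M equally likely outcomes for the information-theoretic side:

```latex
S = k \ln \Omega
  \qquad \text{(thermodynamic entropy; } \Omega \text{ = number of available states)}

H = \log_2 M
  \qquad \text{(information entropy of } M \text{ equally likely outcomes, in bits)}
```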
8. For those readers requiring a math refresher, a logarithm corresponds to an exponent. For example, the number 100 can be written as 10 × 10 = 10^2, where 10 is the base and 2 is the exponent. Therefore, the logarithm of 100 equals 2 in base 10. Analogously, the number 8 can be written as 2 × 2 × 2 = 2^3. Therefore, the logarithm of 8 equals 3 in base 2.
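The refresher above can be checked directly with Python's standard library; a minimal sketch:

```python
import math

# 100 = 10^2, so the base-10 logarithm of 100 is 2.
log10_of_100 = math.log10(100)

# 8 = 2^3, so the base-2 logarithm of 8 is 3.
log2_of_8 = math.log2(8)

print(log10_of_100, log2_of_8)
```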
9. Pierce, John R., An Introduction to Information Theory: Symbols, Signals and Noise, Dover, Second Edition, 1980.
10. We use M to designate the number of risk factors in complexity threat scenarios only. Otherwise, the number of risk factors is designated as R.
11. H could be minimized by ensuring either that all risk factors were managed or that all were unmanaged; the two conditions are equivalent from an information-theoretic perspective. Recall that it is knowledge of the state of a risk factor in a threat scenario that affects information entropy, not the specific outcome or result. However, if one could ensure that all risk factors were managed, and this were known a priori, it would surely represent an ideal security risk management condition.
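The point in note 11 can be made concrete with the binary entropy function: for a risk factor whose managed/unmanaged state is characterized by probability p of being managed, H(p) = −p log₂ p − (1 − p) log₂(1 − p) is zero at both p = 0 and p = 1 (state known with certainty, in either direction) and maximal at p = 1/2. A minimal sketch (the function name and the probabilities tried are illustrative, not from the text):

```python
import math

def binary_entropy(p):
    """Shannon entropy (in bits) of a two-state risk factor,
    where p is the probability the factor is in the managed state."""
    if p in (0.0, 1.0):
        return 0.0  # state known with certainty: zero uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Certainty in either direction minimizes H; maximal ignorance maximizes it.
print(binary_entropy(0.0))  # all factors known unmanaged -> H = 0
print(binary_entropy(1.0))  # all factors known managed   -> H = 0
print(binary_entropy(0.5))  # fifty-fifty -> one full bit of uncertainty
```

This mirrors the note: the two certain conditions are informationally equivalent (both give H = 0), even though only the all-managed case is desirable from a security standpoint.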
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this chapter
Young, C.S. (2019). Threat Scenario Complexity. In: Risk and the Theory of Security Risk Assessment. Advanced Sciences and Technologies for Security Applications. Springer, Cham. https://doi.org/10.1007/978-3-030-30600-7_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-30599-4
Online ISBN: 978-3-030-30600-7
eBook Packages: Physics and Astronomy (R0)