Abstract
Incident management systems have the potential to improve security dramatically but often experience problems stemming from organizational, interpersonal and social constraints that limit their effectiveness. These limits may cause underreporting of incidents, leading to erroneous perceptions of the actual safety and security situation of the organization. The true security situation may be better understood and underreporting may be reduced if underlying systemic issues surrounding security incident management are taken into account. A dynamic simulation, based on the parallel experience of industrial incident management systems, illustrates the cumulative effects of rewards, learning, and retributions on the fate of a hypothetical knowledge management system designed to collect information about events and incidents. Simulation studies are part of an ongoing research project to develop sustainable knowledge and knowledge transfer tools that support the development of a security culture.
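The dynamics the abstract describes can be sketched as a minimal stock-and-flow simulation. This is an illustrative toy model, not the authors' actual simulation: all parameter names and values (reward, retribution, learning rate) are hypothetical, chosen only to show how rewards build willingness to report, retribution erodes it, and the resulting underreporting makes perceived incident levels diverge from actual ones.

```python
# Illustrative sketch (not the paper's model): a minimal system dynamics
# simulation, integrated with Euler's method, of incident reporting under
# competing reward and retribution pressures.

DT = 0.25        # time step (months)
STEPS = 120      # horizon: 30 simulated months

incident_rate = 10.0  # actual incidents per month (hypothetical)
willingness = 0.5     # fraction of incidents employees report (stock, 0..1)
knowledge = 0.0       # accumulated lessons learned from reports (stock)

reward = 0.05         # reporting encouragement, per month (assumed)
retribution = 0.08    # blame/punishment pressure, per month (assumed)
learning_rate = 0.02  # incident reduction per unit of knowledge (assumed)

for _ in range(STEPS):
    reported = incident_rate * willingness
    # Learning from reported incidents gradually reduces the true rate.
    knowledge += reported * DT
    incident_rate = max(2.0, 10.0 - learning_rate * knowledge)
    # Rewards build willingness to report; retribution erodes it.
    willingness += (reward * (1 - willingness) - retribution * willingness) * DT
    willingness = min(1.0, max(0.0, willingness))

print(f"willingness to report:     {willingness:.2f}")
print(f"actual incidents/month:    {incident_rate:.2f}")
print(f"perceived incidents/month: {incident_rate * willingness:.2f}")
```

When retribution outweighs reward, willingness settles near reward / (reward + retribution), so management sees only a fraction of actual incidents: the "erroneous perception" of the security situation the abstract warns about.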
Additional information
When the majority of this work was undertaken, the lead author was at Agder University College (now Agder University) in Norway. The lead author has since moved to Tecnun, University of Navarra, in Spain.
Cite this article
Sveen, F.O., Rich, E. & Jager, M. Overcoming organizational challenges to secure knowledge management. Inf Syst Front 9, 481–492 (2007). https://doi.org/10.1007/s10796-007-9052-5