Notes
Exemplary of a chaotic environment is the ongoing war in Ukraine, where numerous factors influence the (perception of) war: actors ranging from proxies, mercenaries, and state actors to multidomain military fighting; financial and military sanctions imposed by the EU on Russia; social media platforms spreading (fake) content; UN and NATO involvement; and daily podcasts from local Ukrainian fighters reporting on events.
References
Adler, P. (2001). Market, hierarchy, and trust: the knowledge economy and the future of capitalism. Organization Science.
Bray, W., & Moore, D. (2021). The Department of the Navy’s commitment to Big Data, AI, and machine learning. Annapolis, MD: Naval Institute Press.
Dekker, H. (2004). Control of inter-organizational relationships: evidence on appropriation concerns and coordination requirements. Accounting, Organizations and Society.
Del Monte, L. (2019). Genius Weapons: Artificial Intelligence, Autonomous Weaponry, and the Future of Warfare. New York: Prometheus Books.
Economic and Social Commission for Asia and the Pacific (ESCAP) (2009). What is Good Governance? United Nations. Consulted on 2 November 2022: https://www.unescap.org/sites/default/files/good-governance
Grandori, A. (2000). Conjectures for a New Research Agenda on Governance. Journal of Management and Governance.
Grandori, A. (2006). Innovation, uncertainty and relational governance. Industry and Innovation.
Gray, M., & Ertan, A. (2021). Artificial Intelligence and Autonomy in the Military: An Overview of NATO Member States. Tallinn: NATO Cooperative Cyber Defence Centre of Excellence.
Gulati, R., & Singh, H. (1998). The architecture of cooperation: managing cooperation costs and appropriation concerns in strategic alliances. Administrative Science Quarterly.
High-Level Expert Group on Artificial Intelligence (2019). Ethics Guidelines for Trustworthy AI. Consulted on 11 May 2019: https://ec.europa.eu/futurium/en/ai-alliance-consultation/guidelines#Top
Kale, P., Singh, H., & Perlmutter, H. (2000). Learning and protection of proprietary assets in strategic alliances: building relational capital. Strategic Management Journal.
McChrystal, S., et al. (2015). Team of teams: new rules of engagement in a complex world. USA: Penguin Group.
Provan, K., & Kenis, P. (2008). Modes of network governance. Journal of Public Administration Research and Theory.
Scharre, P. (2018). Army of None: Autonomous weapons and the future of war. New York: W.W. Norton & Company.
Scharre, P. (2021). The Navy at a crossroads: the uneven adoption of Autonomous Systems in the Navy. Annapolis, MD: Naval Institute Press.
Sullivan, P., & The Oceanit Team. (2021). Theory and conceptual history of Artificial Intelligence. Annapolis, MD: Naval Institute Press.
Swift, S., & Siordia, A. (2021). Mission Command and the speed of decision: what Big Data, Artificial Intelligence, and machine learning should do for the Navy. Annapolis, MD: Naval Institute Press.
Tomkins, C. (2001). Interdependencies, trust and information in relationships, alliances and networks. Accounting, Organizations and Society.
Uzzi, B. (1997). Social structure and competition in interfirm networks: the paradox of embeddedness. Administrative Science Quarterly.
Verdiesen, I. (2019). The design of human oversight in autonomous weapon systems. International Joint Conferences on Artificial Intelligence. Consulted on 10 October 2022: http://resolver.tudelft.nl/uuid:01bb2cf4-38f2-4aa1-ac79-fe13325207eb
Verdiesen, I., Santoni de Sio, F., & Dignum, V. (2020). Accountability and Control over Autonomous Weapon Systems: A Framework for Comprehensive Human Oversight. Minds & Machines. Consulted on 1 November 2022: https://doi.org/10.1007/s11023-020-09532-9
Yeung, K. (2019). Responsibility and AI: a study of the implications of advanced digital technologies (including AI systems) for the concept of responsibility within a human rights framework, commissioned report DGI. Birmingham Law School.
About this article
Cite this article
Polderman, L. Governing (ir)responsibilities for future military AI systems. Ethics Inf Technol 25, 18 (2023). https://doi.org/10.1007/s10676-023-09694-x