Abstract
Trust in autonomous systems, especially weapon systems, may be the most difficult technological challenge facing defense industries, militaries, politicians, and the public, because the underlying algorithms must be trusted. Moreover, the operator, the military, the defense industry, politicians, and the public all need to trust that such systems will follow ethical and legal rules. This paper describes the trust considerations, concerns, and constraints surrounding autonomous weapon systems and concludes with a brief survey of current development programs and projects across the various US military services.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Lailari, G. (2022). Human and Machine Trust Considerations, Concerns and Constraints for Lethal Autonomous Weapon Systems (LAWS). In: Ahram, T., Taiar, R. (eds) Human Interaction, Emerging Technologies and Future Systems V. IHIET 2021. Lecture Notes in Networks and Systems, vol 319. Springer, Cham. https://doi.org/10.1007/978-3-030-85540-6_1
Print ISBN: 978-3-030-85539-0
Online ISBN: 978-3-030-85540-6
eBook Packages: Engineering (R0)