Safeguard would ensure that humans, not robots, control the nuclear launch process
Washington (December 18, 2024) – Senator Edward J. Markey (D-Mass.) and Congressman Ted Lieu (CA-36), a member of the House Foreign Affairs Committee, released the following statement after key provisions of their Block Nuclear Launch by Autonomous Artificial Intelligence Act were included in legislation to be signed by President Joe Biden. The provisions would safeguard the nuclear command and control process by preventing the use of artificial intelligence (AI) to make nuclear launch decisions.
“In this rapidly developing digital age, we need to ensure that humans alone hold the power to order the launch of nuclear weapons – not robots or computers. We are encouraged that key provisions of the Block Nuclear Launch by Autonomous Artificial Intelligence Act will become policy and will, therefore, allow us to safeguard human control over nuclear weapons. As we increasingly adopt AI technology into military planning and decisions, we must draw a firm line that prevents any chance of accidental nuclear war caused by AI. There is no question: humans must be in the loop of any nuclear launch.”
The Department of Defense’s 2022 Nuclear Posture Review states that current policy is, in all cases, to “maintain a human ‘in the loop’ for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment.” The Block Nuclear Launch by Autonomous AI Act would codify the Department’s existing policy. Furthermore, the National Security Commission on Artificial Intelligence, established by Congress through the FY19 National Defense Authorization Act, recommended in its final report that the U.S. clearly and publicly affirm its policy that only human beings can authorize the employment of nuclear weapons. This bill follows through on that recommendation.
###