Electronic parts at Twist Robotics. (Kasia Strek/Panos Pictures for The Washington Post)

The acceleration of drone technology has worried security experts given the growing number of non-state actors that have used UAVs for lethal purposes, including Hezbollah in Lebanon, the Houthis in Yemen, the Islamic State in Iraq and Syria, and Mexico-based drug cartels. But while the cost of building an airplane-size drone like an MQ-9 Reaper is beyond the capabilities of such groups, obtaining and utilizing AI-assisted drone software is not.

“Once that software has been developed, it’s effectively costless for that software to proliferate and be reused elsewhere,” said Paul Scharre, a drone expert at the Center for a New American Security and the author of the book “Four Battlegrounds: Power in the Age of Artificial Intelligence.” “It’s really easy for non-state actors to go online, obtain the software and repurpose it.”

Major military powers have long grappled with the ethics of allowing machines to use lethal force in combat. President Biden’s top military adviser, Gen. Mark A. Milley, has said the United States requires that “humans” remain in the “decision-making loop,” and recently called on other major militaries to adopt the same standards.

The new targeting technology still requires the human operator to select the target, said Kovalchuk, whose drone company also uses the AI software. But once the selection happens, the drone pursues the target and releases the munition — resulting in a gap between the human decision and the lethal act.

Ukrainians who have tested the new software insist that the machine’s role is limited and “acceptable,” Kovalchuk said. “We’re not targeting civilians,” he said. “And we consider a mistake of five to 10 meters acceptable.”

Fedorov conceded that the spread of AI technology represents a “threat to the future,” but underscored that Kyiv must prioritize its immediate fight for survival.