AI Could be More Dangerous Than Nuclear Weapons

Photo: Mark Wilson (Getty Images)

The authors spend a great deal of time comparing AI’s potential destructive capabilities to those of nuclear weapons. It just so happens Kissinger had a front-row seat to witness, and play a significant role in, the strategic geopolitical decisions surrounding nuclear weapons (specifically, how to prevent a playground full of empire-hungry superpowers from blowing each other to smithereens).

The authors provide a brief history of the two main strategies used to avoid catastrophe: deterrence and disarmament. Fans of Kissinger will know the former hit a bit harder than the latter. Though these two strategies can seem at odds, the authors note they share a key similarity: both rely on the ability to calculate or predict what the other side is thinking. That logic disappears with AI, the authors warn.

“Most traditional military strategies and tactics have been based on the assumption of a human adversary whose conduct and decision-making calculus fit within a recognizable framework or have been defined by experience and conventional wisdom,” the authors write. “Yet an AI piloting an aircraft or scanning for targets follows its own logic, which may be inscrutable to an adversary and unsusceptible to traditional signals and feints—and which will, in most cases, proceed faster than the speed of human thought.”

Though “uncertainty” is part and parcel of warfare, the authors warn that AI introduces a new dimension. What if countries aren’t even aware of their own AI capabilities? “Because AIs are dynamic and emergent, even those powers creating or wielding an AI-designed or AI-operated weapon may not know exactly how powerful it is or exactly what it will do in a given situation,” the authors say.