Hello, I am back! If Brad left to fill up his drink, we can pretend I went home to take a nap and came back to the party.
Before I discuss my thoughts on your last post, let me congratulate my party partner for his appearance in Business Insider’s Top 100 People in AI for 2023! Anyone who reads KRCP can understand why Brad has a place on this list – again, huge congrats!
Okay, now for the good stuff. The takeaways from the Economist article are helpful and not too surprising. I was intrigued by your first impressions about the willingness of Ukraine or Russia to use lethal autonomous systems. I’ve been thinking through the constraints Ukraine may indeed be facing, particularly in employing lethal autonomy. However, Ukraine faces far greater constraints from domestic politics (in the US and Europe) than from any technological limitation.
There have been recent claims that Ukraine is using ML in attacks (we’ll get into that), and it is worth considering what kind of backlash has already come from this – not necessarily from the US, but certainly from European partners and supporters. Let’s consider three things: the current technology, operational environments, and political support.
In October, Forbes reported that Ukraine is using Saker Scout drones (made in Ukraine) which use AI to “identify and attack 64 types of Russian ‘military objects’ [including tanks, personnel carriers, and other hardware] on their own, operating in areas where radio jamming blocks communication and prevents other drones from working.” Saker Scout drones can carry three kilos of bombs to a range of roughly 12 km. According to Forbes, the system is “updated on demand when there is a requirement to detect a specific new object or vehicle type.”
The Saker Scout drones have the capability to carry out strikes autonomously, and according to the Forbes report, Ukraine has already conducted such strikes in fully autonomous mode, targeting Russian hardware on a small scale – most likely when radio interference or jamming prevents direct communication with an operator. The baseline capabilities are not necessarily new, but there are a few things to talk through here.
My initial instinct was that the US would have had a stronger response to these types of attacks – and, indeed, perhaps it has behind closed doors. But publicly, there has been no response. There could be multiple reasons for that. For one, the reporting on these attacks dates to early October, and we all know that US attention shifted to hostilities in Gaza. Second, according to the Forbes reporting, the drones have been used in autonomous mode only for small-scale attacks. Of course, there is no information beyond that, but perhaps these systems were used in conditions that did not raise any (or many) eyebrows.
In the abstract, it is not difficult to imagine AI-enabled autonomous systems carrying out bigger attacks in the future. From a legal perspective, there are many questions about the ‘64 military objects’ list that is continually updated to include more objects. This certainly demonstrates the real-time concerns that many lawyers have about how legal standards or concepts are incorporated into AI-enabled autonomous systems. I would expect a big US response if these systems suddenly started conducting indiscriminate attacks after one of those continuous updates. Perhaps more on that in another post, to put some of the legal academic thought into concrete examples.
Also, the operating environment matters a great deal to the future use of these systems. In nearly every event I have participated in (including the recent event in Newport – more on that later), there are significantly fewer controversies or concerns about using these systems in sparsely populated spaces (sea, desert…) than in an urban environment. Since we have minimal information about where the Saker drones were used autonomously, we can’t untangle that web quite yet. But I assume most here will agree that the where will matter for future use and will create less of a stir among Western sponsors of Ukraine.
As we have all seen, Ukraine is struggling to generate support these days, a shortfall driven far more by domestic politics and heightened attention to hostilities in Gaza. As Congress sorts out the future of US aid to Ukraine, a $52 billion support plan for Ukraine was blocked in the EU by Hungary. Meanwhile, as of mid-November, the DOD had spent 97% of the funds allocated by Congress for assistance to Ukraine. It may well be that aid will soon end or be extremely limited. Other watchdogs (human rights groups, for example) may be far more concerned with the use of autonomous weapons than anyone else at this point.
Certainly, I am not nearly as much of an insider as you or many of our readers. But given the enormous discourse devoted to AI, lethal autonomous weapons, and the future of warfare, it is striking that many of these capabilities (especially ML) are being used on a battlefield and the world responds with a shrug. Killer robot fatigue? Short attention span? Greater attention on Israel and Hamas? Not enough reporting?
Who knows. But certainly, forums like this are always helpful, perhaps necessary, to continue tracking the technological, political, and operational evolution of these capabilities. Perhaps global comfort levels with lethal autonomous weapons will adjust not through an overnight shift and the sudden reality of these systems, as I once imagined, but through a slow and steady introduction with very little public indignation, and even a general shrug.
Well said. Two comments:
- AI-assisted capabilities have, as I understand it, greatly accelerated target development, at least for longer-range strikes. Yet the limitation has been a lack of ammunition more than anything else. And from all I have heard, not much core AI work is happening with tactical drones at the front lines.
- AI aside, I listened to two fascinating presentations by senior Ukrainian officials at last week's Association of Old Crows annual conference. As many of us suspected would be the case with the "small drone wars," what started as a distinct tactical advantage for Ukraine is becoming less of one now that Russia is catching up. If wars show us anything, it is the inevitability of "cat and mouse" developments on the battlefield. Advantage, counter, counter-counter, counter-counter-counter, and so on.
Yet in today's digital environment, speed of adaptation and agility matter enormously. And that is where Ukraine has maintained, and will maintain, a distinct advantage. Of course, they need resources. Even at a few hundred bucks apiece, maintaining a 10,000+ drone force requires sustained funding.
AI far away?
It’s shrugs without feelings of fear.
Nearby opens our eyes.