The Association of the United States Army annual conference is a place where defense contractors and technology firms show off their latest wares for ground warfare. This year, arguably the most notable item on display is a joint product of Ghost Robotics and SWORD International, called the Special Purpose Unmanned Rifle, or SPUR.
In plainer terms, SPUR is a robot dog with an assault rifle strapped to its back: quite literally the stuff of a particularly creepy episode of “Black Mirror.”
To be clear, putting guns on small robotic ground vehicles is hardly new. More than a decade ago, the U.S. Army acquired three small, tracked TALON robots upgraded to carry a range of standard infantry weapons (confusingly, the armed version was dubbed SWORDS, though without relation to SWORD International). However, they were apparently never used in combat. A number of other similar programs – in the U.S. and elsewhere – have come and gone in the intervening years without yet producing a meaningful combat capability. On that basis, it is entirely possible that SPUR will remain a trade-show oddity and never transform the battlefield of the future.
Nevertheless, there are two issues here worth unpacking: one of form and one of autonomy.
A few years ago, the webcomic XKCD addressed the form question brilliantly. The fevered science-fictional imagination of 1984’s “Terminator” – a nigh-indestructible robot assassin in the shape of a human (or at least an Arnold Schwarzenegger), hunting its target without pity or fear or remorse – has given way to a reality in which robotic assassins look like windowless airplanes.
But, at least for those of us lucky enough to live outside the drone hunting grounds, the advent of flying robotic killers has not led to a sea change in our view of the use of armed force. After all, even Bernie Sanders would not rule out their use. In short, they have rapidly become normalized.
Perhaps, though, there is something different about the idea of an armed robot that at least superficially resembles a living creature rather than a tooled-up golf cart. Every new Boston Dynamics video of its robots demonstrating newly acquired, humanlike skills is met with protest and warning; by contrast, new capabilities demonstrated by conventional-looking uncrewed aircraft are largely met with silence and indifference.
The other fundamental issue is autonomy. The Army’s abortive SWORDS robots and their like were “remotely operated” rather than meaningfully autonomous. The human operator had direct control over the unit’s functions: push the stick forward and it would move forward; push the fire button and the weapon would discharge. (The nature of remote control does mean that the controls could be jammed or hacked, but that is a separate problem.)
Something like SPUR, however – despite its name suggesting that it will serve merely as a distant appendage to a human infantry soldier – requires a great deal more autonomy to function.
Wheeled or tracked vehicles are limited in their capabilities – climbing stairs, for example, is difficult for them – but they require little processing beyond what the operator provides in order to navigate. A legged platform, on the other hand, can theoretically handle far more challenging terrain, but it needs a huge amount of hardware capability and on-the-fly data processing to avoid simply collapsing into an expensive heap. (This, incidentally, explains why legged robots have until now largely remained the preserve of science fiction rather than reality.) Give a robot the ability to walk and, by definition, you are imbuing it with a level of autonomy an order of magnitude greater than that of a tracked or wheeled gun platform, even if a human still retains final authority over firing its weapon.
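To give a sense of what “on-the-fly data processing” means in practice, here is a deliberately toy sketch, in Python with made-up constants, of the kind of high-rate balance loop a legged machine must run onboard just to stay upright. Real quadrupeds run far more elaborate whole-body controllers, so treat this as an illustration of the principle rather than a description of SPUR or any actual platform.

```python
# Toy illustration only: a single-joint balance loop standing in for the kind of
# continuous, onboard control a legged robot must run to stay upright. All of
# the constants here are made-up placeholders, not values from any real platform.

DT = 0.002           # 2 ms per tick, i.e. a 500 Hz control loop
KP, KD = 120.0, 8.0  # proportional and derivative feedback gains (arbitrary)
GRAVITY_GAIN = 35.0  # crude term modeling gravity pushing the body off upright

def balance_step(angle, angular_velocity):
    """One control tick: sense the tilt, apply a correcting torque, integrate."""
    torque = -KP * angle - KD * angular_velocity          # PD feedback toward upright
    angular_acceleration = GRAVITY_GAIN * angle + torque  # simplified dynamics
    angular_velocity += angular_acceleration * DT
    angle += angular_velocity * DT
    return angle, angular_velocity

# Start slightly tilted; only because the loop keeps running every 2 ms does the
# "robot" settle back toward upright instead of toppling over.
angle, velocity = 0.05, 0.0
for _ in range(2000):  # four simulated seconds
    angle, velocity = balance_step(angle, velocity)

print(f"tilt after 4 seconds: {angle:.5f} rad")
```

The point is not the arithmetic but the tempo: a loop like this has to run hundreds of times a second, on the robot itself, whether or not a human is on the radio – and that onboard decision-making is the seed of the autonomy described above.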
Autonomy tends to be self-reinforcing. A system with the ability to navigate by itself, outside the reach of a human operator’s control unit, will quickly be deemed capable of more ambitious missions. Once it has demonstrated the ability to navigate autonomously, the ability to carry out mission directives and respond to changing circumstances will seem a small additional step. Every military still professes fealty to the idea of “meaningful human control” over the use of lethal force, but what that term means in practice is disputed – especially if more sophisticated electronic jamming and cyber capabilities make remote operation unreliable in time of need.
So while it is easy to dismiss discomfort with armed robot dogs as a relic of deeply ingrained cultural imagery of a putative robotic apocalypse, there are sound reasons to be worried about how, in this case, form might well drive function – right past the point of our own discomfort.