A robot dog enforces social distancing. A robot dog with a sniper rifle for a head jolts into life. A robot dog, armed and alert, walks the streets of a city near you; you are assured that not only is it safe, it is there for your safety.
The scenarios in the previous paragraph are the past (Singapore 2020, where Boston Dynamics’ Spot patrolled parks to police social distancing), the present (Washington DC, 2021, where Ghost Robotics’ Vision 60 quadrupedal robot was demonstrated last month with Sword International’s SPUR – “special purpose unmanned rifle” – built into its head), and the future (globally, with no sign of international agreement on banning autonomous weapons, armed robots will become increasingly common).
In Iron Man 2, the second of Marvel’s trilogy of Iron Man solo movies, the (partially) reformed arms-dealer-turned-superhero Tony Stark squares up against the less adept but even more morally flexible industrialist Justin Hammer. While Stark Industries attempts to pivot to more peaceful uses for its robotic technologies – while still offering lavish support to the US military (with whom Marvel also has a cosy real-world relationship) – Hammer’s company is far more willing to cover its robots in guns and sell them to the highest bidder.
The real world has a direct parallel: Boston Dynamics works with the military – its BigDog robot was designed to work like a pack mule for soldiers but never made it into the field as it was deemed too noisy; Spot appeared alongside French soldiers in exercises earlier this year – but Boston Dynamics currently forbids the use of its robots “as weapons or autonomous targeting systems”. Ghost Robotics – the Hammer Industries of our reality – is obviously a lot more comfortable with its bots packing weapons.
Ghost’s CEO, Jiren Parikh, told IEEE Spectrum that Vision 60 is not autonomous – though it “does have perception for object avoidance” – and neither is the SPUR, which is a remotely operated weapon. But the missing word there is “yet”. Vision 60 currently has an operator, and for now the SPUR weapon is remotely triggered rather than able to choose and engage targets itself. Placing a weapon – Parikh prefers the bloodless defence industry term “payload” – onto a robot is a starting point rather than an end in itself.
Another answer in Parikh’s IEEE Spectrum interview is rather more telling than his mollifying words about payloads and “operators in the loop”. He says:
“Decades ago, we had guided missiles, which are basically robots with weapons on them. People don’t consider it a robot, but that’s what it is. More recently, there have been drones and ground robots with weapons on them. But they didn’t have legs, and they’re not invoking this evolutionary memory of predators. And now add science fiction movies and social media to that, which we have no control over – the challenge for us is that legged robots are fascinating, and science fiction has made them scary… I think we’re going to have to socialise these kinds of legged systems over the next five to ten years in small steps, and hopefully people get used to them and understand the benefits for our soldiers. But we know it can be frightening. We also have families, and we think about these things as well.”
Isn’t it comforting when arms industry contractors come over all Sting-like (“… the Russians love their children too”) and remind you that they also have families? But the key point in that quote is the line about “[socialising] these kinds of legged systems over the next five to ten years”. Robot dogs are already on the streets of Singapore. It won’t be long before they are commonly used on the streets of the United States – Spot has already been tested by police departments in New York, Massachusetts, Boston and Honolulu – and inevitably make their way across the Atlantic to the UK.
Commenting on US law enforcement testing Spot, the American Civil Liberties Union said, “All too often, the deployment of these technologies happens faster than our social, political or legal systems react. We urgently need more transparency from government agencies, who should be upfront with the public about their plans to test and deploy new technologies.”
While interested parties like Parikh talk about “people [getting] used to” robots on the streets, these developments do not have to be inevitable. In the run-up to the 2012 Olympics in London, I reported on the desire among UK police forces to have greater access to unmanned aerial vehicles (“drones”) in time for the event, and even suggestions that an aerial surveillance platform could be put permanently in the air above the capital. It didn’t happen.
Almost ten years on, there are – according to Freedom of Information requests – some 288 drones in use by British police forces. Most are small and their use is still relatively tightly controlled, despite senior officers often itching to get their hands on more tech and to use it more widely, envious of their militarised counterparts in the US. Of course, those same leaders are excited by the prospect of robot dogs at their command, but their excitement need not lead to their desires being realised.
Having already cited the parallels in a fairly recent film, I’m going to stretch a little further back in movie history. In 1993’s original Jurassic Park, Jeff Goldblum’s character Ian Malcolm memorably castigates Richard Attenborough as the industrialist John Hammond: “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”
We can already build robots that we could arm to fight our battles, but the question of whether we should is still open. Our governments should not decide the answer without asking the rest of us what we think. Expecting us to “get used to” robot dogs roving the streets is not remotely good enough. There are many possible futures and we should fight to ensure the one we get is the one we’ve chosen – not simply one produced by the careful marketing of robots to keep us “safe”.
Mic Wright is a freelance writer and journalist based in London. He writes about technology, culture and politics.