New Left Project writes about the inhuman future of how we engage in warfare. Will the decision whether to conduct military strikes eventually be determined by algorithm?
One night last summer Shakeel Khan and his family were at home in North Waziristan when there was a huge explosion. ‘I was resting with my parents in one room when it happened. God saved my parents and I, but my brother, his wife, and children were all killed.’ The children were five and three years old. Khan says, ‘I must support my aged parents now but I earn very little. We don’t have enough to reconstruct our house and fear that the drones will strike us again.’ Shakeel Khan and his family, living in a remote part of Pakistan, had become victims of one of the newest weapons on the planet: unmanned drones.
Unmanned Aerial Vehicles (UAVs), commonly known as drones, are aircraft which can be piloted remotely using wireless technology. Most military drones are fairly small and used over distances of a few kilometres for intelligence and reconnaissance purposes. Over the past few years, however, we have witnessed the rapid rise of the use of armed drones such as the Predator and the Reaper. They are used by US and British forces to launch attacks at great distances while the operators sit safely in air-conditioned trailers over 7,000 miles away in a base near Las Vegas. It is not just the US and the UK. Israel too has used armed drones to launch attacks in Gaza (indeed it’s not too much of an exaggeration to say that the drone wars were conceived in the occupation); Italy has been using drones over Libya; and it is estimated that over forty other countries are now trying to buy or develop their own drones. A recent defence market report predicted that annual global spending on drones would double to $11.5 billion over the next few years.
While supporters of unmanned systems like drones argue they are in effect the same as piloted aircraft, others argue that by removing one of the key restraints on warfare – the risk to one’s own forces – unmanned systems are making armed attacks much more likely. This year alone US forces have been engaged in armed conflicts in six separate countries (Iraq, Afghanistan, Libya, Yemen, Somalia and Pakistan), something that many military experts say would not be possible without the use of drones.
Drones are also clearly eroding legal and human rights. In Pakistan and Yemen, for example, the CIA are using drones to undertake assassinations of individuals placed on a so-called ‘high value target’ list. The high-profile targeted killing of US-born cleric Anwar al-Awlaki this weekend is just the latest example. Human rights organisations such as Amnesty International and Human Rights Watch have condemned such killings, and the UN Special Rapporteur on extra-judicial killings has repeatedly called on the US to explain how it justifies using drones to target and kill individuals under current international law. Although the UK has never officially confirmed that it operates a High Value Target list, it has hinted that it does, and a recent presentation at an MoD conference in Cardiff told the delegates to assume ‘that a HVT list was agreed and maintained.’
As well as targeted killing, drones are being used to deliver what the military call ‘persistent presence’. Without on-board pilots, drones are able to loiter over a particular area for hours, days, and even weeks, enabling weapons operators and intelligence analysts back at base to scrutinise particular areas looking for suspicious behaviour and ‘targets of opportunity’. It is suggested that this way of using drones is leading to high civilian casualty rates, and it is perhaps easy to see why. With young servicemen and women subjected to long hours of boredom whilst in control of lethal technology, mistakes are bound to be made. While the secrecy surrounding the circumstances of drone strikes makes these allegations difficult to prove, a NATO enquiry into an attack on a convoy in Afghanistan in February 2010, in which 23 civilians were killed, reported that drone operators had ‘downplayed’ the presence of civilians because they wanted the attack to go ahead.
Earlier this year the British MoD admitted for the first time that it had killed Afghan civilians in a drone strike in March 2011. Responding to a parliamentary question from Green Party MP Caroline Lucas about the deaths, David Cameron said, ‘I do not think that the answer is to turn our face away…’ Unfortunately the Prime Minister was not suggesting that we should face up to our responsibilities to these innocent victims, but was, rather predictably, arguing that we cannot turn away from drone technology which, as he put it, is ‘taking out’ the bad guys.
In the US too, officials argue that drones are efficiently and carefully taking out only the ‘bad guys’. John Brennan, a former CIA officer and currently a senior counter-terrorism adviser to President Obama, has been mocked in the mainstream press for arguing, in response to questions about drone strikes in Pakistan, that ‘…for the past year there hasn’t been a single collateral death because of the exceptional proficiency, precision of the capabilities that we’ve been able to develop’. Those claims were shown to be patently untrue by the excellent research on drone strikes in Pakistan by the Bureau of Investigative Journalism. The BIJ, working with local researchers, journalists and lawyers in Pakistan, has uncovered hundreds of civilian deaths, including those of more than 160 children, in drone strikes in Pakistan over the past seven years.
Military planners are also pushing for greater autonomy for drones and other unmanned systems. Some are even arguing that autonomous systems themselves will be better than humans at deciding when and where to fire weapons. Gordon Johnson, formerly of the Joint Forces Command at the Pentagon, for example, commenting on the growth of robotic systems, suggested that, ‘they don’t get hungry. They’re not afraid. They don’t forget their orders. They don’t care if the guy next to them has just been shot. Will they do a better job than humans? Yes.’ While the technology for fully autonomous systems utilising artificial intelligence is still some way off, military planners are pushing ahead with exploring the underlying technology that will increase the autonomy of drones.
Many in government and the military see drones as the ‘perfect weapons for a war weary nation on a tight budget,’ as one journalist recently put it. They are much cheaper than traditional piloted aircraft, they enable the military to undertake armed attacks at little or no risk, and their use is much easier to keep secret. However, both the military and the drone industry recognise that a major obstacle to the continued use of drones is ‘public perception’. They acknowledge, as they put it, that the public has a ‘natural scepticism’ about the use of unmanned drones. It’s no coincidence, then, that we have seen a steady rise in stories about how drones and other unmanned systems could potentially be used for good purposes such as search and rescue, wildlife monitoring and environmental surveys; the idea of the ‘killer drone’ needs to be overcome.