Drone Warfare – From Wales To Gaza

Conference speech from drone campaigner Harry Rogers of SeeingRed.com

25th May 2013 – Cardiff

Good afternoon comrades, friends and colleagues. Can I start by saying how pleased I am to see this conference taking place and can I take this opportunity to thank the organisers for inviting me to share the platform today with such distinguished friends. My name is Harry Rogers and I live in West Wales about 12 miles from Aberporth where the MOD have been carrying out their tests on the Watchkeeper Drone.

I am sixty-five years old and I was very nearly never born because of a UAV. My mother and my aunt had been in the back bedroom over the top of my grandfather's public house in West Croydon in June 1944 until just ten seconds before one of Hitler's V1 doodlebug flying-bomb drones blew the back of the pub clean off. Had it arrived ten seconds earlier, I would never have been born. My point here is that UAVs have been around a long time, and the Watchkeeper is nothing really new in concept. A lot has been written about this already-almost-obsolete piece of Army surveillance equipment, so I won't add to the reams you can find on the internet.

I am a member of a local peace group called Bro Emlyn For Peace and Justice, which formed in 2003 in response to the decision by Bush and Blair to attack Iraq. Since then we have been involved in a number of campaigns, including those against the development of Cardigan Airport as a testing ground for UAVs, the proposed introduction of an unmanned aerial systems technology hub at Parc Aberporth, and the management buyout of the MOD missile testing base at Parcllyn by QinetiQ.

There is much I could talk to you about concerning the history of this campaigning, but that is not why I agreed to come here today. The past is something that we can learn from but not something that we can undo. My speech today is primarily about the future of drone technology and why it is vital that we all start to pay attention to what the research and development bods at MIT, DARPA, QinetiQ and BAE Systems are cooking up for the future. I make no apologies for basing most of this speech on a report made by the RAF to the Government in October 2012 about the future of Unmanned Aerial Systems.

I am interested in ensuring that we all go away from today with an understanding that the big issue we face in UAV development is that of autonomy. The next generation of drones may well be able to think for themselves and act autonomously, that is, without any human in the loop, as the military say. That means flying machines that can make their own decisions in search-and-destroy operations, based on a set of algorithmic choices predetermined by human masters who may or may not be benign in their intentions.

Sceptics amongst you are already muttering balderdash and hokum, science fantasy and other such epithets. Well, virtually all of the rest of this speech will, I hope, convince you otherwise.

The RAF are worried about issues relating to the Geneva Convention with regard to autonomous UAV development and usage and they say that “compliance will become increasingly challenging as systems become more automated. In particular, if we wish to allow systems to make independent decisions without human intervention, some considerable work will be required to show how such systems will operate legally.”

Already there are automated weapons systems in use in Afghanistan, for example the Phalanx and Counter-Rocket, Artillery and Mortar (C-RAM) systems, deployed because there is deemed to be insufficient time for a human response to counter incoming fire.

Future autonomous UAV systems will have to adhere to legal requirements and civilian airspace regulations, and this will require political involvement in getting the necessary changes made. In my view it is absolutely vital that politicians understand the issues clearly, because once these changes are made there will be a lot of military and civilian hardware flying about in our airspace without any human interface whatsoever. The RAF say: "As systems become increasingly automated, they will require decreasing human intervention between the issuing of mission-level orders and their execution." They go further, saying: "It would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority." They also discuss the timescale for the introduction of increased autonomy via artificial intelligence: "Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years." Their words, not mine.

Currently the MOD “has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective.”

The RAF are clearly worried about the direction all this is going in, and they say: "As technology matures and new capabilities appear, policy-makers will need to be aware of the potential legal issues and take advice at a very early stage of any new system's procurement cycle." I believe this highlights a degree of paranoia on the part of the RAF vis-à-vis its own future role.

Not only are the RAF exercised about legal dilemmas; ethical and moral questions are also in their thoughts, such as when, where and how automated and autonomous unmanned systems may be used. This applies not just to drones, of course, but to all other forms of weaponry in any environment. Will all future wars be fought remotely, with little or no loss of friendly military personnel? Will future conflicts be waged between increasingly complex unmanned systems?

In my view, a problem we face today is that the accountants have control of governments, and the most expensive resource used in public services is human beings. Autonomy therefore offers massive savings in manpower, and in support for that manpower, both before and after conflict occurs. As artificial intelligence comes on board, we are likely to see more tasks arising that are beyond the capability of humans to deal with, due to speed, complexity and information overload. No doubt some of you are suffering from that now, but I only have a little more to say before I take questions, so bear with me.

The RAF and many others in the field are grappling with issues such as whether it is possible to develop AI capable of the unique (at the moment) human ability to bring empathy and morality to complex decision-making. The RAF say: "To a robotic system, a school bus and a tank are the same – merely algorithms in a programme – and the engagement of a target is a singular action; the robot has no sense of ends, ways and means, no need to know why it is engaging a target. There is no recourse to human judgement in an engagement, no sense of a higher purpose on which to make decisions, and no ability to imagine (and therefore take responsibility for) repercussions of action taken."

So we need to pose the following questions to our politicians:-

The RAF pose very important questions when they ask the following:-

Finally, we must expect to see governments bringing in changes to the law of armed conflict to accommodate the use of autonomous UAS, and we must shout our opposition to this from the rooftops.

So there we have it, friends: there is a lot for us to consider when we look into the issue of drones. We have to view all of this in a holistic way. That is, we must not just lumber along from one demo to the next thinking only about the impact of the current use of drone warfare, terrible though that is. It is absolutely vital that we start to consider the future implications for the maintenance and development of basic human rights. We must also consider what the use of this technology means in terms of social control measures, such as we see the beginnings of in Gaza and elsewhere. I believe we are at, if not already past, a dangerous turning point in the way we occupy this planet.

It is incumbent on us all to make a fuss about these issues if we want a planet where human rights are protected. If we don't, then we condemn the world to a dystopian future in which all kinds of as-yet-unthought-of technology are used to maintain the position of a global elite above and beyond that of the majority of the people. The choice is ours: make a ruckus, or bury our heads in the sand and wait for the worst possible sci-fi future to engulf us all.

Thanks for listening, I’m happy to take questions and take part in debate.

Harry Rogers 24/05/2013