Can the use of unmanned systems be regulated?

In the 20th century, three important science-based innovations led to significant technological progress, but also to new military options and new forms of warfare: nuclear energy, biotechnology and information and communication technologies (ICTs). The nuclear age, as well as the rise of biotechnology, created new ethical, legal, military and political challenges. The asymmetry between moral and technical expertise in the nuclear age was pointed out by General Omar Bradley in November 1948: “The world has achieved brilliance without wisdom, power without conscience. Ours is a world of nuclear giants and ethical infants. We know more about war than about peace, more about killing than we know about living” [1]. The application of ICTs likewise created new challenges and dangers, visibly leading to new weapon systems and new kinds of warfare. The wars in Iraq (1991, 2003), Kosovo (1999) and Afghanistan (from 2001) demonstrated the emergence of a wide spectrum of new weapons such as unmanned systems (UMS), cruise missiles, satellite-guided bombs and other precision-strike systems.

This paper first seeks to explore the conditions and driving forces of the current military-technical revolution. The second section outlines the characteristics, technology, proliferation and current use of unmanned systems. The last section examines the obstacles to and challenges for arms control and international law.

The Framework and Driving Forces of the Current Revolution in Military Affairs

At the heart of the current Revolution in Military Affairs (RMA), a US-American term, is the exploitation of the revolutionary advances of the information age. The computational power and storage capacity of computers have been increasing by a factor of ten every five years. Moore's observation, one of the basic laws of the digital age, notes that microprocessors and storage elements have doubled their performance roughly every 18 months. The main elements of the information age are computers; fast global audio, video and data communication; and the networking of many users. Laser and fiber-optic communication, encryption technologies and data fusion allow rapid routing and processing of data. In addition, automatic pattern-recognition techniques, improved radar systems and infrared sensors (for night vision or weather-independent surveillance) allow highly detailed imaging of geographical situations.
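The two growth figures quoted above are mutually consistent: doubling every 18 months compounds to roughly a tenfold increase over five years. A quick calculation (an illustrative sketch, not from the source) shows this:

```python
# Performance doubling every 18 months implies a growth factor of
# 2**(t/18) after t months.
months = 5 * 12  # five years
factor = 2 ** (months / 18)
print(f"Growth factor over five years: {factor:.2f}")  # roughly 10x
```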

The key to future developments is not so much a new wave of innovation in military technologies, but rather the integration of diverse technologies into a “system of systems” and the permanent upgrading of this system via the constant modernization of its elements and connections. The US military is particularly enthusiastic about the ongoing RMA, which it believes “[…] will give us dominant battle space knowledge and the ability to take full military advantage of it” [2]. The emergence of new networked sensors, data-processing capabilities and weapon platforms has led to new operational military concepts such as net-centric warfare or cyber operations. A “network-centric” system of systems consists of an observation (ISR) system, a communication system, a system of data processing and analysis, a strike system to deliver munitions with pinpoint accuracy and an evaluation system to ascertain the effectiveness of such attacks. The current military-technical developments utilize advanced ICT from the civilian sector, intensifying the dual-use dilemma [3].

The end of the East-West confrontation and the demise of the Soviet Union have brought a shift toward a more unipolar system, with the US as the unchallenged sole global military superpower. Combined with major advances in science and technology, the main characteristics of US security policy are new military concepts and the will to use military power. There is a widespread belief that the United States’ military-technological advantage means that no antagonist can oppose US forces with conventional weapons. Consequently, current and future challenges for armed conflict are “asymmetric”: terrorism and low-intensity conflicts, possibly involving the use of unconventional weapons such as biological or even nuclear weapons. Compared to low-tech weapons, such as small arms and light weapons, high-tech weaponry requires an industrial base, is expensive to produce, difficult to use, has long research and development cycles, and is usually less prone to proliferation. Low-tech weaponry, on the other hand, is universally available, easy to use, and can proliferate rapidly.

Unmanned Systems: Not New but More Striking

According to the Pentagon, “unmanned vehicles” are powered vehicles that do “not carry a human operator, can be operated autonomously or remotely, can be expendable or recoverable, and can carry a lethal or nonlethal payload” [4]. Ballistic or cruise missiles, torpedoes, satellites and mines are not considered to be unmanned systems.

Unmanned air vehicles (UAVs) are reusable loitering systems primarily used today for intelligence, surveillance and reconnaissance (ISR) missions, but also for target acquisition, damage assessment and communication relay. The operators of such remote-controlled UAVs can sit several thousand kilometers away from the target without risking their lives. These UAVs are paradigmatic for the emerging net-centric warfare. They are generally unarmed systems, but some have been modified to carry weapons. They can fly autonomously or be piloted remotely at high or low altitude and are equipped to return home. They can be very large and heavy (Global Hawk) or very small, can use a range of propulsion systems and can carry different payloads (from a few kilograms to 250 kg). The various systems range in cost from a few thousand to tens of millions of US dollars. Important characteristics include endurance, weight, range and ceiling.

Unmanned vehicles are usually divided into three sub-categories depending on the environment in which they move: unmanned air vehicles (UAVs), unmanned ground vehicles (UGVs) and unmanned naval vessels, which are subdivided into unmanned surface vehicles (USVs) and unmanned underwater vehicles (UUVs). The US Air Force calls UAVs remotely piloted aircraft (RPA) and is going to train more pilots to operate drones than fighter or bomber pilots.

After more than a decade of R&D, testing and deployment of UAVs, the US has the lead in the full spectrum of UMS. Typical missions are ISR, target acquisition and explosive ordnance disposal. The US arsenal of drones has experienced unprecedented growth over the past decade: estimates put it at roughly 8,000 UAVs, most of them unarmed, and around 12,000 ground UMS. High-altitude/long-endurance (HALE) UAVs, such as the GLOBAL HAWK (payload 1,360 kg), have a flight time of 30 hours and carry all-weather sensor packages for reconnaissance and target designation. Medium-altitude/long-endurance (MALE) UAVs, such as the PREDATOR or the REAPER, are armed and have precision-strike tasks. Tactical UAVs, such as the HUNTER or HERMES 450, have a range between 125 and 250 km and operate at low altitudes (up to 5,000 m). Mini-UAVs, such as the DESERT HAWK, are man-portable and hand-launched; they are used for beyond-line-of-sight scouting with one hour of flight time at a range of roughly 5 km. Hand-held micro-UAVs, such as the WASP or g-MAV, are electrically powered for about one hour [5]. Arming a UAV is an increasing trend, leading to a new category: unmanned combat aerial vehicles. In 2012, the USAF had 54 REAPERs and 161 PREDATORs in its arsenal; another estimate puts the US arsenal at 7,454 unmanned platforms [6]. For 2012, the Pentagon requested $3.9 billion for UMS procurement and development [7].

The Use of Lethal Drones

The use of armed drones by the United States is twofold: (a) in regular military conflicts, such as Iraq and Afghanistan, under full control of the airspace, and (b) for “extraterritorial killings” operated by the CIA. According to officials, drone strikes in Pakistan have killed more than 2,000 militants; the number of innocent civilian casualties is controversial. An NGO estimated that the CIA conducted 370 drone strikes in Pakistan in the decade 2004-2013, killing 2,548 to 3,549 people, among them 411 to 890 civilians [8]. US-led drone warfare in remote parts of Pakistan, Yemen and Somalia is mainly justified by the US “war against terrorism” and has increased significantly under President Obama, making drone warfare a centerpiece of his counterterrorism strategy. President G. W. Bush ordered fewer than 50 drone strikes during his presidency, whereas President Obama has overseen more than 400 in the last four years.

Proponents claim that the drone strikes have killed key leaders of terrorist groups and associated anti-American militant groups, thereby denying terrorists sanctuaries in Pakistan, Yemen and Somalia. “And they have done so at little financial cost, at no risk to U.S. forces, and with fewer civilian casualties than many alternative methods would have caused” [9]. Opponents argue that drones have killed thousands of civilians, alienated allied countries by angering and traumatizing the public, and can create “sworn enemies out of a sea of local insurgents” [10]. Audrey K. Cronin has concluded: “The problem for Washington today is that its drone program has taken on a life of its own, to the point where tactics are driving strategy rather than the other way around.”

There are many justified doubts about whether this new method of targeted killing will be effective. Capturing a terrorist leader, for example, is often preferable because it avoids creating new martyrs and gives access to the rationale, contacts and motivations of the commanders of terror. The key problem with armed drone strikes is the intelligence needed to identify potential targets. This task is mostly left to secret services, which do not publish their sources, procedures and criteria for these kinds of targeting operations. Such extrajudicial killings are seen by many in Europe as illegal and politically unwise.

Until now only the US, the UK and Israel have used armed drones (Israel in Lebanon against Hezbollah and in Gaza against Hamas), but it is obvious that other countries will start imitating the use of lethal drones against people they have identified as violent insurgents or terror-group combatants in areas of conflict, at their borders or on their own territories. The US, in particular, is obviously starting to create a new norm of striking preemptively against those who plan attacks outside or inside its territory.

Proliferation and Autonomy

The US has a huge advantage in the number and capability of its UAVs, but the qualities of drones, especially the capacity for surveillance and precise strikes and the fact that the operator can sit in safety thousands of kilometers from the target, are appealing to other countries too.

It is estimated that 80 countries possess drones and that 50 countries have R&D programs; the technology has already become widespread, although not many countries are developing strategic armed drones with long ranges and precision-strike capability. The Teal Group estimates that global spending on R&D and procurement for UMS will total more than $94 billion over the next decade [11].

Other countries, such as Israel and China, are aggressively developing and promoting UAVs, and countries such as Russia, Iran, India and Pakistan are not far behind, creating the environment for a “drone arms race” [12]. At their air shows, Chinese companies have displayed different models of UAVs, among them types capable of attacking aircraft carriers and armored vehicles. Non-state actors can also acquire simple UAVs and might use them for attacks on persons or groups. In 2011 and 2012, individuals were arrested in the USA and in Germany, charged with plotting to load a UAV with explosives and crash it into buildings.

In principle, there are three ways to acquire UAVs: (1) A state can simply purchase a military or civilian system, legally or illegally, from a producer; such a system is not just one object, but includes a ground station and logistical support. (2) After buying a UAV package, a country can also try to modernize the system or convert an existing manned aircraft to an unmanned one. (3) The third path is to develop a UAV indigenously using components available on the world market. There is also growing concern that the proliferation of UAVs can pose a threat to the US and other countries [13]. UAVs (as well as cruise missiles) can be used for the delivery of bioweapons, and UAVs which can carry heavy payloads (250-500 kg) can, in principle, also deliver nuclear weapons, although the primary delivery system would still be a bomber or a ballistic missile. D. Gormley adds that “the spread of these systems globally will affect US military dominance, regional stability and homeland defence” [14].

Further Developments

The Pentagon's Unmanned Systems Roadmap 2007-2032 describes future developments and projects to improve the performance (lighter weight, more precise delivery, lower power consumption), interoperability and operational spectrum of UMS. Other countries, such as the UK, Germany, France, Australia, Canada, Israel and South Korea, have their own programs for robotics and UGVs. The US and Japan have a human-robot interaction (HRI) initiative to develop future humanoid robotic technology. Future military development goals are to transform UAVs into joint unmanned combat aircraft systems for a wider spectrum of combat missions (Suppression of Enemy Air Defences (SEAD), strike, electronic attack etc.) with improved data links and stealth capabilities (e.g. the planned Joint Unmanned Combat Air System (J-UCAS)). Other goals are to reduce weight, increase agility and integrate robotics. The current armed drones are remote-controlled, but some already have semi-autonomous functions, such as automatic take-off and landing. Surveillance, identification, tracking, targeting and engagement are in the hands of the operator, but some or all of these functions might become more autonomous thanks to new developments in microprocessors and mathematical algorithms.

On November 21, 2012, the Pentagon released a directive on “Autonomy in Weapon Systems” to establish guidelines and a national policy for the future development and use of autonomous and semi-autonomous functions in weapon systems. The directive is not a “moratorium”, but it says that “autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force” [15]. This weak criterion is open to interpretation and allows for “developing, testing, and using the technology, without delay” [16]. The directive thus introduces human judgment and approval into this cycle and requires “hardware and software verification and validation”, but it does not restrict the technology itself. Mark Gubrud points to already existing small autonomous missiles such as the “Low Cost Autonomous Attack System” (LOCAAS), which is delivered by a fighter-bomber and powered by a turbojet engine for about 30 minutes of flight. In his UN report, the Special Rapporteur on Extrajudicial Executions listed existing weapon systems with various degrees of autonomy, such as “fire-and-forget” weapons, object-defense weapons (PHALANX) and sophisticated drones (US: X-47B; UK: TARANIS) [17]. In March 2012, the Naval Research Laboratory introduced a new facility for autonomous-systems research with an artificial combat environment including forests, a desert and buildings for urban warfare. Fully autonomous combat systems must be capable of learning or adapting their mission in response to a changing environment. Some authors argue that “A fully autonomous capability, in which the unmanned vehicle will generate and perform multifaceted missions, is unattainable until true artificial intelligence (AI) technology becomes available” [18], and some estimate that such AI is 10-15 or more years away. There is not much doubt, however, that these developments are already under way.

Restrictions through Arms Control and International Humanitarian Law (IHL)?

Two main tools can be applied to the future development and employment of unmanned combat systems: arms control approaches and International Humanitarian Law (IHL). The former are designed to prevent the introduction of weapon systems, while the latter applies to actions during an armed conflict.

The State of Arms Control and UMS

Current developments in the way wars are fought certainly have consequences for the stability and potential extension of existing legally binding arms-control regimes. Several accords and conventions, mostly negotiated and entered into force during the Cold War, limit or prohibit entire classes of weapon systems, such as biological weapons (Biological Weapons Convention, 1972), chemical weapons (Chemical Weapons Convention, 1993) or intermediate-range nuclear forces (INF Treaty, 1987), under which specific delivery systems were banned from Europe. During the Cold War, it became clear that the effectiveness of arms-control regimes can be bypassed by technological innovation and proliferation. Hence, the entire sophisticated conventional-arms-control edifice, which was basically built on quantitative criteria such as agreed ceilings for major weapon systems and sophisticated verification agreements, may start to crumble if the new elements of modern warfare are not taken into account.

Nevertheless, arms control has to reflect the profound changes of the post-Cold-War, globalized world. It must, therefore, become more flexible and more comprehensive and include a wider range of criteria, options and instruments. One approach would be to include UMS in the region from the Atlantic to the Urals in the 1990 Conventional Forces in Europe (CFE) Treaty. The basic concept of the CFE Treaty was to achieve “a secure and stable balance” and to eliminate “the capability for launching surprise attack and for initiating large-scale offensive action in Europe”. The CFE Treaty is based on quantitative limits on five major weapon systems in different regional zones. Unfortunately, implementation of the Treaty was suspended first by Russia in 2007 and later by NATO. A follow-up accord should also include new weapon technologies which can alter the military balance or increase instabilities in a crisis. In the future, it will become more and more obvious that the density and effectiveness of military forces cannot be measured simply “in numbers of tanks and fighter aircraft”; other categories such as cruise missiles, UAVs and perhaps other robotic systems or autonomous vehicles will have to be included.

Another approach is to strengthen risk reduction through transparency and confidence-building measures (TCBMs). The politically binding Vienna Document 1999 of the OSCE could be used to exchange data on the introduction of new types of weapons such as UMS on an annual basis, and a consultative commission at the OSCE level could meet annually to discuss militarily relevant R&D with a significant effect on military stability. Other options would be the international registration of UMS in the UN Register of Conventional Arms or the newly established Arms Trade Treaty (ATT). Since 1991, the UN Register has collected reports on arms transfers as well as information on holdings, domestic purchases and relevant policies from more than 170 states [19]. The ATT, which was approved by the UN General Assembly on 2 April 2013, was created to regulate the international trade in conventional arms, from small arms to combat aircraft and warships. Including unmanned combat systems, such as UAVs or UGVs, in the ATT could also help to restrict the destabilizing flow of new weapon systems into conflict regions. Supply-side export regimes, such as the Missile Technology Control Regime (MTCR, 34 members), restrict the transfer of delivery systems, among them “complete unmanned aerial vehicle systems capable of delivering at least a 500kg payload to a range of at least 300km”. The Wassenaar Arrangement (40 members) also restricts the transfer of dual-use goods to “regions and states with situation/behavior representing serious concerns”. These Western-oriented export regimes, which are implemented through the national laws of their member states, can restrict transfers into conflict regions and slow down indigenous developments by newcomers in the field of UMS.

As the very concept of preventive arms control suggests, not only quantitative aspects of military forces but also future technical developments should be taken into account. This broadens the scope of arms control into the area of military-related R&D. Preventive arms control aims to avoid costly and dangerous technology-driven arms races by preventing the deployment of new weapon technologies on the battlefield [20]. A prospective scientific assessment and a military-operational analysis of the technology in question are necessary under specific criteria such as (1) adherence to and further development of effective arms control, disarmament and international law, (2) maintaining and improving stability, and (3) protecting humans, the environment and societies. Based on such an assessment, bans on or limitations of militarily usable technologies or weapon systems should be considered before acquisition or deployment. A variety of complete bans on specific types of UMS has been proposed by Jürgen Altmann (2013) [21].

Given the current state of the existing arms control treaties, the asymmetric structure of world politics, the level of already proliferated technologies and the challenges of verification, it is doubtful that states will agree on total bans of UMS.

International Humanitarian Law (IHL) and Unmanned Systems

Regulations of new means of warfare have been developed over the last 150 years by organizations such as the International Committee of the Red Cross (ICRC) and by international lawyers. The emerging use of armed UAVs and other unmanned systems can dramatically change warfare and needs further regulation. There is no doubt that IHL also applies to this new weaponry. IHL was designed in recognition of the fact that the “imperative of humanity” imposes limits on the choice of weapons in an armed conflict. One main principle is to protect civilians from war hostilities and “to protect combatants against weapons of a nature to cause superfluous or unnecessary suffering” [22]. The main principles and rules are enshrined in the four Geneva Conventions of 1949 and the two Additional Protocols of 1977.

NGOs, such as Human Rights Watch (HRW) or the Pugwash Conferences on Science and World Affairs, are engaged in international activities to address the future use of drone technology. Pugwash is working with the UNESCO World Commission on the Ethics of Scientific Knowledge and Technology (COMEST) [23] to develop a set of relevant principles. HRW has started a “Campaign to Stop Killer Robots”, which tries “to ban lethal robot weapons that would be able to select and attack targets without any human intervention" [24].

The main principles of IHL, which must be observed by soldiers and operators in an armed conflict, are distinction between combatants and civilians and proportionality. There is some concern about how and by whom these automated systems are operated: are the operators soldiers trained in IHL regulations, or civilians, including employees of private companies?

It can be argued that making this distinction in a remote situation is complex and not error-free: there may be time delays in signal transmission or insufficient data available, and flying a drone is like “flying a plane looking through a straw”. Another key question is whether a targeted combatant would have the chance to surrender to a combat robot. In addition, lethal autonomous systems do not have the capability to distinguish between combatants and civilians, nor can they interpret intentions or human behavior on the battlefield, for example whether a belligerent is wounded or trying to surrender [25]. Other experts, such as Ron Arkin of the Georgia Institute of Technology, argue optimistically that the use of autonomous systems will lead to better ethical behavior on the battlefield because, inter alia, machines will have better capabilities in terms of observation, identification and fast decision-making. In addition, ethical behavior could be programmed into the automatic decision process [26].

The other main principle, the rule of proportionality, requires that the expected harm to non-combatants be weighed against the anticipated military advantage to be gained. Machines do not have this capacity for contextual judgment. In his report, the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, underlined that fully autonomous weapons raise multiple moral, legal, policy and technical questions and other concerns [27].

The second IHL approach consists of international agreements which prohibit or restrain the use of specific weapons, such as cluster munitions, anti-personnel land-mines, blinding lasers or incendiary weapons under the Convention on Certain Conventional Weapons (CCW).

After informal consultations in the context of the CCW in Geneva, state parties adopted a mandate to hold a three-day informal meeting of experts in May 2014 to discuss “the questions related to emerging technologies in the area of lethal autonomous weapon systems” [28]. There is hope that the complex ethical, legal and technical questions can be resolved and that fully autonomous combat systems will be prohibited.


[1] Quoted after: Singer, Peter (2009), Wired for War, Penguin Group, London, p. 426.

[2] Fogleman, Ronald, Gen., U.S. Air Force Chief of Staff, cited after Thayer, A. (2000), ‘The political effects of information warfare: Why new military capabilities cause old political dangers’, in Security Studies, vol. 10(1), pp. 43-85, p. 44.

[3] Reppy, Judith (2006), ‘Managing Dual-Use Technology in an Age of Uncertainty’, in The Forum, vol. 4, no. 1, Article 2, < forum/vol4/iss1/art2>.

[4] US DoD 2007, pp. 97-102.

[5] Quintana, Elizabeth (2008), The Ethics and Legal Implications of Military Unmanned Vehicles, Occasional Paper, British Computer Society, p. 2.

[6] Gertler, Jeremiah (2012), U.S. ‘Unmanned Aerial Systems’, CRS Report for Congress, Congressional Research Service, January 3, Washington D.C., p. 7.

[7] Gertler 2012, p. 2.

[8] Bureau of Investigative Journalism 2013.

[9] Byman, Daniel (2013), ‘Why drones work’, in Foreign Affairs, July/August, < 139453/daniel-byman/whydrones-work>.

[10] Cronin, Audrey Kurth (2013), ‘Why drones fail’, in Foreign Affairs, July/August, <…>.

[11] Shane, Scott (2011), ‘Coming Soon: The Drone Arms Race’, in New York Times, 8 October.

[12] Ibid.

[13] US Department of Defense (2007), Unmanned Aircraft Systems Roadmap 2005-2030, Washington, D.C.

[14] See SIPRI Yearbook 2003: Armaments, Disarmament and International Security (2003), ‘New developments in unmanned air vehicles and land-attack cruise missiles’, Chapter 12, Oxford, pp. 409–432.

[15] Department of Defense (2012), ‘Autonomy in Weapon Systems’, Directive 3000.09, Item 4a, 21 November.

[16] Gubrud, Mark (2013), ‘US Killer Robot Policy: Full Speed ahead’, in Bulletin of the Atomic Scientists, 12 September 2013.

[17] Heyns, Christof (2013), ‘Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions’, Human Rights Council, A/HRC/23/47, 9 April 2013, p. 9.

[18] Quintana 2008, p. 5.

[20] Neuneck, Götz (2008), ‘The Revolution in Military Affairs. Its Driving Forces, Elements and Complexity’, in Complexity, vol. 50, pp. 50-61.

[21] Altmann, Jürgen (2013), ‘Arms Control for Armed Uninhabited Vehicles: An Ethical Issue’, in Ethics Inf. Technol., 21 March 2013.

[22] ICRC: 1949 Conventions and Additional Protocols, and their Commentaries,….

[24] Campaign to Stop Killer Robots.

[25] Sharkey, Noel (2010), ‘Saying 'No!' to Lethal Autonomous Targeting’, in Journal of Military Ethics, Special Issue: Ethics and Emerging Military Technologies, vol. 9(4), December 2010, pp. 369-383.

[26] Arkin, Ronald (2010), ‘The Case for Ethical Autonomy in Unmanned Systems’, in Journal of Military Ethics, Special Issue: Ethics and Emerging Military Technologies, vol. 9(4), December 2010, pp. 332-341.

[27] Heyns 2013.

[28] Horner, Daniel (2013), ‘Meeting Set to Discuss Autonomous Arms’, in Arms Control Today, December 2013.