A Comprehensive Approach to
Countering Unmanned Aircraft Systems
Part IV – Legal Perspectives
Arms Control of Unmanned Weapons Systems
Facing the Challenges
By Dr Christian Alwardt, GE
Institute for Peace Research and Security Policy at the University of Hamburg
Research and development in modern information technologies has driven rapid technological progress that is also reflected in the strategic and tactical considerations of military and security policy decision-makers. Efforts to advance the automation and digitalization of warfare can be observed in a growing number of countries. A prominent example of this trend is the development, procurement and employment of Unmanned Weapons Systems (UWS). Today, Unmanned Aircraft Systems and drones in particular are becoming ever more widespread and commonly used throughout the international community, setting the pace of this progress. Aside from the military advantages that can be expected, UWS also raise a number of new issues with regard to the danger of armament dynamics, the destabilizing effects of these weapons, and their legitimacy under international law. How should the advantages and disadvantages of UWS be balanced, and how could adverse consequences be limited? Due to their specific nature, UWS pose a challenge for international arms and export controls.
UWS are highly complex and comprise a variety of technological components. It is no longer hardware alone but, to an increasing degree, system control software, the interaction with external infrastructure, and the synergies resulting from the interplay of various technological components that decisively define the military capabilities of unmanned systems. Military effectiveness thus arises from ‘system complexity’ rather than from the individual components of an unmanned system. Most of the technological components of UWS are so-called dual-use technologies, which predominantly originate from civilian developments. The proliferation of this civilian dual-use technology and its military employment is very difficult to regulate or track, which poses significant problems for export control. Conversely, fully civilian unmanned systems can also have potential for military applications and, possibly after some modifications, could be employed as weapons carriers. Only a few unmanned systems have unique and unalterable military attributes, or a design that obviously implies a purely military purpose (an example would be the X-47B stealth drone prototype). A high payload capacity used to be a reliable indication of intended armament and military employment. However, many civilian systems are now also designed primarily as carrier systems with high payloads, mainly owing to their increasingly broad range of tasks, such as scientific research and the transport of goods. Obvious indications of the military employment of an unmanned system would be existing armament, weapons stations, corresponding targeting systems, or attack control software. However, weapons payloads and software can sometimes simply be removed or replaced, so that a given snapshot of a system’s actual configuration offers only a momentary indication of its intended use.
By taking such measures, military unmanned systems could, for example, also be disguised as civilian systems. A purely external, unambiguous distinction between civilian and military unmanned systems is therefore considerably complicated by their system complexity and dual-use character.
In some cases, the military character of an unmanned system will probably only become apparent with certainty through its use and the resulting effects. Such a potential ‘military indeterminacy’ of unmanned systems confronts the traditional verification approaches of arms control with almost insurmountable obstacles, and continuous verification of unmanned systems’ civilian or military character using existing instruments seems impossible or infeasible. So far, traditional arms control has been based mainly on numerical, regional, and type-related limitations of clearly defined and unambiguously identifiable weapons categories. In most cases, verification of arms control agreements was performed by detecting and counting weapon systems. With such traditional instruments, UWS will only be detectable today if they exhibit unique and unalterable military characteristics, as is the case for, e.g., manned tanks, combat aircraft, or warships.
For this reason, arms control of modern UWS faces new requirements and is thus in need of conceptual adaptation. Proven approaches should be adapted where possible, and supplementary instruments developed where required. New ideas must be elaborated and creative solutions applied. With regard to UWS, future arms control must, on the one hand, cope with the dual-use issue and address the blurred borderline between civilian and military systems, and, on the other hand, be flexible enough to respond to new technological development trends. It is becoming apparent that, in the future, software (e.g. program code, algorithms, data) will define weapon system performance more strongly than hardware. Accordingly, a critical consideration of the potential impacts of increasingly software-supported decision-making processes, a consequence of increasing automation, is strongly recommended. In this context, the risk of a possible loss of human control over future UWS requires special attention, and must be considered with regard to both the provisions of international humanitarian law and the security policy implications of these weapons.
Arms and export control are still limited, particularly regarding the various software components of modern weapons systems. As yet, there is a lack of reliable instruments that can be used to regulate and verify software codes, algorithms and data sets, and that are capable of gaining international consent. Very similar circumstances and also overlaps can be seen with regard to arms control efforts in the cyber domain and in space. In all these fields, arms control research and the international discourse by experts are only just beginning, and have to this point received too little attention.
Above all, there is still a lack of international awareness of the risks and security policy implications of UWS. In this regard, the essential lessons of the East-West confrontation era should be recalled: on the one hand, the negative lessons and hazards that originated from the arms race and the potential for military escalation; on the other, the stabilizing value and mutual benefit of cooperative arms and export controls. Both aspects must likewise be taken into consideration so that a serious interest by all parties in regulatory measures within the field of UWS may evolve.
The path to the future regulation of UWS is a rocky and challenging one. Coordinated cooperation within the community of nations, under participation of decision-makers, technological experts and scientists, will be vital to achieving that objective. In the course of this process, regulatory approaches developed, negotiated and decided on the basis of international discourse must be repeatedly put to the test. The leap of faith provided within the scope of arms control agreements must be substantiated by agreed and reliable verification instruments.
For such a future arms control process, the underlying challenges, the preconditions required, and some bold starting points will be outlined below.i
Strengthening the International Discourse and Establishing the Required Basics
As a subject, UWS are ‘difficult to grasp’. So far, there has been no generally accepted definition or classification, nor is there a clear civilian-military distinction of unmanned systems. The international discourse on the future military employment of UWS, the resulting risks and consequences, and the regulatory need has come to a standstill. The causes of these difficulties include:
- the dual-use character of the individual technological components;
- a missing basis of information – such as insufficient knowledge of characteristics, synergies, and capabilities originating in the underlying technologies;
- the use of different terms and notions;
- diverging political interests that in some cases run counter to regulatory efforts.
A mutual understanding of what defines UWS in a broader sense, on the one hand, and of how they might be more easily classified, on the other, would be an important foundation and precondition for a purposeful international discussion on arms control and the non-proliferation of UWS. A discourse on the conformity of fully autonomous weapons systems with international humanitarian law has been ongoing since 2014 within the scope of the UN Convention on Certain Conventional Weapons (CCW). This discourse, however, has also been characterized by the difficulties described above. A broader international debate on the peace and security policy implications of these weapon systems is still pending. At the beginning of such a debate, awareness must be raised of whether, and to what extent, today’s and future UWS will impact international security, jeopardize regional and strategic stability, and expedite armament dynamics. A comprehensive international consensus on the type and effects of the negative consequences that may be associated with the increasing proliferation and employment of UWS will be the basic precondition and motivation to commence future negotiations on arms and export controls, and to promote them successfully later on.
Verification: Arms Control is Particularly a Matter of Trust
The success of arms control depends significantly on the contracting nations’ trust in compliance with the agreements made and in their verifiability. For this purpose, it is essential to determine clearly in advance what exactly should be regulated, and how this can be validated. This is the traditional approach within arms control.
However, considering the missing definitions and the lack of a clear civilian-military distinction, this proves difficult with regard to UWS. Reliable verification of the quantity of unmanned weapons is barely possible in this manner. The military characteristics of unmanned systems are also difficult to validate, as they are less linked to distinct military attributes and rather emerge from the interaction of various dual-use technologies, the system software in particular. With regard to weapons systems, the dual-use issue is not a new one; as it relates to unmanned systems, however, it is particularly severe. Distinct identification of certain UWS and verification of their military capabilities (such as the level of automation or the ability to generate a situational picture) are as yet unresolved problems that require new approaches as well as increased awareness of the problem space itself.
On the one hand, in order to identify appropriate starting points for future arms control and (technical) verification instruments in the area of unmanned systems, knowledge and understanding of software technologies and their increasing share in the build-up of capabilities and the automation of unmanned systems must be consolidated. On the other hand, the potential paths of conversion of civilian to military unmanned systems should be mapped and analyzed in technological terms. In this way, essential insight may be gathered as to where future regulatory and verification measures can best be applied.
A potential solution might be to conduct ‘in-depth’ examinations, which could allow more detailed detection and analysis of the individual system components, perhaps even of the system software. On this basis, more accurate conclusions might be drawn as to the military potential and characteristics of an unmanned system. However, the fact that nations want their weapon systems to be treated as ‘black boxes’ within the scope of arms control (meaning they do not allow profound insight into the system or its functioning) makes broad acceptance of this type of verification seem unlikely. Moreover, such time-consuming verification might quickly reach the limits of feasibility if a large number of systems had to be verified.
Furthermore, discourse on arms and export control of UWS should not pause simply because there are no reliable verification instruments yet. To counter this confidence dilemma, at least initially, and to advance the discourse, voluntary transparency measures by those states developing and/or operating UWS could serve as an interim solution to establish confidence.
Helping to Resolve the Conflict Between ‘Autonomy’ and ‘Control’
In the course of nearly any discussion about the future development of UWS, the terms ‘automation’ or ‘autonomy’ are used at some point, and the conditions for adequate ‘human control’ are debated. There is no international consensus today as to what is meant by ‘autonomy’ or ‘human control’, nor as to which system functions these concepts apply to and, above all, how they relate to one another. There is an obvious tension between the concepts of ‘autonomy’ and ‘control’. Every attempt to define these terms clearly at the international level, in a manner generally applicable to UWS, has failed thus far and seems a hopeless pursuit moving forward. Instead of continuing to focus on terminology, the discourse on UWS could also be based on potential deployment scenarios. Such a scenario could, for example, be drawn from the following four factors, each to be assessed in relation to the weapons system used and weighted with respect to one another:
[F1] the technological capabilities of the weapons system (determined by hardware, software, and supporting infrastructure);
[F2] the operational context (objective and robustness of the military operation);
[F3] the complexity of the operational environment (such as battlefield or dynamic urban environment);
[F4] the level of human control (e.g. with regard to system guidance, the generation of a situational picture or dedicated decision processes, such as target selection and weapons employment).
By analyzing combinations of these different factors, the objective would be to identify deployment scenarios that are problematic on legal, ethical or security policy grounds, and to regulate them accordingly. An example would be the employment of a UWS featuring only simple sensor equipment and limited analytic computing performance (F1) that is tasked to identify and eliminate enemies (F2) in an urban environment (F3), while not being operated or monitored by a human operator (F4). The question then arises whether a problematic deployment scenario emerges from this combination of factors, and if so, how such a scenario could be avoided by applying specific regulations.
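The factor-based assessment described above can be illustrated with a minimal sketch. The ordinal scales, the field names, and the flagging rule are purely hypothetical assumptions introduced for illustration; the text stresses that any real scenario approach would still require detailed elaboration and international agreement.

```python
from dataclasses import dataclass

# Hypothetical ordinal scores (0 = low/none, 1 = medium, 2 = high). Both
# the scale and the flagging rule below are illustrative assumptions,
# not part of any agreed assessment framework.
@dataclass
class DeploymentScenario:
    f1_system_capability: int      # sensors, computing, supporting infrastructure
    f2_operational_context: int    # 0 = surveillance ... 2 = lethal engagement
    f3_environment_complexity: int # 0 = open battlefield ... 2 = dynamic urban
    f4_human_control: int          # 0 = unsupervised ... 2 = full human operation

def is_problematic(s: DeploymentScenario) -> bool:
    """Flag combinations where a lethal task meets a highly complex
    environment while neither system capability nor human control
    compensates for the risk."""
    return (s.f2_operational_context == 2
            and s.f3_environment_complexity == 2
            and s.f1_system_capability <= 1
            and s.f4_human_control == 0)

# The worked example from the text: simple sensors and limited analytics
# (F1 low), identify-and-eliminate tasking (F2 high), urban environment
# (F3 high), no human operation or monitoring (F4 none).
example = DeploymentScenario(0, 2, 2, 0)
print(is_problematic(example))  # True
```

The design point is that the rule operates on the weighted combination of factors, not on any single property of the weapons system, mirroring the scenario-based reasoning of the text.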
Such a scenario approach requires detailed elaboration, and in particular, must be feasible. Preliminary work that might be helpful with regard to assessing and describing the factors has already been done.1 Building upon this, it may be possible in a second step to better comprehend the conflicting relationship of autonomous acting and ‘human control’ of unmanned systems, and to differentiate operational forms of ‘autonomy’ from one another without getting lost in rigid definitions. In addition, it might be possible in such a manner to find feasible solutions that may serve to ensure the required measure of ‘human control’ for unmanned systems.
Traditional Arms Control – Regulation of Today’s and Consideration of Future UWS
In past decades, arms control of conventional weapon systems and weapons of mass destruction has been the subject of a number of bilateral and multilateral treaties. For example, strategic nuclear weapon systems were limited (New START treaty),2 carrier systems were disarmed (INF treaty),3 and conventional weapon systems were limited in number and location of employment (CFE treaty).4 There are also a number of transparency- and confidence-building measures, such as the Vienna Document and the UN Arms Register.5 In most cases, these treaties do not explicitly distinguish between manned systems and UWS, which means that, at least in theory, they are applicable to today’s UWS. Unfortunately, these traditional arms control agreements have been increasingly called into question for some years now; once pioneering and successful agreements such as the CFE treaty and the INF treaty are already a thing of the past. Nevertheless, today’s UWS should be examined with regard to options for their regulation by means of traditional arms control. For the agreements still in existence, doubts as to their scope must be eliminated and, where necessary, specific provisions covering today’s UWS must be added. Although this step may initially only succeed for those UWS that have a clear military typology (main weapon categories such as tanks or combat aircraft), it would be an important measure to strengthen the overall basis of trust in conventional arms control. Within the framework of current efforts to revive conventional arms control, e.g. in the form of a CFE successor agreement for Europe, current and future UWS must be addressed from the outset.
In doing so, the issues arising from the dual-use character of the underlying technologies and from the difficulty of differentiating between civilian and military unmanned systems, but above all the question of whether and how conventional arms control can keep pace with future technological developments, must be clearly addressed. Within the scope of nuclear arms control, for example in the negotiations on an extension of the New START treaty or a successor agreement, new unmanned carrier systems and future hypersonic weapons must be taken into consideration in addition to existing carrier systems such as strategic bombers and ballistic missiles.
As development and automation progress, UWS will no longer fit established categories as clearly as previous main weapon types did. In addition to the dual-use issue, the reason for this is their non-physical military potential, which will be determined less by hardware and increasingly by software and networking features. The categorization of weapons systems would therefore have to be reconsidered to take these additional criteria into account, and to allow new types of verification measures to be designed on that basis.
Arms control could increasingly move towards regulating military potentials as a whole, instead of the numbers of previously defined major weapon systems as in the traditional sense. To this end, unmanned systems would in future have to be categorized on the basis of their individual military capabilities and characteristics in order to determine their individual military potential. Such an assessment would be possible by examining their design and technological components. Conclusions could be drawn on (1) general performance parameters (velocity, agility, range), (2) armament capacity (depending on design and payload), (3) special military characteristics (stealth, armour), and (4) the individual degree of automation or autonomy (based on software, sensor technology, data links). However, it seems rather unlikely today that nations would grant such deep insight into their military systems. Conclusions on military potential could also be drawn from an external inspection of the unmanned systems (with regard to design, size, and special characteristics) and a systematic demonstration of their capabilities. One approach could be a ‘Weapon Review Process’ developed and standardized at the international level, as already exists in many cases at the national level. In this manner, however, the capabilities and characteristics of unmanned systems could only be roughly estimated, and the potential for deception is generally very high. It is also not yet clear how individual capabilities would have to be evaluated, or how military potential would ultimately be calculated. The software capabilities of a UWS cannot be assessed from the outside. In addition, such an approach would probably require a very large number of individual unmanned systems to be examined and verified one by one for their capabilities and characteristics, which would quickly push arms control to its capacity limits.
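As an illustration only, the four criteria named above could be aggregated into a single capability score. The normalization to [0, 1], the equal weights, and the linear aggregation are assumptions made here for the sketch; as the text itself notes, no agreed method for evaluating individual capabilities or calculating military potential yet exists.

```python
# Illustrative sketch: aggregating the four assessment criteria from the
# text (performance, armament capacity, special characteristics,
# automation) into a single 'military potential' score. Weights and the
# linear model are arbitrary placeholder assumptions.
def military_potential(performance: float,
                       armament_capacity: float,
                       special_characteristics: float,
                       automation: float,
                       weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    """Combine normalized criterion scores (each in [0, 1]) into a
    single potential score in [0, 1] via a weighted sum."""
    scores = (performance, armament_capacity, special_characteristics, automation)
    if not all(0.0 <= s <= 1.0 for s in scores):
        raise ValueError("criterion scores must be normalized to [0, 1]")
    return sum(w * s for w, s in zip(weights, scores))
```

Even this toy model makes the text’s objection concrete: the automation score depends on software internals that cannot be assessed from the outside, so the input to any such formula is exactly what verification cannot reliably supply.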
For now, traditional arms control will probably only be feasible for UWS whose military potential can be clearly identified externally and verified with existing methods, but this should not prevent us from ‘re-thinking’ arms control.
Developing New Approaches to Regulation
Traditional arms control must be extended by new approaches to meet the new challenges. Rules of engagement for UWS applicable throughout the world might be a reasonable and effective addition to arms control. The particular attraction of such rules is that they would not apply directly to an unmanned system or its individual technological components, but would regulate the use and operational context of unmanned systems in general, with the objective of regulating certain military effects. Such impact- and context-based regulation would not be confronted with the dual-use issue or the difficulties of categorizing UWS. An international arrangement on clear rules of engagement could thus help contain the potentially destabilizing effects of the use of unmanned systems. On the one hand, such rules could decelerate machine-based decision-making processes in certain military operations where relying on time-critical automated decisions would be too dangerous. On the other hand, it could be agreed that certain analysis processes, conclusions or decisions will be reserved for human beings only, that operations with UWS in urban areas will be restricted, or that military operations must take place exclusively in a battlefield environment. There are three potential categories of rules of engagement: (1) spatial restrictions on deployments, (2) operational restrictions on deployments, and (3) capability restrictions. Such rules of engagement also offer a link to the international discourse on autonomous weapon systems within the scope of the CCW, where demands for ‘meaningful human control’ or an ‘appropriate level of human judgment’ are already being discussed. So far, however, workable concepts of implementation are lacking. An essential challenge is the verification of compliance with rules of engagement, for which no proven verification mechanisms exist to date.
Initial considerations have been made as to how, for example, an independent, international monitoring of the employment of UWS could be made workable.6 There is still a considerable need for research on this complex set of issues.
Regulation of Weapons-relevant Key Components
In most cases, the focus of traditional arms control is on entire weapons systems. However, modern weapons systems are becoming increasingly complex, and consist largely of dual-use components, which makes their regulation as a whole more difficult. However, if specific technological components or armaments could be identified that are significant for certain categories of UWS and their military capabilities, these key components alone would create a lever for arms control regulations. In the future, arms control could then concentrate to a greater extent on these weapons-relevant key components instead of a complex weapons system per se. Examples of the specific identification and description of relevant key components can already be found within export control. For instance, the Missile Technology Control Regime (MTCR) lists subsystems and technological components that are relevant for the construction of military ballistic missiles or drones, and are therefore subject to export restrictions or bans.7 Applying arms control to weapons-related key components could be an interesting approach that could benefit from previous lessons learned in export control.
Prohibition and Ban of Certain Future Weapons Developments
The need for arms control is often only realized once weapons systems have already been fielded and their implications for security policy and international law can be observed directly. The traditional approaches to arms control therefore revolve primarily around the regulation and limitation of existing weapons systems. However, once weapons systems have been stationed throughout the world and on a broad scale, it is much more difficult to prohibit or regulate them. Nevertheless, there are also arms control approaches that aim at intervening in the development of certain weapons or prohibiting them in advance. On the one hand, there are the negotiations within the scope of the UN Convention on Certain Conventional Weapons (CCW), whose task it is to discuss the unlawfulness of certain weapons or weapon applications and, if necessary, to prohibit them, as was the case with blinding laser weapons. In this vein, the CCW has also been holding expert talks on Lethal Autonomous Weapon Systems (LAWS) since 2014. On the other hand, the concept of Preventive Arms Control (PAC) provides an approach that focuses on the security policy implications of future weapons.8 The objective of PAC is to identify in advance the potentially destabilizing consequences of future UWS and to regulate technological research and development paths in such a way that critical systems are never fielded or employed.
In both cases, an awareness of future UWS risks under international law or security policy must first be raised on a broad international front and their containment must be seen as a win-win situation for all parties concerned. In addition to the willingness to engage in intensive political debate, this process requires in particular a comprehensive analysis and technology impact assessment of both new weapons technologies and future UWS.
In the end, it is conceivable that automated UWS capable of, and intended for, making life-or-death decisions in a mission without human intervention could be prohibited or banned. Conversely, the question arises under what circumstances sufficient human control would be guaranteed. The current discourse on meaningful human control reveals numerous obstacles and shows how difficult it is to achieve a common international understanding of this. A pragmatic solution, however, could lie in requiring UWS to be programmed in such a way as to ensure rule-based, fundamental protection of human life. Such ‘laws of robotics’ were first formulated in literature by Isaac Asimov.9 An automated UWS could thus be prohibited from self-directed target analysis or selection, and from the killing of humans in general. A human operator, however, would still be free to take full control of the system to carry out a lethal weapons employment. The difference from many previous control approaches is significant. In most cases, the human operator has only had a right of veto, for example to stop a machine target selection or an automated weapons employment. If, however, UWS are programmed to protect human life in all cases of doubt, this programming would have to be overridden for a lethal attack by a human being consciously taking over full control of the unmanned weapons system. It is then up to the human operator to analyze the situation, determine targets, and initiate the employment of weapons, which means that in the end the human truly bears direct responsibility for the act of killing. However, ensuring that such programming of UWS is permanently secure and verifiable is not trivial, and trusted ideas for implementation have so far been lacking. For this reason, Isaac Asimov’s laws of robotics will remain science fiction for the time being.
Limiting Proliferation?
Over the last decade, unmanned military systems have experienced considerable proliferation, and the number of actors operating with UWS has steadily increased (horizontal proliferation). In addition, a number of nations are making increased efforts to further develop the technology of UWS, and to advance their automation. The progressive integration of these systems into various military domains and the emergence of new spheres of operation can also be observed in this context (vertical proliferation). The steady proliferation of UWS goes hand in hand with the proliferation of their security risks.
The commitment and interest of the international community in the non-proliferation of UWS has not been very strong so far. On the contrary, for some years now there has even been an erosion of existing export control regimes (e.g. the MTCR), since economic interests in the export of UWS are gaining influence in many nations.10 In addition to the export of complete military unmanned systems, it is primarily dual-use technologies that contribute significantly to the proliferation and military adaptation of unmanned systems, opening up various proliferation paths. How this dual-use issue can be better addressed in export control remains an unsolved problem. A starting point could be provided by lessons learned from the ‘general-purpose’ criteria used in the regulation of biological and chemical weapons. It is also conceivable that an international export monitoring system could be established to track the global export and import of those dual-use key components that are not only significant for the development of UWS but also considered problematic. If a critical combination of such key components could be attributed to a single actor, this could be interpreted as an indication of an intended weapons development. In response, international export restrictions on that actor and the related dual-use technologies could be put into effect. To this end, however, it is first necessary to clarify internationally which kinds of UWS should be regulated, and which combinations of key components are critical. It should always be borne in mind that export controls can quickly take on a discriminatory character, and thus lose legitimacy, if they do not apply equally to all nations.
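The export-monitoring idea sketched above can be illustrated in a few lines. The component names and the ‘critical combination’ used here are hypothetical placeholders; as the text notes, which combinations actually count as critical would first have to be agreed internationally.

```python
from collections import defaultdict

# Placeholder for an internationally agreed critical set of dual-use key
# components; these names are invented for illustration only.
CRITICAL_COMBINATION = {"flight_controller", "satellite_datalink", "weapons_station"}

class ExportMonitor:
    """Toy model of an international export monitoring system: record
    which dual-use key components each actor accumulates, and flag
    actors whose holdings cover the critical combination."""

    def __init__(self):
        self._holdings = defaultdict(set)  # actor -> components received

    def record_export(self, actor: str, component: str) -> None:
        self._holdings[actor].add(component)

    def flagged_actors(self) -> set:
        """Actors whose accumulated imports cover the critical combination."""
        return {actor for actor, components in self._holdings.items()
                if CRITICAL_COMBINATION <= components}
```

The flagging logic is a simple set-inclusion test per actor; a real system would face the far harder problems of attribution, re-export tracking, and agreeing on the critical set in the first place.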
The proliferation and automation of UWS are increasing worldwide. While the international legal dimension of the use of autonomous weapon systems has been discussed for several years within the scope of the UN CCW, the security policy implications of UWS have so far lacked the necessary attention. In a time of increasing international tensions and a weakened international security architecture, the security policy risks potentially associated with UWS pose a particular threat to stability and peace. Arms control can successfully contain this danger, a lesson learned from the Cold War that now seems to have been forgotten. However, the traditional approaches to arms control alone can no longer meet the requirements posed by modern UWS. Arms control must therefore be rethought and should not hesitate to take bold steps. There is no point in glossing over it: this will be neither an easy nor a fast process. Many questions remain open, and creative solutions must be worked out so that confidence in arms control can be ensured in the future as well. Some more or less realistic approaches have been presented here. An important first step would be to raise awareness of the security policy risks of UWS and to gain international recognition of the benefits and necessity of arms control. If the will is there, answers to the many open questions will be found; this much mankind has proven time and again.
i. The following sections are based on considerations of the author, which were first published in: Alwardt, Christian. (2019). Unbemannte Systeme als Herausforderung für die Rüstungs- und Exportkontrolle. In: Werkner IJ., Hofheinz M. (eds). Unbemannte Waffen und ihre ethische Legitimierung. Gerechter Frieden. 85-109. Springer VS, Wiesbaden.
1. For examples, see: Dickow, Marcel, Anja Dahlmann, Christian Alwardt, Frank Sauer & Niklas Schörnig (2015). First Steps towards a Multidimensional Autonomy Risk Assessment (MARA) in Weapons Systems, IFAR Working Paper 20, Dec. 2015, https://ifsh.de/file-IFAR/pdf_deutsch/IFAR-WP20.pdf (Accessed 7 Aug. 2020). / Alwardt, Christian & Martin Krüger (2016). Autonomy of Weapon Systems, Food for Thought Paper, Feb. 2016, https://ifsh.de/file-IFAR/pdf_english/IFAR_FFT_1_final.pdf (Accessed 7 Aug. 2020).
2. New START Treaty, for further information see: https://www.armscontrol.org/factsheets/NewSTART (Accessed 7 Aug. 2020).
3. The Intermediate-Range Nuclear Forces (INF) Treaty, for further information see: https://www.armscontrol.org/factsheets/INFtreaty (Accessed 7 Aug. 2020).
4. The Conventional Armed Forces in Europe (CFE) Treaty, for further information see: https://www.armscontrol.org/factsheet/cfe (Accessed 7 Aug. 2020).
5. For Vienna Document, see: https://www.osce.org/fsc/86597 (Accessed 7 Aug. 2020) & for UN Arms Register, see: https://www.un.org/disarmament/convarms/register/ (Accessed 7 Aug. 2020).
6. Altmann, Jürgen & Mark Gubrud (2013). Compliance Measures for an Autonomous Weapon Convention, ICRAC Working Paper #2, May 2013, https://www.icrac.net/wp-content/uploads/2018/04/Gubrud-Altmann_Compliance-Measures-AWC_ICRAC-WP2.pdf (Accessed 7 Aug. 2020).
7. For examples refer to the Missile Technology Control Regime (MTCR) website, https://mtcr.info/mtcr-annex/ (Accessed 7 Aug. 2020).
8. Mutz, Reinhard & Götz Neuneck (eds.) (2000). Vorbeugende Rüstungskontrolle. Ziele und Aufgaben unter besonderer Berücksichtigung verfahrensmäßiger und institutioneller Umsetzung im Rahmen internationaler Rüstungsregime, Nomos, Baden-Baden.
9. Asimov, Isaac (1942). Runaround, Astounding Science Fiction, Vol. 29, Issue 1, 94–103.
10. CNAS (2017). Drone Proliferation–Policy Choices for the Trump Administration, Center for a New American Security, Jun. 2017, http://drones.cnas.org/wp-content/uploads/2017/06/CNASReport-DroneProliferation-Final.pdf (Accessed 7 Aug. 2020).
Dr Christian Alwardt
Dr Christian Alwardt is a senior researcher in the ‘Arms Control and Emerging Technologies’ project at the IFSH, where he also heads the research project ‘Algorithms and Artificial Intelligence as Game Changers?’. Christian Alwardt has been conducting technology assessments since 2008, with his research focusing on topics of security and arms control policy, as well as the future of warfare. He is actively involved in expert committees, acts as a media contact and advises ministries and committees. Christian Alwardt initially studied physics, business administration and international relations at the University of Hamburg and graduated with a degree in physics. As part of his research at the CLISAP Cluster of Excellence, he received his doctorate in natural sciences.