The opportunities and risks associated with using autonomous aircraft for military purposes
Drones are now used by the majority of armies in the world, especially in the West, where they are considered indispensable. The primary application for most drones is reconnaissance on all kinds of missions: strategic, operational, and tactical. Like any other military instrument, drones offer advantages but also present risks. TRANSFER spoke with Dr. Olaf Theiler, section head for future analysis at the Bundeswehr Planning Office, about both sides of the coin, how to deal responsibly with these issues, and a variety of other topics. Our interview took place at the #techourfuture event on the future of autonomous flying.
There are currently a variety of drone models on the market for military use, each matched to different application scenarios. There are large devices such as the Global Hawk produced by Northrop Grumman; this is used for strategic reconnaissance purposes and is sent on long-haul flights at high altitudes during monitoring missions. For operational missions, medium-altitude drones capable of flying for up to 50 hours are typically used, such as the IAI Heron produced in Israel. Then there are a multitude of smaller drones, which are primarily used for tactical, near-range reconnaissance. A good example of this is the LUNA drone used by the Bundeswehr (the German Federal Armed Forces). With a wingspan of roughly four meters and a two-meter fuselage, it is relatively large and has a reconnaissance range of approx. 100 km. There is now also a whole host of smaller drones for providing short-term local reconnaissance data, from miniature UAVs (unmanned aerial vehicles) to nano-drones. The Bundeswehr uses the EMT Aladin in this class; it has a wingspan of approx. 150 cm. There are even smaller versions on the market, some no bigger than the palm of a hand, although the range and flight duration they offer are correspondingly lower.
AREAS OF APPLICATION FOR MILITARY DRONES
All of these reconnaissance drones are primarily intended to protect soldiers during military operations. They make it possible to monitor areas and routes without risking the lives and limbs of military personnel. Drones can also be used to accompany transportation missions and patrols, and thus protect personnel from ambushes. Some countries use drones for offensive operations, however – in other words, they use armed drones that can be steered remotely to attack moving targets. The best-known examples of this are the US Predator drone, which has since been withdrawn from service, and its successor, the Reaper.
A variety of military forces and intelligence services use comparable systems because of the enhanced accuracy they offer during weapon deployment, although that is not the case in Germany or with the Bundeswehr. In the fall of last year, the German government agreed for the first time to buy drones for the Bundeswehr that could, in principle, be armed, but that will not be procured with the corresponding weapon systems. This effectively means that for the foreseeable future, the Bundeswehr will continue to use drones for reconnaissance purposes only.
One of the main motivations for using remotely controlled drones is to protect military personnel, since many of these kinds of reconnaissance operations pose a high risk to troops. Remotely controlled drones can also remain in the air far longer than conventional aircraft or helicopters, which have to return to base after a relatively short time to refuel.
Given the increasingly complex nature of military operations and crisis areas, and the strong desire, particularly among European countries, to protect their military forces, doing away with the different types of drones would now be unimaginable for military forces in the West.
ONLY TO BE USED FOR ETHICALLY JUSTIFIABLE REASONS
Two potential risks are usually highlighted in the context of drones. One argument that is often heard is that the physical distance between drone pilots and their potential targets reduces emotional stress, essentially making weapon deployment, perhaps even as an act of war, easier and more likely. On the other hand, it is known that deploying weapons from drones provides much closer images, and thus a far more direct impression of a target, than simply flying past in an aircraft or working from an artillery position.
As a result, drone pilots can in fact be placed under extreme emotional stress in such situations. In democratic countries, using weapons in this way always requires parliamentary approval. This makes it appear unlikely that merely having the capability to use drones for military operations will reduce inhibitions (although this does not apply to deployment by intelligence services, which sometimes use drones instead of military personnel).
The second suspicion is that drones are merely a precursor to killer robots – that they are paving the way for the kind of autonomous weapon systems you see in Hollywood blockbusters like Terminator. However, neither “strong AI” nor “actual autonomy” (the ability to make independent decisions) will be technologically achievable for the foreseeable future. Moreover, handing over control to machines would be an undesirable situation for both military and political leaders.
Accordingly, all military forces in the West, including the Bundeswehr, currently emphasize that all military action must remain subject to human intervention, so that any decisions made are ethically justifiable and legally verifiable. It is inevitable that military and technological developments will result in certain systems being automated, simply because of the accelerating pace of the battlefield. This is especially likely to happen with defensive weapons, but it is unlikely that targets will be selected by machines in the foreseeable future – so we are unlikely to see autonomous killing.
“INFORMATION CAN AT LEAST HELP GIVE FUTURE DEVELOPMENTS A CHANCE”
An interview with Dr. Olaf Theiler
Hello, Dr. Theiler. Why do you believe it is important to keep society informed about future technologies?
Emerging technology often comes with a lot of hype, so you get exaggerated expectations and fears coming in through the media, and this usually has a negative impact on perceptions among the general population – even before a technology has had a chance to prove itself in use. What then follows is rapid over-regulation, which hampers meaningful development in the long term, across society as a whole, because of short-term, fragmented resistance and the fear of losing control. Information, or, to be more accurate, actual dialog, about emerging technology can at least give future developments a chance.
What kinds of concerns do you encounter in your work regarding emerging technology?
Basically, when it comes to the Bundeswehr and military security policy, the fear that something could be misused becomes the benchmark for judging a new technology. It’s mainly an emotional reaction, and it often prevents people from thinking about an issue rationally or considering the real strengths and weaknesses of a promising technology.
But what’s worse is when some vague possibility – that one of these technologies might be used for unethical reasons – actually prevents us from using perfectly legal means to stop others from misusing the technology, for example because a certain issue becomes a political no-go.
Where and how would you personally use autonomous aircraft, and/or in which situations would you not use them?
Neither strong AI nor actual autonomy will be with us in the foreseeable future. So inevitably, the application areas for partially autonomous aircraft will remain limited. They can be programmed to fly a route from A to B, fly that route independently, and avoid potential obstacles on the way. There are myriad ways to use this in civil applications, even if it can’t give us answers to the problems of mass transportation. In military applications, you can’t assume that any objects you need to avoid will be passive; you have to assume there’ll be an enemy actively trying to disrupt the flight or even bring the aircraft down completely. Under such circumstances, limited autonomy can only be used in very narrowly defined situations. Transportation in the hinterland, away from the front line, is conceivable, or peacetime operations at home, or maybe even transporting wounded personnel back from operations.
But getting actively involved in combat – using weaponry autonomously, with no human control – would not only be unjustifiable in ethical terms, it would hardly be of any use in military terms, partly because such systems are much too easy to disable or target.
Contact
Dr. Olaf Theiler (author)
Section Head
Zukunftsanalyse, Planungsamt der Bundeswehr (Berlin)