
Moral Dilemma (26): Killer Robots

If you haven’t been following the debate: “killer robots” are autonomous machine combatants currently being developed by a number of countries. These machines are different from unmanned ground vehicles and drones. The important difference is not that UGVs and drones are already in use, but that they are controlled by human operators, much like any conventional weapon before them, albeit from a greater distance. Killer robots, on the other hand, will act autonomously: they will be able to choose and fire on targets of their own will, so to speak, without any human intervention. The only element of human control will be their programming. (Some such robots apparently already exist.) It would be wrong to think that they’ll look like The Terminator, but they will be somewhat similar in their mode of operation.

I personally don’t know what to think about this. Most in the human rights community are adamantly opposed, so my priors should push me in the same direction. However, I can see certain benefits, which is why I believe that we’re dealing here with a moral dilemma. Let me try to explain by listing some of the arguments against killer robots as well as some of the reasons why these arguments aren’t really as good as they sound.

  1. We shouldn’t give machines the power to decide who lives and dies on the battlefield. But is that really what this is about? The machines would still execute a program written by humans, and it’s these humans who, through the act of programming, decide who should be killed or spared. For instance, it should be possible for robots to recognize civilians – or at least to refrain from acting when there’s reasonable doubt about a person’s combatant status. However, one might reply that robots can’t be programmed in such a way that they acquire a subtle understanding of all the different and complex circumstances in which they’ll find themselves. They can only apply general rules in a manner that is more or less blind, i.e. one that takes into account only a limited set of foreseeable circumstances (see the sketch after this list). They’ll never fully understand the contexts in which they act, and this lack of understanding is bound to cause civilian harm. But then again, can we rely on soldiers in the heat of battle to fully understand the circumstances in which they find themselves? I think not. If anything, a very selective robot should do better.
  2. Robots lack compassion. Maybe removing the emotional element from combat will turn out to be a net benefit. Human fighters can indeed show compassion and spare the lives of the innocent – or even the guilty. But is it not more common for human fighters to be led astray by their emotions? Stress and fatigue may lead to a loss of control. In-group bias and other prejudices may get the upper hand in the thick of a fight. Robots are obviously immune to stress and fatigue, and it should be possible to program them in such a way that they don’t act on the basis of biases. Even the need for self-defense can cause human soldiers to “spray the lot” in order to get out alive. A robot won’t have that instinct.
  3. Killer robots can be given immoral orders by an immoral chain of command. It’s true that robots can be programmed to kill indiscriminately or to kill all brown people. But history is full of human commanders giving exactly the same kind of orders. If robots are programmed in immoral ways, then that’s an easier problem to solve than the prejudices or emotional failures of scores of individual soldiers and commanders. Of course we’ll have to monitor the people who program the robots. But is this more difficult than monitoring the immoral orders of human leaders? Obviously not. It’s true that monitoring will be easier in democracies, but if dictators want killer robots there’s not a lot we can do to stop them or to convince them to use robots in an ethical manner.
  4. If one side uses killer robots, the other has to as well. No one can risk a disadvantage in the conduct of war. A robotic arms race will be the result. In a sense, this is nothing new. Warring parties have always wanted new weapons and an advantage over the enemy. That’s what war is about, and has always been about. To the extent that an arms race is deplorable, killer robots aren’t more deplorable than any other new weapon. If the new robotic arms race results in all sides using only robots, then that would even be a net benefit, because human soldiers would no longer be needed. We tend to focus on the harm to civilians in the conduct of war, but soldiers are also human beings, and in many cases human beings who haven’t chosen to fight.
  5. With killer robots, there’s less risk, less skin in the game. We don’t even see what happens. The result will be more war. This is probably the strongest argument, but again not one directed solely against killer robots. Drones have the same effect.
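To make the point in (1) about “blind” general rules concrete, here is a minimal sketch of what a refrain-when-in-doubt targeting rule might look like. This is purely illustrative: the Contact type, the classification labels, and the confidence threshold are my own assumptions, not features of any actual weapons system. It also shows why such a rule is blind: it can only weigh the fields its programmer anticipated.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    """A sensed person or vehicle, as classified by the robot's sensors.
    (Hypothetical type, invented for illustration.)"""
    classified_as: str   # e.g. "combatant", "civilian", "unknown"
    confidence: float    # classifier confidence in [0, 1]

def may_engage(contact: Contact, threshold: float = 0.95) -> bool:
    """Apply a fixed, general rule: engage only a target classified as a
    combatant with high confidence; in every other case (civilian,
    unknown, low confidence) refrain. The threshold is an assumption."""
    if contact.classified_as != "combatant":
        return False
    return contact.confidence >= threshold

# The rule is "blind" in the sense of argument (1): it sees only a label
# and a score, never the wider context a human observer might grasp.
print(may_engage(Contact("combatant", 0.99)))  # True
print(may_engage(Contact("combatant", 0.60)))  # False: reasonable doubt
print(may_engage(Contact("unknown", 0.99)))    # False: not a combatant
```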

Given these considerations, in addition to many others I haven’t mentioned, I still can’t come down firmly on either side. Hence the dilemma.

There’s an interesting podcast on the topic here. If you want to add your voice to previous moral dilemmas, you can do so here.


2 thoughts on “Moral Dilemma (26): Killer Robots”

  1. robotman says:

    This all sounds very logical, but unfortunately it rests on false premises about what programming is. These programs have to adapt to the unanticipated actions of an adaptive enemy and are therefore unpredictable.

    But worse, the commentary here is anthropomorphic. We are talking about weapons that are launched by the same humans who can seek revenge or be compassionate. Do you think that the robots can think or have any kind of agency?
