10 The Morality Of Using Robot Warriors
Bradley J. Strawser of the US Naval Postgraduate School argues that the US has both the right and the moral obligation to use drones in battle. He believes that they’re a “moral improvement” on other forms of air combat for two reasons. First, drones reduce civilian casualties by striking their targets more precisely instead of blowing up everything in the area. Second, drone pilots are better protected from harm than those who fly conventional planes. To make his point, he cites World War II, where he feels it would have been morally preferable for the Allies to use drones rather than risk having their pilots shot down by the Nazis. Strawser says that we need to separate the morality of using robot warriors from US military policy. If a mission is morally justified, he feels drones should be used to carry it out; if you don’t like US policy, you should object to the policy, not the means of executing it. For comparison, Strawser gives two examples of weapons that should never be used, even for just causes. One is land mines, because they can’t distinguish between a soldier and a child. The other is lethal autonomous robots (LARs), which kill without human input (and which do not currently exist). Of course, if you believe that war is never justified, Strawser understands that you won’t accept his argument. But he believes that some wars are justified, and in those cases, drones are morally superior weapons for air combat.
9 The Immorality Of Using Robot Warriors
Murtaza Hussain, a journalist with The Intercept, considers Bradley Strawser’s arguments about the morality of using drones to be propaganda. Hussain acknowledges that we’ve advanced from swords and muskets to technology that kills more effectively, from greater distances, and with reduced risk to personnel. But he thinks that low-cost drones that avoid pilot casualties are a game changer because they shield war from public scrutiny. Hussain contends that US citizens don’t even realize their government has been waging war in Somalia, Yemen, and parts of Pakistan. He feels that if the US public had to bear the costs of an expensive war with mounting US casualties, they’d pressure the US government to end needless conflicts. Hussain also argues that drones don’t allow combatants to surrender, even if the drone pilot can see someone putting his hands up. A drone pilot has only two options: kill or let the combatant go. There can be no capture. However, Hussain admits that the surrender provision of the Geneva Conventions is probably outdated because other modern forms of warfare, such as manned aircraft, also don’t permit enemy surrender. Finally, Hussain denounces “signature strikes,” where drones hit human targets whose identities are unknown but who appear to have a link to enemy activity. The Obama administration defined these combatants as “all military-age males in a strike zone.” Hussain calls this a “kill first, ask questions later” policy of execution without trial.
8 Death For Sale Or Barter
We’ve talked elsewhere about the “Special Operations Forces Exhibition and Conference,” where military men shop for sophisticated weapons, including drones, in a casual and party-like atmosphere. If you have the money, you get the weapon. No one may know precisely how another person intends to use the weapons they buy, but everyone realizes that these weapons will probably be used to kill and may result in civilian casualties. But what about sovereign leaders who do know how specific weapons will be used? These leaders know that the weapons may cause civilian casualties in their own countries, and yet they see those deaths as an acceptable trade-off. Mark Mazzetti, in his book The Way of the Knife, describes a deal between the US and Pakistani governments in 2004: In exchange for permission to fly drones in Pakistani airspace to hunt US enemies, the US first had to kill a Pakistani enemy, Nek Mohammed Wazir. Under this agreement, the US received limited access to Pakistani airspace, provided its drones stayed away from Pakistan’s nuclear facilities and other areas the Pakistani government had placed off-limits. The deal supposedly included an agreement that the US would occasionally strike Pakistan’s enemies, but “would never acknowledge the missile strikes and that Pakistan would either take credit for the individual killings or remain silent.” One argument supporting this exchange says that Mohammed may have been an “Al-Qaeda facilitator” and so was a US enemy as well as a Pakistani one. Another says that a drone strike was preferable to the Pakistani government’s bloody ground campaigns against Mohammed, which caused far more civilian casualties.
7 Misleading Language And Statistics
Mary Wareham, coordinator of the Campaign to Stop Killer Robots, says, “We put ‘killer robots’ in the title of our report to be provocative and get attention. It’s shameless campaigning and advocacy.” Tom Malinowski of Human Rights Watch admits to using the same kind of sensational language for similar reasons. The pro-drone side, with members such as DARPA, goes to the opposite extreme: if they don’t want to deal with an ethical issue, they simply don’t talk about it. One example is the future possibility of robot warriors acting on their own without human commanders. P.W. Singer, author of Wired for War, calls it “The-Issue-That-Must-Not-Be-Discussed.” Then there’s the potential misuse of statistics, assuming you can get any. On the anti-drone side, researchers James Cavallaro, Stephan Sonnenberg, and Sarah Knuckey say their studies in Pakistan show that drones have caused serious harm to civilians: injuries, deaths, and the constant fear that a strike could come at any time. These critics believe the lack of detailed information about drone strikes from the American and Pakistani governments is an attempt to avoid accountability in a public forum. On the pro-drone side, C. Christine Fair of Georgetown University claims that forensic experts haven’t verified the numbers and types of injuries attributed to drones, especially in Pakistani tribal areas. She believes that some injuries and deaths may have been caused by terrorist attacks or by Pakistan’s own military operations. With little information from the American or Pakistani governments, she says we can’t know the real death toll or whether the targets were “drone worthy.” She feels that drones are the best option in certain Pakistani tribal regions because ground troops would be more deadly and could uproot or destroy entire communities.
6 Humanitarian Military Interventions
This theory is the brainchild of bioethicists Zack Beauchamp and Julian Savulescu. They reverse one of the arguments against robot warriors—that they make war more likely by making it too easy—and suggest that easier wars are a good thing in certain circumstances. Beauchamp and Savulescu claim that easier warfare encourages us to engage in “just wars” like humanitarian military interventions that we would have previously avoided. An example is the genocide in Rwanda, where the US was criticized for failing to intervene as Hutu extremists set out to exterminate the Tutsis. Critics believed that the US didn’t want its troops mired in another ground conflict like the earlier humanitarian effort in Somalia. By minimizing or even eliminating military casualties for the humanitarian forces, robot warriors make it easier, politically and financially, to stop genocides and other mass atrocities.
5 Emotional Attachment To Robots
University of Washington researcher Julie Carpenter studied how highly trained military personnel feel about the robots they work with. Together, soldiers and robots detect, inspect, and disarm explosives. She wondered if soldiers might develop attachments to robots as though they were pets. If so, how would soldiers react if their robots were damaged or destroyed? Would those feelings change how soldiers used robots on the battlefield? In Carpenter’s study, the soldiers seemed able to make the necessary decisions despite sometimes feeling anger, frustration, and sadness when a robot was destroyed. But these military men named their robots—after celebrities, girlfriends, and wives (though never exes). “They were very clear it was a tool, but at the same time, patterns in their responses indicated they sometimes interacted with the robots in ways similar to a human or pet,” Carpenter observed. “They would say they were angry when a robot became disabled because it is an important tool, but then they would add ‘poor little guy,’ or they’d say they had a funeral for it.” These robots looked like machines. But with future robots designed to look more like animals or people, Carpenter wonders if soldiers will feel affection for robots the way they would for pets or human partners. Will their feelings affect their ability to make rational decisions? Could we see an outpouring of emotion and changes in laws like those that followed the stabbing of Rocco, a German shepherd K-9 officer, in the line of duty in Pennsylvania in 2014? About 1,200 people attended the dog’s funeral, and Pennsylvania passed “Rocco’s Law” to increase the penalties for injuring or killing a police dog or horse.
4 The Effects On Survivors In Strike Zones
We spend a lot of time arguing over body counts while rarely considering the experiences of those left alive in the countries where drones strike. Post-traumatic stress disorder (PTSD) is widespread among survivors, especially children. Among other things, survivors may experience anxiety, insomnia, anger, and paranoia. Clear blue skies, ideal flying weather for drones, often trigger these symptoms. “I no longer love blue skies,” said 12-year-old Zubair, whose grandmother was killed by a drone. “I used to play outside all the time. There was hardly ever a time I would be indoors. Now I’m afraid.” When former New York Times reporter David Rohde was held for months by the Taliban in Pakistan, he said the drones scared his captors as well as civilians. As Rohde described this “hell on earth” experience, “The drones were terrifying. From the ground, it is impossible to determine who or what they are tracking as they circle overhead. The buzz of a distant propeller is a constant reminder of imminent death.” Farea Al-Muslimi, a prominent pro-American democracy activist from Wessab, Yemen, openly hates Al-Qaeda. He says that Yemenis use their anger over drones as a way of venting about other issues like unemployment and lack of health care. But he admits that people want revenge for relatives killed by drones. Drone strikes “are the face of America to many Yemenis,” says Al-Muslimi. “If America is providing economic, social, and humanitarian assistance to Yemen, the vast majority of the Yemeni people know nothing about it. Everyone in Yemen, however, knows about America and its drones.” He used to tell people from his village stories about the opportunities in America and his friendships with its citizens. But now he fears that he can’t return to Wessab because of his close association with the US. He believes that Al-Qaeda benefits from the killing of innocent civilians by drone strikes.
3 The Effects On Drone Pilots
Some drone pilots experience PTSD just like their counterparts in direct combat. “People think we’re sitting here with joysticks playing a video game, but that’s simply not true,” said Slim, a retired Air Force drone pilot. “These are real situations and real-life weapons systems. Once you launch a weapon, you can’t hit a replay button to bring people back to life.” According to Nancy Cooke of Arizona State University, drone pilots may be more affected by killing than fighter pilots: fighter pilots face the danger of being shot down and leave the scene quickly, while drone pilots monitor the aftermath and see the gruesome results of their attacks. Psychologists are beginning to understand that, for drone operators, PTSD is rooted in “moral injury”: their feelings about what they’ve done to other people. As Brandon Bryant, a former drone camera operator, describes it, “It’s more like I’ve had a soul-crushing experience. I was never prepared to take a life.” Bryant was involved, directly or indirectly, in the deaths of over 1,600 people. One of his most haunting experiences came when his drone fired a missile at a compound supposedly housing only cows and goats. Right before impact, he saw a small figure, perhaps a child, run around a corner into the missile’s path. It was too late to abort. There was a momentary flash, and the small figure was gone. An intelligence observer told them, “Per the review, it’s a dog.” But they knew it wasn’t a dog. As former drone intelligence analyst Heather Linebaugh explains, “What the public needs to understand is that the video provided by a drone is not usually clear enough to detect someone carrying a weapon, even on a crystal-clear day with perfect light. The feed is so pixelated, what if it’s a shovel, and not a weapon? We always wonder if we killed the right people, if we destroyed an innocent civilian’s life all because of a bad image or angle.” She goes on to say that she watched people die again and again, in such detail that the images became seared into her mind, haunting her. According to Linebaugh, the trauma comes not just from the memories but from the guilt of wondering whether she killed innocent people.
2 Advanced Robots Of The Future
According to Bill Gates, the robotics industry today is where the computer industry was 30 years ago, which means we’ll soon have to deal with ethical and legal questions about robot warriors that can act without human input. Linda Johansson of the KTH Royal Institute of Technology believes that we’ll have to revise the international laws of war. If soldiers are permitted to kill enemy soldiers in wartime, will it be legal to attack a drone operator in a distant location? Under the laws of war, is a robot warrior legally permitted to defend itself when it has no life to lose? Taking that a step further, does such a robot ever have the right to kill human soldiers? Roboticist Noel Sharkey also worries that drones will soon fly so fast that a human will have no time to abort an attack. That’s partly why Ronald Arkin of Georgia Tech wants to develop an “ethical governor” now for future advanced robot warriors. Others want an outright ban. But Arkin believes that autonomous robots are inevitable. “Someone has to take responsibility for making sure that these systems . . . work properly. I am not like my critics, who throw up their arms and cry, ‘Frankenstein! Frankenstein!’ ” Meghan Neal argues that so-called killer robots will probably act more ethically than human soldiers because they won’t be driven by anger, fear, or revenge. But that same lack of emotion would also keep a robot warrior from questioning an order to commit a war crime. Even so, Filip Spagnoli believes it’s easier to correct a robot’s programming than to erase prejudice or emotional shortcomings in humans.
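To make the concept concrete, here is a minimal sketch of what an “ethical governor” might look like in software: a layer that sits between targeting and weapon release and vetoes any strike that violates a hard constraint. The class names, rules, and threshold values below are hypothetical illustrations of the general idea, not Arkin’s actual design; note how Sharkey’s worry about abort time shows up as an explicit constraint.

```python
# Hypothetical sketch of an "ethical governor": a software layer that can
# veto a proposed strike unless every hard constraint is satisfied.
# All names, rules, and thresholds are illustrative, not Arkin's design.

from dataclasses import dataclass


@dataclass
class ProposedStrike:
    target_confirmed_combatant: bool    # positive identification of the target
    estimated_civilian_casualties: int  # collateral-damage estimate
    seconds_until_impact: float         # time a human would have to abort


class EthicalGovernor:
    """Approves a strike only if no hard constraint is violated."""

    MAX_CIVILIAN_CASUALTIES = 0    # illustrative rule: no anticipated civilian deaths
    MIN_HUMAN_ABORT_WINDOW = 10.0  # illustrative rule: humans need time to intervene

    def authorize(self, strike: ProposedStrike) -> bool:
        if not strike.target_confirmed_combatant:
            return False  # veto "signature strike"-style unidentified targets
        if strike.estimated_civilian_casualties > self.MAX_CIVILIAN_CASUALTIES:
            return False  # veto disproportionate harm to civilians
        if strike.seconds_until_impact < self.MIN_HUMAN_ABORT_WINDOW:
            return False  # veto attacks too fast for human oversight (Sharkey's worry)
        return True


# Example: a strike on an unidentified target is vetoed.
governor = EthicalGovernor()
print(governor.authorize(ProposedStrike(False, 0, 30.0)))  # prints False
```

In this sketch, refusing to fire is the default: the governor can only veto an attack, never initiate one, which is the asymmetry Arkin’s proposal is meant to guarantee.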
1 The Choice To Ban Or Regulate
For now, the US is the focus of criticism, but other countries may find themselves in similar positions in the future. Those who want to ban drones feel that these machines won’t comply with the laws of war. Others argue that we won’t know whom to hold accountable if a machine makes a wrong decision: the robot, the military commander who deployed it, or the designer? Tom Simpson of the University of Oxford proposes that we follow the regulatory model used for medications. His argument is that we can predict the normal effects of a medicine, but in a small percentage of cases, there’s an unpleasant or even lethal side effect. We don’t usually hold doctors or drug companies (and certainly not the medications themselves) responsible for side effects. Instead, regulatory bodies test medications before allowing them to be used by the general public. In the same way, Simpson believes that a regulatory agency should test a robot’s ability to comply with the laws of war before clearing it for deployment. Two law professors, Kenneth Anderson and Matthew Waxman, also believe we should regulate robot warriors sooner rather than later. They contend that robot warriors will advance incrementally, so humans will fade from the command loop slowly; at a certain point, it will be difficult to separate what a machine may legally do from what is prohibited. Another option, suggested by Omar S. Bashir of Princeton, is a review process like those used in the UK and Australia: one individual with the needed legal expertise and moral authority would be appointed to review all information (including classified documents) relating to military robot programs. This person could review videos of disputed attacks and confirm the number of civilians killed in drone strikes without revealing classified information. In the US, Bashir believes, this would make the government’s drone program more acceptable to its own citizens and to foreign allies.