Last week, Dr. Robert J. Lifton, the esteemed psychiatrist and author of many important books, such as The Nazi Doctors, Hiroshima in America, and Destroying the World to Save It, published a long, brilliant essay about the moral ramifications of drones. In “Ten Reflections on Drones” (published in the Huffington Post, April 11), Lifton argues that we had better start grappling with the effects of drones on our lives and get rid of them before they take us to entirely new levels of psychic numbing and global violence.
“I seek to begin a conversation about our relationship as human beings to these robotic objects as weapons,” Lifton writes.
Like many, I too have long been pondering the omnipresence of drones, the Obama administration’s criminal commitment to them, their regular killing of children, and their subtle effect on all our lives. Since my arrest at Creech Air Force Base, the national drone headquarters, and my recent trip to Afghanistan, where I heard many stories from relatives who lost loved ones to our drones, I’m convinced these drones are destroying us too, spiritually. As someone once said, “Those who live by the sword will die by the sword.”
I think Lifton’s ten points about drones should be studied by all those who care for peace. Here is a shortened version of his ten “meditations” on drones and the illusion that they “work”:
1. The [illusory] lure of an intelligent, nonhuman killing machine. “We can give the job of killing to an advanced technological entity, a compelling robotic instrument entirely devoid of feelings, and thereby suppress our own feelings in relation to that killing. This extreme psychic numbing enables us to kill while distancing ourselves from the significance, the meaning, of that killing.”
2. The illusion that we can fight wars without our own people, our soldiers, dying. “The task of convincing the American public of wars’ purposes seems ever more difficult, but still more difficult is the rejection of such meaning, leaving Americans with the unacceptable thought that our soldiers actually did die in vain. No wonder that robotic warfare is embraced as a panacea, an irresistible gift, to those in authority faced with the increasingly troublesome task of establishing meaning for the deaths of Americans. In contrast, it matters little whether or not a destroyed drone died in vain.”
3. Another illusion is that of the drones’ capacity for what is called “targeted” or surgical killing, meaning the dispatching of a particular person and no one else. Targeted killings mean children die. Lifton cites the Stanford Law School report that lists the many civilians killed by U.S. drones in Pakistan, including scores of children. They were not “militants,” as the U.S. government claims. “Targeted killing turns out to be an illusion of absolute technological precision.”
4. A related illusion is that of “humane killing.” “The concept of humane killing is an oxymoron. I would go further and see it as a psychological and political legitimation of killing. The argument of humane killing is based on the false claim of pinpoint targeting resulting in quick painless death. But when interviewers from the Stanford project approached Pakistani witnesses and family members of victims, the story was quite different: multiple people killed, often an entire gathering while sitting together, and grotesque effects such as dismembered corpses and scattered body parts. Ostensibly humane methods are meant to ease the individual and collective conscience of those who carry out or support the killing. The claim of humanity can thus lead to more killing.”
5. Another illusion has to do with ownership of the drone technology. “As with nuclear weapons, we take on the sense that the know-how and the product permanently belong to us. Of course we are aware that other countries have drones — at last count 76 such countries — and that drones, unlike nuclear weapons, are a relatively easy technology to acquire (not only for nation-states but for non-state groups and terrorist organizations). But there is nonetheless a kind of assumption that we will always dominate and control this technology. What does it mean, psychologically and historically, to have such a proprietary attitude toward a technology, especially a technology associated with killing? It has something to do with American exceptionalism, but also with the powerful technological component of the superpower syndrome, the collective fantasy of permanent omnipotence.”
6. Another illusory stance, also associated with a static view of history, is that of ignoring highly negative responses or blowback. “97 percent of Pakistanis oppose our drone policy, as do high percentages of people in other Middle Eastern countries. There is a dynamic of anger and rage, undoubtedly leading to the recruitment of many new anti-American terrorists, all of which could more than offset the ostensible gains in national security from the killing of a few al Qaeda leaders.
“There are certain factors that almost guarantee violent blowback. These include the precariousness of our relationships in the areas in which we have been using drones and the information revolution in which responses in general, and angry ones in particular, are given worldwide dissemination almost instantaneously. An additional source of rage and blowback is the particular kind of humiliation drone attacks bring about. Those on the ground are helpless before this mysterious entity in the sky. Pakistani tribesmen, devoid of the technology or any understanding of it, could neither fire at the object nor throw rocks at it. And there is no emotion more likely to result in violence than that of humiliation. Whether we are talking about individual, group, or national behavior, humiliation is a key to anger, rage, and violence. In our continuing studies of humiliation we need to take account of the impact of higher technologies of killing on people lacking those technologies and the quick and strong motivation in such people to respond in kind.”
7. The illusion of a “rescue technology” that can turn around a failed policy. “Drones have become a cure for the disarray and defeat associated with our doctrine of counterinsurgency warfare. Drones offer an alternative that seems to be free of either of these problems: no ground troops and therefore no civilian atrocities. But in actuality drones not only kill civilians but themselves add new dimensions of atrocity.”
8. Drones raise new questions about the collusion of professionals in killing. “In my work on Nazi doctors I spoke of the combination of professional killers and killing professionals. The latter can consist of lawyers, psychiatrists and psychologists and physicians in general, economists and corporate leaders, to which we must now add various kinds of computer specialists. With drones there is a diminishing distinction between professional killers and killing professionals. Those who operate robotic devices thousands of miles from the killing would seem to fall into both categories. They are in fact a new breed of educated and technologically trained professional killers. We need to look more closely at ethical questions concerning how various groups of professionals are making use of their knowledge and energies on behalf of robotic devices that kill.”
9. The inevitable problem of fallibility. “All technologies have moments of failure, and all are subject to both human error and the extremities of nature. Drones’ killing the wrong people could also be considered a significant failure — and this can occur for reasons of bad intelligence or of miscalculation by operators. What happens when a weapon embraced as a panacea turns out to be a source of grave error?”
10. The ultimate issue of human and nonhuman agency. “There are accounts of varying degrees of loss of human control over the drones. And there are envisioned more and more occurrences in which action would be so rapid as to allow no time for human intervention and the drones would have to make ‘decisions’ on their own. There is even the imagined possibility that drones could turn around and attack either their operators or troops or civilians with whom the operators are associated. It is partly the Frankenstein narrative: the monsters we create can neither be controlled nor got rid of, and ultimately endanger their creators. My point is that we need to have something to say about our software, especially when it threatens to take war-making out of our own fragile enough hands and turn it over to our amoral, by no means always predictable, robotic delegates.”
In conclusion, Lifton writes: “These reflections are meant to be part of a psychological and historical conversation about ourselves in relation to a new technology of killing that is not only itself revolutionary but reveals much about our overall struggles with technology, agency, and control. By pursuing this conversation we might enable ourselves to take wiser steps toward managing and restraining our use of this technology.”
I could add other “meditations” to Lifton’s list: drones violate the ideal of nonviolence; drones mock the God of peace; drones betray the nonviolent Jesus’ commandments of peace; drones destroy a future of peace before we even envision it; drones breed fear; drones harm the environment; drones are too expensive; drones are idols of death… We could go on.
Lifton is right: we must break through our psychic numbing, reclaim our humanity, pursue nonviolent conflict resolution, and rid the planet of these drones. We have to keep organizing, resisting, and speaking out against this new killing technology and what it is doing to all of us.