In a recent New York Times article entitled "The Moral Hazard of Drones," John Kaag and Sarah Kreps raise a critical question about our relationship with our technology. They state, "As in the myth of Gyges, our use of drone warfare confuses our ability to kill without detection with the moral right to do so."
As anthropologists (scientists and humanists), we should recognize from our archaeological and ethnographic archives the moral hazard that this conundrum presents to our species. Do we have an answer to it?
Human beings exist in a superorganic environment created by our unique intelligence, our collective memories, and the technologies these have produced, an environment that has enabled us to become the dominant species on this planet. For a million years our species and its precursors have relied on technology to give us an edge in the evolutionary game of survival of the fittest. But over the past 500 years, our technology and the environment it creates for us have become as much a threat to humanity and its survival as they have been the tools by which the superorganic evolves.
If you question this assumption, think of the millions of human beings who died because the invention of navigational tools and science, along with deep-water sailing ships, broke down the natural barriers between the Old and New Worlds. These brought the Europeans and their diseases to the New World and decimated the peoples who lived here. In the same way, the rapid growth in international air travel today brings peoples from all parts of the world, together with their products, into a one-world system in which humans compete with one another and use advanced technologies to survive.
One might argue that this is the old Luddite argument against the machine and technology, the political argument that favors labor over technology and pits labor against capital. But those are political arguments, easily dismissed by partisan philosophies and ideologies.
The question Kaag and Kreps present is moral, not political, though the solution may require political action. It has to do with the survival of our humanity. If you study the history of weapons technology, from the hand axe one uses to beat prey to death, to today's Predator drone, with which one sits in air-conditioned quarters halfway around the world and destroys a truck (and kills its human occupants), you will find that we become more psychologically desensitized, or disengaged, the further removed we are from the physical act of killing.
To be human is to have empathy. There are only a few species that we know of with this capacity. There is only one species so physically generalized that it has adopted technology as its means of adapting to environmental change. The further removed our technology makes us from the consequences of our actions, the less empathy we have for the victim. As the technology of killing becomes more sophisticated and efficient, the whole process becomes more mechanical and dehumanizing.
The question is a practical, pragmatic one. When killing is a job, there is a loss of one's humanity, a loss of empathy. Terrorism succeeds only as long as we put a human face on the deaths and suffering the terrorist can produce. The whole of the United States' military technology has not been able to stop terrorist attacks that use technologies such as IEDs or a human bomb. Our greatest fear is that a terrorist will obtain and successfully deploy a weapon of mass destruction in a major population center. How do we control this without losing our own humanity?
Where is the responsibility for the moral hazard that this technology imposes on our species?