Last week, President Bush agreed to legislation banning cruel, inhumane, and degrading treatment of prisoners in US custody. His U-turn ended months of debate about the ethics of torture. Now come revelations that the White House approved eavesdropping on American citizens inside the United States without court-ordered warrants.
These two information-gathering methods are markedly different: One inflicts pain while the other invades privacy. But each has credible national security arguments in its favor. Each raises profound human rights objections. And each threatens to compromise the nation’s moral authority abroad.
But there’s another issue, largely unexplored, that speaks to a deeper concern: the effect of such activity on the perpetrators. What happens to those we ask to carry out these actions?
Stanley Milgram’s famous obedience experiments at Yale in the 1960s shed light here. He recruited individuals to test (so they were told) the role of punishment in promoting learning. They were asked to follow an experimenter’s orders and administer increasingly powerful electric shocks – up to 450 volts – whenever a “learner” gave the wrong answer.
Unknown to those giving the shocks, the experiment was a fake. The experimenter in a gray lab coat was a plant. The learner was an actor mimicking anguish. No shocks were ever administered. The point was to see how long the recruits would persist in torturing their victim (though Mr. Milgram didn’t use those words) in obedience to authority. The sobering answer: a very long time.
One of Milgram’s recruits, William Menold, had just been discharged from combat duty in the US Army. Growing increasingly concerned for the learner he was zapping, he complained to the experimenter, who told him to carry on and that he would accept full responsibility. Mr. Menold recalls that he then “completely lost it, my reasoning power,” and became fully obedient. Milgram’s biographer, Thomas Blass, notes that Menold “described himself as an ‘emotional wreck’ and a ‘basket case’ during the experiment and after he left the lab, realizing ‘that somebody could get me to do that stuff.’”
Would Menold have been so emotionally battered if the experiment had involved wiretaps rather than shocks? Hardly. But the two activities are on the same scale, if at different ends. If anywhere along that scale our nation makes citizens “do that stuff” to others, are we dehumanizing those who do it? In taking advantage of undefended victims, are we degrading our own personal integrity?
Those aren’t idle questions. Personal integrity isn’t achieved through inoculation. It’s a process. Rooted in core ethical values, it shapes itself, decision by decision, across a lifetime. It depends on consistency, continuity, and repetition. Each lapse makes the next one easier.
If that’s true for individuals, it’s also true for organizations and nations. When an individual merges unthinkingly “into an organizational structure,” warned Milgram, “a new creature replaces autonomous man, unhindered by the limitations of individual morality, freed of humane inhibition, mindful only of the sanctions of authority.”
Government agencies, especially those defending the nation through espionage and military action, depend on personal integrity. Yet they create these “sanctions of authority.” They can even require unethical actions. When they do, however, they risk creating in the perpetrators either an anguished guilt or an amoral numbness. A convicted Watergate-related figure, Egil “Bud” Krogh, recalls what it was like to sacrifice conscience for what he saw as President Nixon’s unquestionable authority. Whenever you do something like that, he says poignantly, “a little bit of your soul slips through your fingers.”
That’s not what democracy is about. None of us wants our public servants turned into pliant emotional wrecks. And none of us wants the nation cast in the role of the gray-coated Grand Experimenter, calmly overriding individual ethics in the name of collective expediency. With the torture debate over for now, it’s time to begin the conversation on the broader differences between moral and immoral authority.
Rushworth M. Kidder is president of the Institute for Global Ethics in Camden, Maine, and the author of “Moral Courage.”