Milgram

Resistance is utile

Radiolab produces such high-quality shows that I just assume everyone listens to it or downloads the podcast. If you don't, I'd like to nominate it as an achievable New Year's resolution.

They've been revisiting some of their greatest hits, and I highly recommend this rebroadcast of "The Bad Show." One of the pieces delves into the Milgram experiments on compliance, in which unwitting participants were persuaded to deliver what they believed were potentially lethal shocks to another participant (in reality, an actor). An astonishing percentage allowed themselves to be pushed into an apparently immoral act at the urging of a white-coated researcher. Their compliance is often touted as proof that human beings are easily swayed to evil.

The section about the experiments starts at 13:00. What makes Radiolab great is that they don't just accept the standard interpretation of the results. They interview Professor Alex Haslam, who thinks we've totally misjudged Milgram's work.

Milgram ran many additional experiments to see what conditions would reduce the chances that participants would deliver the shock. When they could see the subject, fewer people delivered it. When they had to hold the subject's hand to the shock plate, even fewer did.

You know what reduced the compliance rate to zero? The white-coated experimenter telling the participant that they had "no other choice" but to continue.

No one who was told they had no choice went through with the shocks. 

What a powerful thing it is to be reminded that we have a choice in our actions. It's a form of mindfulness, one that is easy to lose as you walk through a beseeching world, cajoled by the endless figures who benefit from your compliance. Even between the sparring voices you hear when alone in the quiet, you have a choice. That's useful information in a season when many of us consider how we might strive to be a little better.

Not health but healing

The fall movie I've been anticipating the most isn't the film about a man stranded across the gulf of space, but the one about the gulf between the moral people we wish we were and the reality of human behavior. That's a terrible sentence, but I'm leaving it there because it illustrates the gulf between the writer I'd like to be and the one I am. 

I'm talking about Experimenter, a film about the research of Stanley Milgram, the social scientist famous for inducing his test subjects to administer what they thought were severe electrical shocks to a human subject, merely through some mild social pressure from an authority figure. If you haven't seen the actual films of the experiments, they are pretty amazing.

Most people, when told about this experiment, assert that they would have been among the few who resisted the pressure to continue an increasingly inhumane and dangerous experiment and did the right thing by stopping. Most of those people are wrong. Their (our) blindness to our own moral pliability makes us more likely to be manipulated into immoral acts. In "You're Not as Virtuous as You Think," Nitin Nohria argues for "moral humility," in which we recognize our susceptibility to moral influence as a first step toward becoming more decent human beings.