It would be nice if we always knew the morally right thing to do, if our choices and commitments were painted in stark black and white.
One thing I do for Philosophy Talk as a member of the Crack Research Team is pre-interview each guest. The point is to give the guest an idea of the structure of the upcoming show—the “story arc”—and to make notes on the guest’s views to pass on to John and Ken. Last week I spoke with Walter Sinnott-Armstrong about moral dilemmas. At the end of our conversation he said something that dovetails with thoughts I’ve had concerning my own specialty in philosophy of mind: self-deception.
Sinnott-Armstrong said that the person who faces a moral dilemma has an obligation to compensate, or at least to minimize damage, with respect to whichever moral requirement she ultimately breaks. In other words, in a moral dilemma you’ll necessarily break one requirement or the other, but since it’s a moral requirement you’re breaking, you should try to do as little damage as possible in breaking it. The Ethiopian mother who must leave one child behind on the trip to the aid station, lacking the strength to carry both, should tell the child she leaves that she loves him and is sorry. And Sartre’s student, if he joins the French Resistance, should ensure that his mother is as well cared for as possible. This is Sinnott-Armstrong’s point; I’ll take it as given in what follows.
Self-deception is a belief state that humans enter into as a result of desire. It’s motivated irrationality. The abused wife in denial, for example, wants it to be true that her husband won’t beat her again, and this desire engenders the belief that he won’t. She’s not unintelligent; she’s self-deceived. Likewise, the college dropout clings fiercely to the belief that finishing his education isn’t necessary for good employment prospects. He wants that to be true, and that wanting causes a self-deceptive breakdown of his better standards of judgment.
The two examples I just gave suggest that self-deception is to be avoided. There are many, however, who would resist this conclusion. One prominent ethicist, whose name I won’t mention, speculated once in conversation, “Maybe it’s a good thing we deceive ourselves.”
Here I want to push the view that self-deception has morally negative consequences. I’m going against a line of reasoning to the contrary that relates specifically to moral dilemmas. One might say: “Well, in a moral dilemma you’re bound to do at least one bad thing, since you can’t meet both moral requirements. Since that’s inescapable, maybe it’s good to be self-deceived about the moral obligation you’re breaking. That would alleviate the psychological pain associated with breaking that moral requirement.” I think people are tempted by this kind of thinking often; that’s one reason why we’re less on guard against self-deception than we might be.
But that line of reasoning is dead wrong; Sinnott-Armstrong’s point shows us why. Let’s put aside the question of whether self-deception actually does minimize psychological pain. (I think it doesn’t, since it prolongs the healing process.) The problem with being self-deceived in the context of a moral dilemma is that you’ll be blind to your obligation to compensate and minimize damage. If you’re blind to it, you probably won’t do it. That’s bad.
Here’s the rub. There’s growing support in the philosophical community for the view that self-deception is not intentional—at least not most of the time. We slide into it, rather than consciously deciding to do it. But that means we can’t simply decide not to do it, either. At best we can make good-faith efforts to be the kind of reflective people who rethink the evidence and try to avoid bias in any form. In short, we can’t turn self-deception on and off like a switch. Instead, we have to make a higher-order decision about what kind of cognizers to be in general: ones who let self-deception pass or ones who guard against it.
You could argue that there are specific cases where self-deception turns out to be a good thing. I’m skeptical. But even granting such cases, remember that the real choice is whether to have the kind of mind that’s prone to self-deception or the kind of mind that isn’t. Given the obligation to compensate in the context of moral dilemmas, I think it’s better to have the kind of mind that isn’t. That will take epistemic courage. But that’s no surprise; being moral usually requires courage.