We like to think of ourselves as self-aware, reflective beings, but psychological studies demonstrate that we’re usually overconfident in the accuracy of our own beliefs.
This week we're thinking about Second-Guessing. Some people avoid second-guessing themselves on principle. It’s like Ser Alliser says in Game of Thrones: “Leadership is all about getting second-guessed by every clever little twat with a mouth. But if a leader starts second-guessing [himself], that's it. That's the end."
But even leaders sometimes make rash judgments and uninformed choices. It's not hard to argue that we would all have been better off if Bush and company had done a little second-guessing of themselves when it came to the decision to invade Iraq. Of course, who's to say they would have gotten it more right the second time around? Besides, if you start down that path, you may soon be second-guessing your second guesses. And who knows where that could lead.
In any event, I'm not suggesting that you should never stick to your beliefs and decisions. But that doesn't mean you should always stick to them either. It's not an all-or-nothing thing. Wouldn't it be great if there were some nice, neat formula for deciding when it's a good thing to second-guess yourself? I doubt there could be such a formula, but of course maybe I should reconsider those doubts. (Ha!)
Take the following example. Suppose you catch a glimpse of a person in the distance at a restaurant and think it might be me. Suppose you're pretty sure, but not 100 percent certain. Maybe you're 70 percent sure, just to give it a number. Later on I tell you that it wasn't me that you saw. Should you just stick to your original belief? Isn't it reasonable for you to reduce your confidence -- maybe all the way down to zero -- that it was me you actually saw?
Of course, for all you know I could be lying to you. Maybe you saw me at some very intimate dinner, staring longingly into the eyes of a very attractive woman who didn't look all that much like my wife, so I'm just trying to distract you. My point is that sometimes our confidence in our beliefs can be less than complete. When we get new evidence, we should be willing to change our degree of belief -- up or down -- in light of it. If I told you, for example, that it really was me that you saw, wouldn't your confidence shoot all the way up to 100 percent?
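If you treat your 70 percent as a prior and my denial as evidence, Bayes' rule gives one way to sketch the update. The reliability numbers below are made up purely for illustration -- they just stand in for how likely I'd be to deny it was me in each case.

```python
def bayes_update(prior, p_report_if_true, p_report_if_false):
    """Posterior probability of the hypothesis after hearing the report."""
    numerator = prior * p_report_if_true
    return numerator / (numerator + (1 - prior) * p_report_if_false)

# You start out 70% sure you saw me; I then report that it wasn't me.
# Assumed, illustrative numbers: if it WAS me, I'd deny it 10% of the
# time (lying to distract you); if it WASN'T me, I'd deny it 95% of
# the time (honestly).
posterior = bayes_update(prior=0.70, p_report_if_true=0.10, p_report_if_false=0.95)
print(round(posterior, 2))  # confidence falls from 0.70 to about 0.20
```

Notice that because I might be lying, your confidence shouldn't drop all the way to zero -- only as far as your trust in my report warrants.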
But now we’re not really talking about second-guessing ourselves, we’re talking about reconsidering old beliefs in light of new evidence pro or con. That's just what rational people do. What rational people don’t do and shouldn't do is go around second-guessing themselves on the basis of no real evidence whatsoever. Once the evidence is in, and you’ve made up your mind, stick to your guns. Don’t go changing your mind just because you think you might possibly be wrong.
But what if the belief that you might possibly be wrong is itself based on evidence -- not on evidence about whether I was in that restaurant or not, but on evidence about your own degree of reliability? For example, suppose you found out that your glasses had been tinkered with and that they now had a tendency to distort your vision -- a little far-fetched, maybe, but it makes the point. If you found out that your vision was being distorted, wouldn't it make sense to change your degree of confidence in your belief that it was me you saw -- even though you got no new direct evidence as to whether I was there or not?
Of course, people are almost always subject to some sort of distorting influences when they draw conclusions or make decisions. If we let our confidence be undermined by that fact alone, it leads to self-doubt, skepticism, and inaction. Does that paint a stubborn refusal to reconsider as an epistemic virtue? Or does it praise a spineless indecisiveness as one? Tune in to find out what our guest, Sherri Roush (now at King's College London), author of Tracking Truth: Knowledge, Evidence, and Science, thinks about how best to go about evaluating our self-assessments.