When we make decisions we think we're in control, making rational choices. But are we?
Our topic this week is the irrationality of human decision making. As philosophers, I’m sure that John and I would like to believe that we make decisions in a perfectly rational way. Indeed, I’m sure that most people think of themselves as pretty rational decision makers. How would thoroughly rational decision making go? Well, first, you’d decide what things you want, and how much you really want them. Second, you’d survey your options for getting what you want. Third, you would assess the upside benefits and downside costs of each alternative. And last but certainly not least, you’d choose the alternative that has either the greatest upside or the least downside, depending on whether you were risk-averse or risk seeking. It’s pretty simple really.
Decades of psychological research have shown, though, that although philosophers may be paragons of rationality -- ahem, ahem -- in fact most people (and probably most philosophers too) are pretty irrational in their decision-making. People go wrong at every turn. We aren’t so good at figuring out what we want. Our preferences aren’t very stable or coherent. We’re bad at assessing risks and rewards. You name it: when it comes to decision making, we’re bad at it.
Here’s a little game you can play with a partner that helps illustrate how irrational we can be. Let’s call it Sellers and Choosers. If you’re reading this alone and you want to play along, go get a partner now and let’s play the game together. I’ll be the referee. I’ve got two mugs – one for you, one for your partner. The mugs are exactly alike. I’m just going to flat out give you one of the mugs. (I can’t really do that over the internet just yet. But use your imagination and play along.) Anyway, the mug is yours to keep. It’s a really beautiful mug and very well made. Or, if you like, you can sell it. No doubt you’d be willing to sell the mug for the right price. So go ahead, write down the price at which you’d be willing to sell your lovely little mug.
Now, as for your partner: I’m going to offer her (or him) a choice. I’m not going to flat out give her the identical mug; she has to choose between an identical mug and a sum of money. How much money, you ask? Well, I’ve written an amount of money on the bottom of the mug. She doesn’t get to see it. Instead, what she has to do is write down an amount of money such that, if she had a choice between the mug and the money, the choice between the two would be a wash. She gets the mug only if the price she writes down as a fair price for the mug is higher than the price I’ve written on the bottom of the mug.
You may be wondering where I’m going with this and what it has to do with irrational decision-making. Don’t worry, the punch line is about to come. Here’s the thing: suppose we run this little experiment thousands of times and put people in different roles – sometimes in the role of Seller and sometimes in the role of Chooser. You know what we find? We find that people in the role of the seller place a significantly higher price – like more than twice the price – on the mug than people in the role of the chooser do. What that means is that if the mug is already yours (and you have to set a sell price), you’ll think it’s worth a lot more than a similar mug that isn’t yet yours (on which you have to place a “willing to purchase it” price).
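If it helps to see the rules of the game laid out precisely, here is a minimal sketch in Python. Every dollar amount in it is invented purely for illustration; none of these figures come from the actual experiments.

```python
# Toy version of the Sellers and Choosers game described above.
# All prices here are made-up numbers, used only to illustrate the rules.

def chooser_outcome(chooser_price: float, hidden_price: float) -> str:
    """The Chooser gets the mug only if the price she names as a fair
    trade is higher than the price written on the bottom of the mug;
    otherwise she walks away with the money."""
    return "mug" if chooser_price > hidden_price else "money"

# Hypothetical round: the referee secretly wrote $4.00 on the mug.
print(chooser_outcome(chooser_price=3.50, hidden_price=4.00))  # prints "money"
print(chooser_outcome(chooser_price=7.25, hidden_price=4.00))  # prints "mug"

# Stylized version of the reported pattern: sellers' asking prices run
# to more than twice choosers' fair-trade prices (numbers invented).
seller_prices = [7.00, 7.50, 6.75]
chooser_prices = [3.25, 3.60, 3.10]
avg = lambda xs: sum(xs) / len(xs)
print(avg(seller_prices) / avg(chooser_prices) > 2)  # prints "True"
```

The point of the toy numbers is just to make the asymmetry vivid: the same mug, priced from two different seats at the table, ends up with two very different valuations.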
One way to think of this is as an instance of loss aversion. You’ve got your precious mug in hand and you don’t want to lose it. It means a lot to you. And so you set a very high price on it. That is, people tend to value things they already have and might lose much more highly than things they don’t have but could get.
That seems pretty irrational, doesn’t it? Go back to what I was saying earlier about calculating upside benefits and downside costs. It looks like those calculations are highly skewed, depending on whether we’re talking about gains or losses. That doesn’t make any sense.
We’ve looked at just one tiny little example of apparent human irrationality. There are literally hundreds of experiments demonstrating that people are massively irrational in the way we make decisions. And luckily for us, we’ve got one of the world’s leading investigators of human irrationality as our guest this week: Dan Ariely, author of the bestselling Predictably Irrational: The Hidden Forces That Shape Our Decisions.
By the way, Ariely has a follow-up book out – The Upside of Irrationality: The Unexpected Benefits of Defying Logic at Work and at Home. We’d love to have him back on the show to talk about the new book. This week’s episode, though, is less about the upside of irrationality than the downside. But I think one can get a glimmer of how irrationality might have an upside by considering last week’s topic – loyalty. From the standpoint of pure self-centered cost-benefit analysis, it can be hard to make sense of loyalty. You might even call loyalty a form of irrationality. But without loyalty (and trust) all kinds of relationships wouldn’t be possible. So if loyalty is a form of irrationality, it may be a darned good thing that we are irrational in that way. But that’s a topic for another show.