You've probably heard about the dangerous effects of fake news and the spread of sensational, targeted falsehoods. But one might still ask: what about "legitimate" news?
What makes people susceptible to fake news? This question is part of a larger psychological inquiry into how people process political information. And a wide-ranging debate has emerged with two camps.
First, there’s the Reason Pessimists camp. The Pessimists hold that, in general, people who are more prolific reasoners will show more political bias in processing information. Their idea is that everyone is biased, and people who engage in extensive thinking will use their intelligence and knowledge to justify even more skewed positions than they otherwise would hold. Reasoning, on this view, is mostly rationalization. I gave you a glimpse of Reason Pessimism when I discussed Dan Kahan’s Cultural Cognition Thesis about climate change denial. Kahan claims that not only does greater knowledge not predict climate change acceptance among conservatives, it is actually negatively correlated with acceptance. And Kahan points to motivated reasoning to explain this: he thinks the deniers’ “reasoning” is guided by cultural motivations, so more “reasoning” means more distortion. Another example of Pessimism is Hugo Mercier and Dan Sperber’s view of reason in general. They think the human ability to reason evolved not for reaching truth but for enabling us to argumentatively persuade other people of things, whether they’re true or not. Reason, on this view, is like a partisan lawyer (and not much else).
Second, there’s the Reason Optimists camp. The Optimists hold that, in general, people who are more prolific reasoners will show less political bias. Their idea is that, though everyone has biases, reasoning will help people to work through inconsistencies in their positions, seek out relevant facts, and be open to persuasion by evidence. The Optimist who disagrees most prominently with Kahan about climate change psychology is Michael Ranney, whose work I also discussed in relation to Kahan’s. His studies show that informing people about the mechanisms of climate change increases acceptance. Ranney is in the Reason Optimists camp, because his view implies that people with appropriate knowledge will be more likely to reason their way to the correct position (acceptance). Optimists also counter Mercier and Sperber by noting that reasoning often helps us avoid disasters (for example, “Well, if A and B both got sick and the only thing they ate in common was that weird-looking plant, then maybe…”). So reasoning can be more than rationalization.
(Side note for anyone who likes paradoxes of self-reference. An Optimist might also diagonalize against the Reason Pessimists: they seem to have arrived at their negative view of reason through reasoning; so if their negative view of reasoning is right, then we shouldn’t take their arguments seriously! Alternately, if the Pessimists’ reasoning toward their position is solid, that proves that their negative view of reason is not true in a general way, since their own reasoning is a counterexample.)
In the debate about climate change denial, I think the Reason Optimists (like Ranney) have a stronger case, but the Reason Pessimists (like Kahan) get more press. But there’s more work to be done as that debate plays out, so I won’t say more on it at this point.
Furthermore, there are other fronts on which moves are being made by the members of the two camps. And this brings us to our question about fake news. (There are, of course, different ways the phrase “fake news” gets used. The current POTUS uses it to mean “any journalism I don’t like.” But there is a more legitimate meaning, namely: false but sensational-seeming information that’s packaged to look like a genuine news report. I have the latter sense in mind.)
So one important question, which emerges in light of the debate between the two camps, is this: does reasoning make people more or less susceptible to fake news?
A Pessimist would say “more,” but an Optimist would say “less.” Who’s right?
On this very front, the psychologists Gordon Pennycook and David Rand recently published an interesting study, one that looks good for Optimists: “Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning.”
Their main experimental approach was to give people a Cognitive Reflection Test (CRT) and see if it predicted the extent to which people would fall for fake news (their stimuli included gems like this: “BLM Thug Protests President Trump With Selfie… Accidentally Shoots Himself in The Face. -Freedom Daily”).
The CRT tests how prone people are to overriding their initial intuitions and thinking further about a given problem or issue. Here are two items from a typical CRT.
- If it takes 5 machines 5 minutes to make 5 widgets, how long does it take 100 machines to make 100 widgets?
- A bat and a ball cost $1.10. The bat costs a dollar more than the ball. How much does the ball cost?
The answers are 100 minutes for the first and 10 cents for the second, right? Wrong! The correct answers are 5 minutes and 5 cents (work it out). What’s interesting is that everyone who sees these has the same incorrect intuitive answers that come to mind. But people who are more disposed to analytic thinking are more likely to question those intuitive answers. So the CRT measures how much people are disposed (and able) to engage in reflective thinking—or reasoning.
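If you'd rather check than work it out by hand, here's a minimal arithmetic sketch (in Python, with variable names of my own choosing) verifying both reflective answers:

```python
# Bat and ball: ball + (ball + 1.00) = 1.10, so 2 * ball = 0.10.
ball = 0.10 / 2           # 0.05, i.e., 5 cents
bat = ball + 1.00         # 1.05
assert abs((bat + ball) - 1.10) < 1e-9   # together they cost $1.10
assert abs((bat - ball) - 1.00) < 1e-9   # bat costs a dollar more

# Widgets: 5 machines make 5 widgets in 5 minutes,
# so one machine makes one widget every 5 minutes.
# 100 machines working in parallel make 100 widgets in those same 5 minutes.
minutes_per_widget_per_machine = 5
machines, widgets = 100, 100
time_needed = (widgets / machines) * minutes_per_widget_per_machine
assert time_needed == 5.0
```

The intuitive answers (10 cents, 100 minutes) fail exactly these checks, which is the point of the test: the error is easy to catch once you bother to check.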
This all gives rise to two opposite predictions. A Pessimist would predict that people who score higher in analytic thinking would be more likely to accept fake news (if it’s partisan in their direction), since that extra thinking only goes to rationalizing what supports their antecedent political position. But the Optimist predicts that people who score higher in analytic thinking will be less likely to accept fake news (even if it’s partisan in their direction), since the extra thinking helps them detect error.
This round is a win for the Optimists. Pennycook and Rand found that those who perform better on the CRT rate fake news as less accurate than poor performers do. In other words, people who do more thinking are less likely to be suckers. Importantly, this held even for fake news that supported one’s own political position. So, for example, someone who was pro-Clinton and did well on the CRT was less likely to be suckered by pro-Clinton fake news than someone who was pro-Clinton and did poorly on the CRT (and similar, mutatis mutandis, for the pro-Trump camp).
None of this implies that analytic thinkers are perfect at filtering out fake news. Far from it! Furthermore, there was a clear effect that showed people of all sorts were more vulnerable to fake news in their partisan direction than fake news on the other side. But the point is that people who were more analytic were less vulnerable than those who were less analytic. And that suggests Optimism about the value of reason.
From the practical perspective, what this also suggests is that laziness is the enemy. People usually fail to detect fake news not because their reasoning is biased, but because they simply don't reason at all. Note, though, that the laziness itself is biased: we're prone to subject news favorable to the opposing party to intense rational scrutiny, and then get lazy about scrutinizing information that flatters our own positions. Otherwise put, as David Dunning noted in a recent lecture on fake news at the Association for Psychological Science, skepticism of the other side is not the main problem; rather, it's uncritical acceptance of media that are favorable to one's own side. So we should scrutinize information that's favorable to our own political leanings more thoroughly.
Analytic thinkers do this more than other people as a matter of course. But I think the Optimist’s recommendation to reason more should go for everyone—at least when it comes to fighting fake news.