Say it Enough, They’ll Believe It

20 November 2020

If you repeat something, people are more likely to believe it.

 

This isn’t speculation, and it’s not just some old saying. It’s a real phenomenon, well established and extensively studied in cognitive science, known as the illusory truth effect: hearing or reading a claim, especially repeatedly, makes you more likely to think it’s true. It was first documented in a 1977 publication by Lynn Hasher, David Goldstein, and Thomas Toppino. Since that groundbreaking work, a whole host of further studies have replicated the effect: repetition increases the apparent truth of fake news headlines, it works on both plausible and implausible claims, and it appears whether the claims come from reliable or unreliable sources.

 

If you’re hoping that greater knowledge or intelligence would counteract this effect, you’ll be disappointed: a 2015 paper found just as much shift towards believing repeated statements that were known to be false, and a 2020 paper found that the effect isn’t significantly changed by cognitive ability. That’s not to say that nothing can dispel the illusion of truth you gain from hearing something again and again. If you tell people that a statement is false immediately after they first hear or read it, the effect is lessened. And if you repeat a false claim too much, people might see the repetition itself as an attempt to persuade them, and then be less likely to believe the claim you’re ‘selling.’ Still, the only condition that seems to completely nullify the illusory truth effect is extreme and obvious implausibility: participants in one study weren’t any more likely to believe statements like “the Earth is a perfect square” after repeated exposure to them. 

 

Why does repeating something make people more likely to believe it? Merely hearing or reading a statement doesn’t have a very consistent relationship with its truth. There are plenty of false statements out there—not just in psychological experiments, but also on propaganda websites, in advertisements, and in the little (even ‘white’) lies people tell you about yourself and about themselves every day. We live in an informationally rich environment, but one in which a lot of incoming ‘information’ actually misinforms. We know to be on our guard against this a lot of the time. We know that a lot of what we hear is false. But then why does merely hearing something make us feel it’s more likely to be true—even if we know it to be false? 

 

Most psychologists think the illusory truth effect reflects “processing fluency.” In other words, people tend to believe statements more readily if those statements are easy to process. One way something gets easier to process is repeated exposure: if you’re really familiar with some statement, it’s easier to process it when you hear it again. So if you hear something repeatedly, you’ll be more likely to believe it. And even if you don’t get all the way to believing it, you’ll rate its likelihood of being true more highly. 

 

But here’s where unification and explanation can diverge. It’s one thing just to unify the illusory truth effect with processing fluency effects, thereby placing it in the context of a broader phenomenon. It’s another to explain why the illusory truth effect takes place at all. Even once we’ve unified the illusory truth effect with other processing fluency effects, we can request an explanation: why on earth should increased processing fluency affect belief in this way?

 

We could try another tack. We could try making sense of the illusory truth effect by placing it alongside other heuristics that affect human reasoning. For instance: as per the availability heuristic, you’ll judge some potential event as more likely to happen if the event is one that readily comes to mind. This and many other such heuristics were brought together in 1974 by Amos Tversky and Daniel Kahneman, and later summarized by Kahneman in his 2011 book Thinking, Fast and Slow.

 

Again, though, unification is not explanation. We can ask why we use any of these heuristics in the first place. The explanations given usually refer to our current environment, or an environment in which humans lived during our species’ evolutionary history. In the first kind of explanation, you could say (as Tversky and Kahneman did) that the heuristic usually leads to good effects in our environment, and these benefits are significant enough to outweigh any potential negative effects. In the second kind of explanation, you could say that the same pattern appeared in environments that adaptively shaped our thought.

 

But each of these explanations is dissatisfying too. It just doesn’t seem plausible that our current environment favors believing things that are repeated, given how much misinformation is around to mislead us into buying certain things and voting for certain people. As for the environments that shaped our species: it’s really difficult to confirm any hypothesis about these environments, especially such abstract ones that involve tradeoffs between believing repeated statements and gains in other forms of efficiency. (Hypotheses about our evolutionary history that are easier to confirm often involve concrete physical details, like the types of food our ancestors ate. Even so, we’re still confined to studying the bodies of a small handful of long-preserved dead ancestors.)

 

I’m still puzzled by the illusory truth effect. Belief just doesn’t seem to be the kind of thing that can be put in place by mere exposure. How on earth could a state even count as a genuine state of belief, if you can get into that state by mere exposure to a claim? That’s how strange it is to think that belief could be created by repeated hearing or reading: I might sooner give up the idea that the state in question is a belief at all than accept that it can be created thus.

 

So I don’t have a solution to the puzzle that the illusory truth effect has raised. What I do have is persisting puzzlement. Let’s not jump to quick, shaky conclusions to try to eliminate that puzzlement. Sometimes it’s worth sitting with the strangeness of something for some time.

 

Photo by Brian Wertheim on Unsplash

Comments (3)



MJA

Sunday, November 22, 2020 -- 9:49 AM


True or false, right or wrong, we are what we are taught.
The power of education.
There is a foundation of truth, but throughout the history of both western and eastern philosophy it has yet to be grasped or shared.
The day will come soon when mankind will build its castle of philosophy, physics, justice, religion, and democracy upon it, and the truth will set us free. Free at last! =


Josh Landy

Tuesday, November 24, 2020 -- 11:52 AM


Great post, Antonia!! I wonder whether part of the mechanism couldn't be related to our frequent inability to keep track of sources? If we hear the same statement ten times from the same person, we may misremember that as having heard it from 10 people. And if "many people" are saying something, to quote a famous quasi-politician, then, our foolish brains tell us, it must be true...

To be fair, I don't have a good evolutionary explanation to offer for why our brains aren't great at keeping track of sources. But then, I tend not to assume that a particular current trait was directly selected for. Knees, spines, tracheas, teeth—in many ways, humans are pretty lousily designed. Spandrels, byproducts, and vestiges, oh my! :)


Tim Smith

Friday, November 27, 2020 -- 4:00 PM


Illusory truth has the ability to warp our world. That ability brought us Obama and Trump, who both mastered social media, repeating untruths to get elected.

Cookie- and cloud-based tracking and marketing create illusory consumerism as well.

Strange for sure, but also weirdly comforting. Who is not comfortable with their politics or mind... no one. That is the real illusion.

Thanks, Antonia, for a good read. I didn't know about this early work or the nomenclature (I don't think Kahneman used that...?). I recently read 'How Behavior Spreads' by Damon Centola. This post resonates strongly with that text, but again it doesn't reference the 'illusory truth effect' by name. Kahneman is worth another read, as are the links you provide.