The Research Origin

The illusory truth effect was first formally documented by Hasher, Goldstein, and Toppino in a 1977 study published in the Journal of Verbal Learning and Verbal Behavior. Participants were presented with a series of plausible statements and asked to rate their confidence in each statement's truth. Two weeks later, the same participants saw a new set of statements, some of which had appeared in the first session, some of which were new. The repeated statements were consistently rated as more likely to be true, even though no new evidence had been provided and two weeks had elapsed.

The effect is driven by what psychologists call "processing fluency": the ease with which the brain processes a stimulus. Repeated exposure makes information easier to process. The brain interprets ease of processing as a signal of familiarity, and familiarity as a signal of truth. This is a reasonable heuristic in a world where repeated exposure usually means an idea has been encountered in multiple reliable contexts. It becomes a vulnerability when the repetition is engineered.

Further reading: APA Dictionary of Psychology

Why Knowledge Doesn't Protect You

The more troubling finding from subsequent research is that the illusory truth effect persists even when participants know the repeated statement is false. In a 2015 study by Lisa Fazio and colleagues at Vanderbilt University, participants who correctly identified a statement as false in one session still rated it as more true after repeated exposure in a subsequent session. Prior knowledge does not provide reliable immunity.

This finding has significant implications. It means that fact-checking, the primary institutional response to misinformation, may have limited effectiveness when false claims are repeated sufficiently often. The correction operates through the conscious, deliberate system of belief evaluation. The illusory truth effect operates through the automatic, processing-fluency system. The automatic system is faster and, under the conditions of ordinary information consumption, more influential.

"If you tell a lie big enough and keep repeating it, people will eventually come to believe it." The mechanism behind this observation is not political theory; it is documented cognitive psychology, operating in all populations regardless of the content of the lie.

Political and Commercial Applications

Political communication professionals were operating on the principle of repetition long before the cognitive science was documented. Frank Luntz, the Republican pollster and communication strategist, has written extensively about the importance of consistent message repetition. His firm's research demonstrated that particular word choices, repeated consistently across candidates, media appearances, and advertising, shifted public perception on policy issues, not by presenting new arguments, but by making certain framings feel natural and self-evident through familiarity.

The shift from "estate tax" to "death tax" is a documented example. The estate tax, a tax on inherited wealth above a significant threshold, affected fewer than 2% of estates at the time of its political salience in the early 2000s. Polling under the "estate tax" label showed limited opposition. Polling under the "death tax" label showed substantially higher opposition, concentrated among people who would never be subject to it. The policy had not changed. The language had. The repetition of the new framing across conservative media and political communication shifted public perception within a single election cycle.

Commercial advertising operates on the same principle. Brand recall studies consistently show that repeated exposure to a brand name increases preference for that brand independent of any product information. The "mere exposure effect," documented by Robert Zajonc in the 1960s and closely related to the illusory truth effect, shows that familiarity alone generates positive affect. This is why advertising budgets favor reach and frequency over message quality in many categories: the goal is not to persuade through argument but to establish familiarity through repetition.

The Correction Problem

Research on belief correction reveals a documented pattern that complicates the fact-checking enterprise. Brendan Nyhan and Jason Reifler identified the "backfire effect" in 2010, a phenomenon where correcting a false belief with factual counter-evidence sometimes strengthens rather than weakens the false belief, particularly when the belief is tied to identity or prior commitments. While subsequent research has found the backfire effect to be less reliable than initially reported, the broader finding holds: corrections are less effective than initial exposures, and the gap widens with repetition.

The practical implication is that in any information environment where false claims are repeated with higher frequency than corrections, the false claims will tend to win, not because they are more convincing, but because they are more familiar. Frequency of repetition is a competitive advantage in belief formation that operates largely independently of truth value.

The Effect in Action: Recognition Signals

  • You find yourself believing something you cannot trace to evidence, only to repeated exposure
  • The same phrase or framing appears identically across multiple media sources within a short period
  • A claim feels "obviously true" without a clear memory of when or why you came to believe it
  • Corrections to a belief you hold shift your certainty less than the sheer frequency of the original claim would predict
  • You notice that unfamiliar but accurate information feels less credible than familiar but inaccurate information

What You Can Do

Source tracking is the most accessible defense: when you hold a belief, ask when you first encountered it and how often you have encountered it since. If the answer is "many times" and the source is difficult to trace to original evidence, the belief may be held on the basis of fluency rather than substance. This is uncomfortable, because it requires treating your own confident beliefs as potentially suspect, but it is the only cognitive operation that can interrupt the automaticity of the effect.

Deliberate exposure to well-argued contrary views can also recalibrate fluency-based beliefs, though the effect is slow and requires sustained effort. The faster intervention is simply slowing consumption of any single media environment. Illusory truth is a frequency effect. Reducing frequency reduces its power.
