What happens when a lie hits your brain? The now-standard model was first
proposed by Harvard University psychologist Daniel Gilbert more than 20 years ago. Gilbert argues that people see the world in two steps. First, even just briefly, we hold the lie as true: We must accept something in order to understand it. For instance, if someone were to tell us—
hypothetically, of course—that there had been serious voter fraud in Virginia during the presidential election, we must for a fraction of a second accept that fraud did, in fact, take place. Only then do we take the second step, either completing the mental certification process (yes, fraud!) or rejecting it (what? no way). Unfortunately, while the first step is a natural part of thinking—it happens automatically and effortlessly—the second step can be easily disrupted. It takes work: We must actively choose to accept or reject each statement we hear. In certain circumstances, that verification simply fails to take place. As Gilbert writes, human minds, “when faced with shortages of time, energy, or conclusive evidence, may fail to unaccept the ideas that they involuntarily accept during comprehension.”
Our brains are particularly ill-equipped to deal with lies when they come not singly but in a constant stream, and Trump, we know, lies constantly, about matters as serious as the election results and as trivial as the tiles at Mar-a-Lago. (
According to his butler, Anthony Senecal, Trump once said the tiles in a nursery at the Palm Beach club had been made by Walt Disney himself; when Senecal protested, Trump had a single response: “Who cares?”) When we are overwhelmed with false, or potentially false, statements, our brains pretty quickly become so overworked that we stop trying to sift through everything. Psychologists call this cognitive load: our limited cognitive resources are overburdened. It doesn’t matter how implausible the statements are; throw out enough of them, and people will inevitably absorb some. Eventually, without quite realizing it, our brains just give up trying to figure out what is true.
But Trump goes a step further. If he has a particular untruth he wants to propagate—not just an undifferentiated barrage—he simply states it, over and over. As it turns out, sheer repetition of the same lie can eventually mark it as true in our heads. It’s an effect known as illusory truth,
first discovered in the ’70s and most recently demonstrated with the rise of
fake news. In its original demonstration, a group of psychologists had people rate statements as true or false on three different occasions over a two-week period. Some of the statements appeared only once, while others were repeated. The repeated statements were far more likely to be judged as true the second and third time they appeared—regardless of their actual validity. Keep repeating that there was serious voter fraud, and the idea begins to seep into people’s heads. Repeat enough times that you were against the war in Iraq, and your actual record on it somehow disappears.