When Truth Loses

Why Explaining Yourself Doesn’t Work, and What to Do Instead

Lies flash through the air like lightning. They can be quick, shocking, and impossible to ignore. In contrast, truth typically arrives later, like distant thunder, rolling in after the damage is done. Our political climate is filled with these storms. Long ago, politicians recognized the power of a fast, hard lie. It’s frustrating, but lies don’t wait for context or accuracy. They don’t pause for facts. They move swiftly, lodging themselves in people’s minds. Once they’re in, they start shaping belief systems. By the time the truth arrives, it’s almost impossible to dislodge the first impression. That’s why post-debate fact-checkers, no matter how skilled, rarely change opinions. No one’s impressed by thunder after the flash.

Have you ever seen what happens when a politician is hit with a bold, unexpected lie? They often freeze. Their eyes widen, their mouth tightens, and they pause. It’s not because they don’t have a response; it’s because their brain has momentarily locked up. Psychologists call this an “acute stress response.” We all know about “fight or flight,” but there’s a third option: freeze. It’s the brain’s way of pausing to assess a threat. In that split second of silence, the liar gains control of the moment. They look strong and confident. Their opponent looks shaken and weak as they go silent.

Silence is dangerous because it allows lies to move into people’s subconscious. And when lies are repeated, they begin to feel true, a phenomenon psychologists call the “illusory truth effect.” It happens because of how the human brain stores information. The brain doesn’t file facts away like a cabinet; it builds knowledge through associations, linking one idea to another and layering those associations over time. Think of it as a pyramid: early impressions, repeated phrases, and emotionally charged messages form the foundation. New information gets stacked on top. The more emotional and repetitive the message, the stronger the associations become, and the taller and wider the pyramid grows.

But here’s where it gets tricky. If the foundation is built on lies, then everything above it, including opinions, voting behavior, and identity, rests on unstable ground. When the truth finally arrives and begins to tug at that base, it threatens to topple the entire structure. That’s when people experience the psychological discomfort known as “cognitive dissonance”: the mental friction that arises when new information challenges long-held beliefs. Rather than endure that discomfort, many people reject the new information, rationalize it away, or, worse, attack the messenger. Why? Because it’s easier to maintain the lie than to rebuild the truth from the ground up. Over time, the belief pyramid can harden into a cult mindset. At that point, even irrefutable evidence gets brushed off as part of a conspiracy. That may be why wild conspiracy theories are so popular: they shield the belief structure from collapse, forming a defensive hardening around the original belief. Some people may even turn to violence if the truth is pressed on them too forcefully.

So, how do you respond when the lies are flying fast?

First, understand this: if you’re forced to explain yourself, you’ve already lost the argument. People say they want “receipts,” but often it’s just a trap. By the time you dig out your sources, they’ve tuned you out. Explanations are slow. Lies are fast. It’s an unfair fight, but there are ways to win it.

One strategy is to break the rhythm. If you notice your opponent trying to box you in with loaded hypotheticals, demanding a “yes” or “no” to questions with no good answer, don’t play their game. Disrupt it. For example, if someone begins painting a tragic scenario about an undocumented immigrant causing a horrible car crash, don’t follow their script. Instead, fight absurdity with absurdity. Interrupt the story with something like, “What if the car was a Tesla on autopilot? Could we have avoided the crash by deporting Elon Musk?” This jolts listeners out of the narrative trap and exposes the ridiculousness of the whole setup. What you’re really doing is triggering the brain’s BS detector before the lie can be completed. Once people flag something as BS, they reject it outright and never let it settle in.

Another tactic is called “symbolic disruption.” When your opponent launches a rapid-fire stream of lies, don’t chase each one, because you’ll be caught in the trap of endlessly explaining yourself. Instead, smile, bend over, and roll up your pant legs. You’ll probably get some confused expressions, and at that moment, explain: “The BS is getting deep in here.” That simple gesture gets a laugh, disarms the attacker, and flips the dynamic. Most importantly, it interrupts the lies before they can settle in.

Make no mistake: we’re not just fighting bad information. We’re fighting structures of belief. We’re trying to prevent the building of corrupted knowledge pyramids through repetition, emotion, and group identity. You don’t win those battles with data alone. You don’t win with long-winded explanations. You win with confidence, strategy, and timing. You win by understanding that the battlefield is not in the facts but in the brain.

Lies will always move faster, but you can disrupt them and keep them from doing damage. And the truth, if delivered with clarity and force, can find its way into people’s foundations of knowledge.