When everything sounds true: Free speech and the fracturing of reality

Why is our society so divided between right and left? The answer is complicated. Today’s polarization is the product of many forces converging at once. Nowhere is this more visible than in our debate about free speech.

Legally, free speech is the fundamental right to express ideas and opinions without government interference, punishment, or censorship. This liberty has served society well for centuries — indeed, the free flow of information is a cornerstone of democracy.

Yet the free flow of information is only part of the equation. What matters just as much is what kind of information is flowing, how it spreads, and how we make sense of it.

Let’s consider this last factor first. Each of us perceives the world through our own reality filters, shaped by our social, psychological, and physiological wiring. We are, for example, predisposed to seek patterns where none exist, to favor evidence that confirms what we already believe (confirmation bias), to give more weight to the first thing we hear (anchoring), and to remember vivid anecdotes more readily than dry statistics (availability bias). Social belonging also tilts our judgments: we instinctively trust what comes from our tribe and discount what comes from outsiders.

These tendencies are our default settings as humans, but communication tools can turn them into vulnerabilities. Every time communication technology has changed, societies have had to struggle anew to separate truth from fiction. In modern times, television, email, spam, clickbait, and social media have each forced a reset. Five centuries ago, the printing press did the same. As historian Adrian Johns recounts in The Nature of the Book, the flood of pamphlets, books, and broadsides created both opportunity and chaos. Readers couldn’t always tell which texts were authoritative, which laws were genuine, or which works were outright forgeries. It took generations of publishers, editors, and institutions to build systems of trust, from licensing regimes to stamps of authentication on title pages.

Today we face a parallel moment. Modern communication systems exploit our cognitive wiring, turning mental shortcuts into active engines of division. All the forces we’ve long known — disinformation spreading lies deliberately, misinformation spreading them carelessly, gaslighting insisting that what you see with your own eyes is false, false equivalency giving every claim equal weight, and belief perseverance keeping people loyal to ideas even after they’ve been debunked — now move at the speed of the algorithm, reaching millions instantly. Layered together, these forces don’t create one universal distortion. They create many distortions, a hall of mirrors where truth and reality are reflected and refracted beyond recognition.

The consequences of these dynamics go far beyond confusion. Today, friends, families, and neighbors live in their own custom-tailored information worlds. In one, Anthony Fauci is a hero. In another, he is a criminal mastermind. In one, Charlie Kirk is a beacon of conservative thought. In another, he is a hatemonger. In one, Donald Trump is the second coming of Jesus. In another, he is the antichrist. In one, COVID vaccines saved millions of lives. In another, they killed millions. We no longer share a baseline of truth. And algorithms feed us the posts and videos most likely to keep us siloed — not the ones most likely to be accurate. Division isn’t an accident; it’s the business model.

This isn’t just an American problem. A recent Pew Research Center survey found that while strong majorities worldwide value free speech and a free press, far fewer believe they actually have these freedoms in their own countries. At the same time, majorities in over half the nations surveyed say that “made-up news” is a very big problem, with more educated citizens far more concerned than less educated ones. Freedom of speech protects our right to expression, but it does not protect us from consuming lies.

The result is a society at war with itself, where families split and communities fracture. Political opponents stop seeing each other as wrong and start seeing each other as evil. That shift — from disagreement to dehumanization — is what makes intimidation possible. If Fauci is just another opinion-holder, then calling him a liar is fair game. If he is a criminal, then prosecuting him makes sense. If he is a traitor, then threatening him feels righteous. Once we lose track of objective reality, all actions feel justified.

It would be easy to stop here and despair. Yet we can’t. When we turn off the noise, even for something as simple as cheering together at a Little League game, we are reminded that far more unites us than divides us. No one wants to end cancer research in the name of politics. No one wants to live in a society where masked law enforcement officers can arrest citizens without cause. No one wants to see farms and small businesses taxed out of existence. Good, reasonable people can still find common ground and work together for the common good.

And examples of this are everywhere. Countries are collaborating to bypass the economic damage of US policies. State leaders are building alliances to create evidence-based health standards. Journalists are fighting for truth, even as the government threatens their businesses and safety. These actions matter, because they remind us that division is not inevitable — it is engineered. And what is engineered can be re-engineered.

Our recent history suggests where to start. In the broadcast era, Americans faced another crisis of trust. After World War II, the growing influence of television gave partisan voices national reach, so the Fairness Doctrine, established in 1949, required broadcasters to cover controversial issues in a manner that was “honest, equitable, and balanced.” The Reagan administration repealed the rule in 1987, arguing it was no longer necessary in a world of expanding channels and that it restricted broadcasters’ First Amendment rights. Analysts note, however, that the repeal wasn’t the only reason partisan talk flourished. Technological change — particularly the falling costs of satellite broadcasting — gave stations the ability to syndicate new voices nationally, while subsequent deregulation, especially the Telecommunications Act of 1996, accelerated media consolidation and the spread of partisan content. The end of the doctrine didn’t single-handedly create Rush Limbaugh, but it removed a guardrail at the very moment technology made national syndication possible.

Calls to reinstate the Fairness Doctrine still surface, though most legal scholars agree the chances of it coming back are essentially zero. Courts today would almost certainly strike down any attempt to compel “fairness” in cable, satellite, or online media. Even if we cannot — and should not — revive a mid-20th century rule, societies may need to design new ways to signal that platforms calling themselves “news” are held to some baseline standard of accuracy, much as doctors, lawyers, and engineers must meet minimum standards of competence before they can practice. Restoring some measure of accountability in how news is defined and delivered would help rebuild the trust that free societies depend on.

To be clear, this is going to feel a bit like putting the genie back in the bottle. Yet we’ve reached the point where, for many people, infotainment is indistinguishable from actual news, and for the sake of society, we need to make truth easier to recognize once again. A virologist’s voice on pandemics should carry more weight than a radio show host’s. A climatologist’s findings should not be presented as equal to a coal executive’s spin. Balance is not fairness when the evidence is lopsided. Pretending otherwise is journalistic malpractice, and it corrodes the very foundation of public trust. The internet is still filled with a million avenues for opinion and advocacy. But the news outlets our cable companies carry, and to which people give more weight, should be held to a higher standard.

Even if we labeled certain programming as “fake news,” though, would viewers care, or would the label simply harden their allegiance? Fox offers a telling example. It has been the most-watched news channel in America for two decades running. In the 2020 defamation suit brought by Karen McDougal, Fox’s attorneys successfully argued that a “reasonable viewer” would not take Tucker Carlson’s on-air claims as statements of fact, reinforcing the perception that much of its programming functions more as commentary or entertainment than as straight news. In effect, Fox conceded that one of its most provocative shows was not news in any traditional sense. Yet even that admission did nothing to blunt its audience loyalty or diminish its influence, a sign that the label “news” has lost much of its meaning.

So perhaps labels alone aren’t enough. The answer may lie in some combination of labeling and regulation, requiring the distributors who carry these programs to make clear what they are. We’ve been here before. When the printing press destabilized trust, publishers tried to restore confidence by stamping books with official marks of authentication, licenses issued by governments that signaled legitimacy. It was imperfect and often political (which would be incredibly problematic in today’s America), but it gave readers a way to distinguish sanctioned texts from forgeries. Imagine a modern parallel in which opinion shows masquerading as news were prefaced with a disclaimer, the way prescription drug ads must warn viewers about side effects. It may sound fantastical, but we will not win the fight against disinformation by continuing to treat it with kid gloves. The problem is real, it is endemic, and it is corroding the foundation of society.

Another target for re-engineering is the algorithms themselves. If social media algorithms are driving disinformation and division, they need to be regulated; choosing not to regulate them because outrage is profitable is reckless. If social media companies were coal plants spewing toxic smoke, they would be regulated without hesitation. Their business model rewards outrage and division because anger keeps us clicking longer than accuracy does. Holding these companies accountable (in the “let’s work together on solutions” sense of the word) for the damage their products cause isn’t censorship — it’s common sense.

Defenses matter too. Prebunking — also called inoculation — is one of the most effective. It works on the premise that people are less vulnerable to misinformation if they are exposed in advance to the tricks that will be used against them. Like a vaccine, prebunking offers a weakened version of the false claim, not to persuade but to show how the deception works. This forewarning creates mental antibodies. When people later encounter the actual claim, they are more likely to recognize it as manipulative. Research shows prebunking is more effective than debunking after the fact, because once false ideas take hold, belief perseverance makes them stubbornly hard to shake.

Information literacy is another defense — a civic survival toolkit rather than a single tactic. True literacy means knowing how to question sources, distinguish reporting from opinion, understand evidence and uncertainty, and recognize the difference between fact-based expertise and partisan spin. It also requires soft skills: empathy to hear others out, humility to admit when we might be wrong, curiosity to ask for evidence, and patience to resist instant outrage. Some of this training is already happening, but unevenly. In K–12 schools, over 90% have library media centers, yet only about 60% employ full-time certified librarians, and nearly 30% now report having no librarian at all. At the college level, information literacy is embedded deeply in library science programs and traditionally in the humanities, especially history, but these fields have been in steep decline as universities shift toward STEM. For the general public, there are bright spots — local library workshops, community-based trainings, even pilot efforts by platforms to label or prebunk misleading content — but no widespread system. Expanding these efforts is not about inventing information literacy from scratch. It’s about scaling what already works.

At the same time, the institutions entrusted with discovering and protecting truth should consider doing more to showcase how they serve the public. Journalists, historians, scientists, universities, libraries, museums, and government agencies all play vital roles in the truth ecosystem, and all are under withering attack from political operatives who reject facts that don’t serve their agenda. None of these professions can take trust for granted. The more the public sees that their processes are messy but self-correcting, open to challenge, and accountable, the harder it is to caricature them as conspiratorial or corrupt.

The path forward is not simple. It requires patience, humility, and a willingness to confront uncomfortable truths — not just about politicians and media companies, but about ourselves. We are wired to trust our tribes, cling to our beliefs, and choose loyalty over logic. Knowing this doesn’t make us weak. It makes us human. The challenge is to build systems that work with our nature rather than exploit it — to create an information culture where truth still has a fighting chance. Without these new and improved systems, both democracy and science remain at risk. With them, we regain the possibility of a less divisive future where we can once again share the same reality.


This essay was edited on 9/21/26.