Is the Republican Party a cult? When our belief bubbles become cages

As discussed in SCI’s earlier essay When Everything Sounds True, we live in a world of fractured realities. Friends and families divided by politics, vaccines, and climate change are no longer just disagreeing—they’re clinging to entirely different versions of truth itself. Each side looks at the other and sees delusion.

We’ve always seen the world through different filters, of course, and every era has had its fringe beliefs. What’s new today is that many of us have lost sight of objective truth, at least when it comes to understanding the world around us. As a society, we have misused the awesome potential of our shared public square and have instead constructed an infinite maze of private faiths.

Many factors contribute to this maze building—our cognitive wiring, our social instincts, and the design of our information systems. But when do our information processing foibles and separate realities morph into something darker? Specifically, when do these realities turn into cults, and how can our knowledge of cults help explain what’s happening to information dynamics in modern society?

To explore this idea, let’s look more closely at the modern Republican Party—not because it’s the only example of cult-like behavior in public life, but because it’s the most visible, consequential, and well-documented case of a mainstream movement that has adopted many of the psychological and social trappings of a cult. Is it an actual cult, though? When we think of cults, we usually picture the extremes, like Jonestown, Heaven’s Gate, and the Manson Family. But sociologists and psychologists have long argued that cults exist on a spectrum. At one end are benign communities bound by shared purpose or rituals. At the other end are movements that isolate followers, demand obedience, and merge personal identity with ideology.

The sociologist Janja Lalich, who has spent decades studying high-control groups and cult recovery (see Take Back Your Life), helps us see how many cultic systems become self-sealing. In such systems, every ritual, rule, and group activity is calibrated to reinforce the same interpretive frame and resist disconfirmation. Over time, the “choice” available to members becomes bounded not by force but by the narrowing of acceptable belief, information sources, and emotional expression.

This bounded-choice structure echoes the criteria identified decades earlier by psychiatrist Robert Lifton, whose 1961 work on thought reform remains foundational (see Thought Reform and the Psychology of Totalism). Lifton described eight criteria that mark the same pattern: control of communication, sacred science, demand for purity, confession, mystical manipulation, doctrine over person, the primacy of the group over the individual, and the dispensing of existence—deciding who is worthy or damned.

One important dynamic that both Lifton and Lalich imply, but do not fully isolate, is the process of “othering.” In Lifton’s model, othering can be viewed as an essential subset of both “dispensing of existence” and “mystical manipulation”—the group’s means of defining who belongs and who does not—but later researchers have treated it as central in its own right.

Othering, the systematic portrayal of outsiders as morally inferior or dangerous, is now widely recognized as a cornerstone of high-control group psychology. Social psychologists Henri Tajfel and John Turner, whose Social Identity Theory first described how group belonging produces “ingroup favoritism” and “outgroup derogation,” showed that people derive self-esteem not just from belonging but from believing their group is superior. Cults weaponize that instinct. José Marques and colleagues later demonstrated the Black Sheep Effect—the tendency to judge ingroup deviants more harshly than outsiders—as a way to enforce purity. Sociologists of religion such as Roy Wallis and Rodney Stark extended this idea in boundary-maintenance theory, arguing that sects and cults sustain coherence by policing the moral border between “us” and “them.” Contemporary scholar-practitioners like Steven Hassan (in his BITE Model of Behavior, Information, Thought, and Emotional control) note that modern cults often replace physical isolation with psychological walls, achieved by demonizing external voices and framing dissent as betrayal. In this sense, othering is not merely an outcome of cultic thinking—it is the engine that powers it.

In modern Republicanism, that engine runs constantly: academics become elitist enemies, cities become war zones, immigrants become criminals, Democrats become traitors. The moral boundary hardens, and disagreement ceases to be political; it becomes heresy.

| Speaker / date | Quote | Target of “othering” | Source |
| --- | --- | --- | --- |
| Donald Trump — Jun 16, 2015 | “They’re bringing drugs. They’re bringing crime. They’re rapists.” | Mexican immigrants | TIME transcription of announcement speech |
| Donald Trump — Jan 11, 2018 | Referred to Haiti and African nations as “shithole countries.” | Foreign countries / immigrants | Associated Press report |
| Donald Trump — Feb 24, 2016 | “I love the poorly educated.” | Anti-elite / anti-expert signaling | Quartz transcript; Reuters coverage |
| Donald Trump — Feb 17, 2017 (and repeatedly) | Called the press the “enemy of the American people.” | Independent media | Harvard Kennedy School analysis; Committee to Protect Journalists |
| Donald Trump — Nov 11, 2023 | Described domestic political opponents as “vermin” to be “rooted out.” | Democrats / critics | Washington Post; ABC News |
| Marjorie Taylor Greene — Apr 2, 2023 | “Democrats are a party of pedophiles.” | Democrats / liberals | ABC News (“60 Minutes” segment) |
| Donald Trump — Jul 2019 (re: four congresswomen) | Said they “hate our country” and should “go back” to where they came from. | Democratic lawmakers (women of color) | Washington Post |
| Donald Trump — Aug/Sep 2025 | Called Baltimore (and Chicago) “hellholes,” pledging federal intervention. | Democratic-led cities | WBAL-TV (NBC affiliate); Maryland Matters |


Even when considering Republicanism just through Lifton’s cult lens, we can observe many cultish analogues: distrust of outside media, insularity in internal messaging, the delegitimization of dissenting voices, and the social cost of deviation. Each reinforcement subtly narrows the field of reasonable thought, making exit emotionally and cognitively harder.

Applying Lifton’s criteria directly: communication control appears in the synchronized narratives of Fox News, Truth Social, and a constellation of allied media that filter dissent and harmonize talking points. Sacred science shows up in the unchallengeable pronouncements of President Trump, treated by supporters as revealed truth rather than opinion. Demand for purity drives RINO purges and online mobbing, where deviation invites public shaming or banishment. Confession takes the form of ritualized repentance from figures who momentarily stray, followed by orchestrated forgiveness upon renewed loyalty. Mystical manipulation reframes ordinary events as divine signs or evidence of hidden enemies, reinforcing the sense of cosmic struggle. Doctrine over person requires members to subordinate lived experience and conscience to the narrative. Primacy of the group dictates that moral worth depends on obedience to the movement. And dispensing of existence casts defectors and critics as enemies of the state, communists, or traitors. These mechanisms, expressed through mass media rather than robes and rituals, achieve the same psychological cohesion that once required physical isolation.

| Principle | Trait / belief | % R | % D | Source |
| --- | --- | --- | --- | --- |
| Control of communication & sacred science | Scientists are dishonest¹ | 48 | 20 | Pew Research Center: Public Trust in Scientists (2024) |
|  | The mainstream media intentionally lies | 75 | 30 | Gallup: Trust in Media at New Low (2025) |
|  | It’s okay to delay or skip childhood vaccines | 26 | 5 | KFF: Trust in Public Health & Vaccines Falls (2025) |
|  | January 6 rioters were protecting democracy | 52 | 4 | ABC / Ipsos (2022); Newsweek summary |
| Demand for purity, confession & doctrine over person | Party officials who criticize the president should be ostracized | 63 | 42 | Pew: Partisan Hostility / Two-Party Frustration (2022) |
|  | The FBI helped organize or encourage Jan 6 attack | 34 | 13 | Washington Post / UMD Poll (2024) |
|  | Trump bears no responsibility for Jan 6 | 78 | 11 | Newsweek summary of YouGov / Ipsos (2023) |
|  | Trump won the 2020 election | 77 | 6 | CBS News / YouGov Poll (2025) |
| Mystical manipulation & dispensing of existence | Trump dodged assassination because God wanted him president again | 60 | — | Fox News Poll (Apr 2025) |
|  | The U.S. should be declared a Christian nation | 61 | 17 | PRRI / Brookings Christian Nationalism (2023) |
|  | Public schools are harming children’s morality | 71 | 19 | Pew Research Center: Americans’ View of Public Education (2024) |
|  | Violence may be necessary to save the country | 28 | 8 | PRRI: Threats to Democracy Survey (2024) |
|  | America’s best days are behind us | 67 | 33 | CBS News Poll (2024) |

¹ Inverted from “scientists are honest,” which 52% of Republicans and 80% of Democrats agreed with; shown here as the complement for consistency across negative frames.

To be clear, not every form of passionate belonging is pathological. Human communities run on shared stories, rituals, and symbols. We cheer for teams, volunteer for causes, and join organizations that give us meaning. These can be healthy, pro-social expressions of identity—what sociologists describe as the benign end of the cult spectrum. The danger begins when belonging demands blindness, when dissent becomes treason, when loyalty outranks evidence, and when leaders claim moral infallibility.

Unfortunately for all of us, the cult trap is easy to fall into. The human mind is perfectly equipped for this kind of confinement. We are tribal animals who evolved to survive in small, cooperative groups. Trusting insiders and distrusting outsiders kept us alive. But modern culture, with its infinite channels of information, has turned those instincts against us. Our cognitive biases do the rest. Confirmation bias makes us seek evidence that flatters our existing beliefs. The backfire effect (though its strength is debated by researchers) can cause contradictory facts to strengthen those beliefs. Social identity theory tells us we derive self-worth from the groups we belong to. Once our worldview becomes fused with who we are, changing our mind feels like losing ourselves.

Accelerating our fall, technology today has supercharged this dynamic. Social media platforms are not neutral mirrors of reality. They are persuasion engines tuned for engagement. Lies and outrage spread faster than truth and humility because they keep users clicking. Each click refines the algorithm’s picture of what we want, narrowing our view of the world until the only voices left are those that sound like ours.

The psychologist Philip Zimbardo, best known for the Stanford Prison Experiment and his later work on situational power in The Lucifer Effect, reminds us that powerful systems of influence are seldom magical. Instead, they are escalated everyday pressures. His research shows how ordinary people under institutional and situational constraints can yield to authority and group norms, even against their better judgment. In the digital realm we see intensified versions of the same forces: algorithms that reward conformity, social feedback loops that punish deviation, and peer norms that define acceptable belief. These are not exotic control techniques. They are the amplification of influence at scale.

So, power can shape behavior easily. But if we have the luxury of pulling back the curtain, we can ask who is wielding this power, and to what end? Leadership is the hinge on which high-commitment groups swing toward service or harm. Sociologist Max Weber called this charismatic authority—influence that rests on a leader’s perceived exceptional gifts rather than on rules or tradition—and it is uniquely powerful at remaking norms (see Economy and Society). Contemporary leadership research adds that leaders succeed when they cast themselves as the embodiment of a shared “us,” define who “we” are, and use rituals and language to turn identity into action (see S. Alexander Haslam, Stephen D. Reicher, & Michael J. Platow, The New Psychology of Leadership). When that identity is built around threat and purity, conformity hardens and dissent collapses. In cultic settings, Lalich shows how charismatic leaders help “seal” the social system so members’ choices narrow to what the doctrine permits; Zimbardo reminds us this isn’t magic but the escalation of everyday influence under situational pressure.

Through this lens, the Republican Party’s cult-like turn was not inevitable, but it was foreseeable. It reflects decades of strategic rhetorical engineering and identity work—escalating in the 1990s GOPAC era, when Newt Gingrich and pollster Frank Luntz perfected the vocabulary of division. Their internal memos encouraged candidates to describe Democrats as “traitors,” “corrupt,” and “anti-family,” transforming politics from a contest of ideas into a struggle of moral purity (see GOPAC memo, “Language: A Key Mechanism of Control”). Over the ensuing decades, the continued use of demonization and fear-mongering elevated loyalty above deliberation, narrowed the band of acceptable thought, and gradually ostracized leaders who were insufficiently orthodox. As Jonathan Haidt showed in The Righteous Mind, once moral judgment fuses with group identity, reason becomes subservient to belonging; people defend the tribe, not the truth. The party’s rhetorical ecosystem—amplified by partisan media—thus evolved into a self-reinforcing moral community built on grievance and threat perception. Within that structure, it became nearly impossible for moderates to steer the party back toward openness or pluralism. The natural endpoint of that dynamic is a demagogue who thrives on grievance, demands purity tests, and vilifies defectors.

In both religion and politics, the organizational structure is rarely neutral, but its stewardship determines whether it heals or harms. Charismatic leaders can sanctify inclusion and humility, or weaponize certainty and fear. In the Republican Party’s case, the structure had already been primed for authoritarian appeal. As Steven Levitsky and Daniel Ziblatt note in How Democracies Die, when political institutions tolerate norm-breaking and treat rivals as existential enemies, they clear the runway for authoritarian leadership. Donald Trump did not invent this psychology; he simply personified it. His narcissism and performance skills made him the perfect embodiment of a party culture that had long elevated rhetorical dominance over actual political service and representation. Steve Kornacki’s The Red and the Blue traces this polarization back to the 1990s, when the quest for permanent campaigning replaced governing, and partisanship became a form of identity. And Rachel Maddow’s Prequel reminds us that movements built on fear of “the other” are not new in American politics; they have always tested the nation’s democratic immune system. What is new is the technological infrastructure that allows these impulses to instantly stress this immune system to the breaking point.

But citizens aren’t necessarily unwitting victims of a rigged system and unscrupulous leaders. As early as 1961, William Lederer warned in A Nation of Sheep that American democracy was eroding not from external enemies but from citizens’ willingness to be led. His “sheep” were not evil; they were complacent—content to outsource judgment to authorities and media gatekeepers. Six decades later, his metaphor feels prophetic. What Lederer feared—a society too distracted or deferential to resist manipulation—has been technologically perfected.

There are other dynamics and parallels to consider as well. Lalich warns that in bounded-choice systems, the real threat isn’t just authoritarian commands but the slow erosion of self-reference. Members are trained to interpret their own doubts through the dominant group lens. Zimbardo similarly cautions that moral disengagement can follow when individuals stop distinguishing between their own reasoning and the group’s logic. Together, these theories help explain why criticism from within is often greeted as betrayal and why many adherents find they cannot even articulate what would change their mind. “My congressman used to have a spine” has an actual psychological explanation, beyond the obvious ones of political survival and re-election pressure.

For some, the fusion of faith and politics adds yet another layer of entrapment. Sociologists of religion note that movements already organized around absolute truth and charismatic authority—such as certain strands of American evangelical Christianity—can easily merge spiritual and political devotion. This isn’t about theology but structure: the comfort of moral certainty, the discipline of obedience, and the social reinforcement of public piety. In these environments, displays of faith often become performative acts of loyalty that signal belonging as much as belief. The result can resemble a political revival, where allegiance to a leader is framed as fidelity to God and doubt feels like sin.

This performative fusion didn’t appear overnight. As historian Kristin Kobes Du Mez argues in Jesus and John Wayne, much of modern white evangelicalism has been transformed from a faith centered on humility and compassion into a culture of militant masculinity and divine entitlement. Over decades, evangelical media and leaders cultivated an ideal of the Christian strongman who protects “God’s nation” through dominance rather than service. This theology of power dovetailed with the politics of white grievance, producing what sociologists Samuel L. Perry and Philip S. Gorski call white Christian nationalism—a movement that equates American identity with white, conservative Christianity. In this worldview, secularism, feminism, racial equality, and immigration are not social trends but spiritual assaults. The result is a feedback loop: white Christian nationalism fuels Republican identity, and Republican politics sanctify that nationalism. Each masquerades as faithfulness to a higher calling, but together they form a self-reinforcing creed—a cult of Christianized nationalism that trades the teachings of Jesus for the promise of power.

President Trump in the Oval Office in 2025 as a circle of evangelical leaders lay hands on him (Official White House Photo by Molly Riley; public domain).

In recent years, scholars of radicalization have noticed how around the world, more and more political movements, not just Republicans, use cultic techniques like this—charismatic leaders who claim unique insight, narratives of persecution, purity tests, and the vilification of defectors. The danger in this approach is not that any one faction will turn into Jonestown but that entire societies will normalize cult logic, the conviction that only one side is virtuous and everyone else is evil. When this happens, facts lose their power, dialogue collapses, and the public sphere turns into a contest of moral tribes. We stop asking “What’s true?” and start asking “Who’s with us?”

Seen through this long arc, then, the story of the modern Republican Party is a case study in how democratic institutions can be eroded not by coups but by human behavior, media evolution, and applied psychology, with perhaps a hint of sinister intent for good measure (and unintended consequences as well, as the creation ultimately consumed its own members). Is there anything we can do to reverse this process and undo the damage it has caused (assuming we want to, of course)? If the mechanics of belief are so deeply wired, and our shared reality so fractured and unmoored, can we find our way back?

Cult-exit counselors have studied this question for decades, and their findings are strikingly relevant to today’s information crisis. First, confrontation rarely works. Telling someone they’re brainwashed only strengthens their defensive walls. What helps is curiosity—asking genuine questions that can help restore a person’s capacity for independent thought. “How do you know?” “What would change your mind?” “Can both of these things be true?” Each question reawakens the muscle of doubt.

Second, exit is easier when people have somewhere else to belong. Most cult members who leave don’t do so because they suddenly reject the doctrine. They leave because they find another community that meets their emotional needs without the manipulation. Likewise, rescuing people from information cults will require building healthier spaces for connection, offline and online, where curiosity rather than certainty earns status.

Finally, recovery takes time. Deprogramming isn’t just about rejecting false ideas but rebuilding the habits of critical thinking and trust that cult life erodes. On a social scale, that means investing in media literacy, civic education, and public institutions that model transparency and humility. It means normalizing the sentence “I might be wrong.”

None of this implies that strong conviction is inherently dangerous. Humans need belief systems. The question is whether those systems allow for inquiry, discovery, and doubt. Faith can coexist with humility. Cults cannot. A functioning democracy depends on this distinction. When ideology replaces inquiry, politics becomes religion by another name, and truth becomes heresy.

It’s also important to note that the cure for cult thinking isn’t necessarily more fact-checking or censorship. The psychologist Margaret Singer, one of the foremost experts on coercive persuasion (see Cults in Our Midst), once wrote that people leave cults when they rediscover their capacity to love and be loved outside the group. The same may hold for our fractured information world. People change not when they lose arguments but when they regain connection.

Rebuilding a shared sense of reality will therefore require a new kind of social infrastructure—spaces where disagreement is safe, where facts are respected but empathy is central, and where belonging doesn’t depend on uniformity of belief. The goal isn’t to force everyone into one narrative but to rebuild enough trust that truth can again be a conversation rather than a weapon.

We might think of this as reality repair—the slow, collective process of re-stitching the social fabric torn by decades of manipulation and mistrust. It begins not with technology or policy but with culture—with how we talk to each other, how we teach skepticism without cynicism, and how we reward humility instead of certainty.

Cults, in the end, are what happen when people lose faith in the possibility of shared truth and retreat into self-sealed worlds. If our societies are drifting that way, then the way out is the same as it ever was: curiosity, compassion, and the courage to create a world that rises above our fears and weaknesses and celebrates truth, respect, and our common humanity.


This post was edited on October 11 (the table was added) and October 12, 2025 (the paragraph on othering was added).