Propaganda Is Normal, Folks
But Authoritarianism Is What Happens When It Stops Being Contestable
Roger Pielke has started a thoughtful series on propaganda and democracy that is well worth reading. His opening post lays out a sober diagnosis: modern politics is saturated with persuasion, framing, and narrative competition, and this is not a deviation from democratic norms—it is the system functioning as designed.
👉 https://substack.com/home/post/p-183453568
I want to extend that argument—not to restate it, but to add the political psychology that explains why propaganda works, why smart people are often especially susceptible to it, and why the line between democratic persuasion and authoritarian control is thinner than we usually admit.
Propaganda Is Not a Bug. It Is the Interface.
The word propaganda still conjures totalitarian states, censored media, and Orwellian ministries of truth. Fair enough: that imagery is historically grounded but analytically misleading. Propaganda is not synonymous with lies.
At its core, propaganda is strategic persuasion under conditions of asymmetric attention, identity, and emotion. Democracies have propaganda. Corporations have propaganda. Universities have propaganda. Journalists have propaganda. Activists have propaganda. States with free speech have propaganda.
You and I engage in persuasive framing every time we select which facts to highlight and which frames to use.
So, the real distinction is not whether propaganda exists, but whether it is pluralistic, contestable, and reversible—or monopolized, enforced, and fused with coercion.
Identity Is the Weak Nuclear Force of Politics
In a previous post, I described identity as the “weak nuclear force” of politics: a low-level, constant binding energy that quietly structures everything else.
👉 https://kylesaunders.substack.com/p/the-weak-nuclear-force-of-politics
Facts, arguments, and institutions orbit identity, and they (very) rarely escape it.
So, propaganda works not because people are ignorant, but because persuasion attaches itself to identity. Political beliefs are not just opinions—they are social signals about who we are, who we trust, and who we belong with.
Once persuasion is filtered through identity, evidence becomes ammunition. The question is not “Is this true?” but “Is this ours?”
Smarter People Are Often More Susceptible To This
One of the most uncomfortable findings in political psychology is that political knowledge does not reliably protect against identity-consistent misinformation, and in some domains can actually amplify polarization.
Dan Kahan and colleagues have shown that higher numeracy and scientific literacy can increase identity-consistent polarization: cognitively sophisticated individuals are better at defending group commitments with selective evidence. (**SEE BOTTOM FOR ADDENDUM ON THIS POINT, 2 FEB 26**)
👉 https://www.culturalcognition.net/blog/author/kahan/
In one formulation, politically motivated reasoning is the tendency to credit or dismiss evidence in ways that reinforce group identity. Intelligence supplies better lawyers for the brain’s preferred conclusion.
This pattern also appears in conspiracy research and partisan misperceptions. Political knowledge provides more tools for rationalization, not necessarily more epistemic humility.
Propaganda Is Dopamine
Propaganda is not just cognitive (about beliefs and arguments). It is affective (about emotions that drive attention, memory, and action).
Political communication that triggers anger, fear, pride, contempt, or moral outrage generates engagement, social status, and psychological reward. This is not accidental. Emotions motivate participation and signal group membership.
Social media platforms, optimized for engagement, amplify precisely these emotional signals. Simply put, content that triggers affect spreads, and content that triggers reflection does not.
The result is a dopamine economy of politics: persuasion is selected not for truth, but for emotional reward and social signaling.
Social Media Didn’t Invent Propaganda. It Industrialized It.
Every mass medium reshapes persuasion. Radio favored demagogic oratory. Television favored charisma and visuals. Social media favors outrage, identity signaling, and rapid narrative cascades.
The key shift is decentralization. Propaganda is no longer just top-down messaging from elites. It is memetic, iterative, and algorithmically selected. Elites still matter, but networked amplification determines which narratives survive.
Marshall McLuhan’s insight that “the medium is the message” applies directly here: media architectures shape political cognition, not just political communication.
👉 https://kylesaunders.substack.com/p/marshall-mcluhan-and-the-political
From Propaganda to Authoritarianism
Propaganda alone does not produce authoritarianism. Democracies are saturated with persuasion. What matters is institutional constraint and narrative competition.
Authoritarian systems differ in three structural ways.
First, narrative monopoly.
In democracies, propaganda is pluralistic. Competing elites, parties, media organizations, and civil society actors contest narratives. In authoritarian systems, the state suppresses competing frames. The goal is not persuasion alone but the elimination of alternative interpretive lenses.
Second, coercive reinforcement.
In democratic politics, persuasion is largely decoupled from direct coercion. In authoritarian systems, propaganda is fused with sanctions—legal, economic, or physical. Belief becomes instrumentally rational because disbelief carries penalties.
Third, epistemic centralization.
“Epistemic” simply means how societies decide what counts as knowledge, truth, and credible authority. Democracies distribute truth-arbitration across institutions—courts, universities, scientific bodies, media, elections. Authoritarian systems centralize it. The regime becomes the ultimate arbiter of truth.
Historically, Maoist China’s “mass line” was propaganda as participatory mobilization—but always within Party-defined boundaries. Soviet and contemporary Russian narratives about “color revolutions” frame popular protest as foreign conspiracy, delegitimizing dissent and justifying repression. The propaganda content varies, but the structure is the same: narrative monopoly + coercion + centralized truth authority.
The danger is not that democratic elites frame issues. The danger is when framing becomes fused with coercion and epistemic closure—when dissent becomes not merely wrong but illegitimate.
Epistemic Legitimacy (In Plain English)
Across several recent posts, I’ve been circling a related theme: epistemic legitimacy, or who we trust to arbitrate reality.
In We Live Inside the Experiment Now
👉 https://kylesaunders.substack.com/p/we-live-inside-the-experiment-now
I argued that modern institutions are being stress-tested by rapid technological and social change, and that trust in shared truth is eroding faster than institutional capacity to adapt.
In Marshall McLuhan and the Political Subject
👉 https://kylesaunders.substack.com/p/marshall-mcluhan-and-the-political
I argued that media environments shape political cognition itself—what kinds of arguments, identities, and institutions feel plausible.
And in Artificial Intelligence and Politics
👉 https://kylesaunders.substack.com/p/artificial-intelligence-and-politics
I suggested that institutions are increasingly tempted to outsource contested judgments to AI systems, not because AI is normatively superior, but because human legitimacy is in the process of collapsing.
Propaganda is not the crisis. The crisis is that we no longer agree on who gets to say what is true—or why we should believe them.
When that collapses, propaganda becomes harder to distinguish from ordinary persuasion—and ordinary persuasion then begins to look like propaganda.
Tribal Psychology in High-Engagement Political Environments
Political psychology increasingly suggests that extreme moralized activism selects for certain personality traits—narcissism, Machiavellianism, and psychopathy—sometimes called the “dark triad.” These traits are associated with status-seeking, manipulation, and low empathy.
Social media platforms reward precisely these traits: high-arousal moral content, performative outrage, public shaming, and dominance signaling. Moral conflict becomes engagement. Engagement becomes status. Status becomes power.
Steve Stewart-Williams has written about how political polarization exaggerates enemy threat and entrenches tribal misperception, creating feedback loops of moral hostility and distorted social perception:
👉
This dynamic is not partisan. It is human psychology interacting with modern incentive structures, and in some circumstances it is weaponized against even highly informed political actors.
To be clear, this does not mean most activists fit this profile. It means platform incentives can disproportionately reward those who do.
We Are All Being Propagandized
The uncomfortable conclusion is that propaganda is ubiquitous. Elites frame narratives, institutions defend legitimacy, activists moralize, media chases outrage, algorithms amplify it, and individuals rationalize what fits who they think they are.
None of this requires conspiracy. It emerges naturally from incentives, institutions, and human psychology.
The danger is not that “everything is an op.” The danger is that nothing is trusted—including institutions that historically helped stabilize disagreement—so everyone starts looking for a “psychological operation,” a “psyop,” or simply an “op.”
So What Do We Do?
There are no silver bullets. But there are things that keep the system from tearing itself apart.
Democracies need institutions that can still function as referees, even when half the country is furious at the call. They need friction—speed bumps on viral cascades that turn ordinary rumors into moral panics. And they need restraint: citizens and elites resisting the temptation to treat every policy disagreement as an existential emergency.
They also need citizens who are, at least occasionally, willing to entertain the possibility that their side might be wrong.
The danger is not propaganda. The danger is propaganda in a world where no one trusts any referee and every claim is assumed to be an op.
The Deeper Problem: Truth Without Arbiters
So, we live inside the experiment now, and the more conscious we can become of that fact, the more we can see the manipulation.
Media environments are accelerating. AI systems increasingly mediate knowledge. Institutions seek algorithmic objectivity to stabilize contested norms. Citizens navigate identity-driven belief loops.
Propaganda is not necessarily the crisis. The problem is not that we disagree about facts. It’s that we no longer agree on who gets to decide what counts as a fact.
Democracies survive propaganda when propaganda is pluralistic and constrained. They fail when narrative control becomes monopolized—or when trust collapses so completely that every claim is treated as manipulation.
We are not there yet. But the incentives are moving in that direction.
Why This Matters for Democracy
To reiterate, democracy is not the absence of propaganda. Democracy is the presence of counter-propaganda, institutional contestation, and procedural legitimacy.
Democratic systems assume disagreement. They assume elites will persuade. They assume citizens will be biased. What they cannot survive is epistemic collapse—a world where no institution, process, or procedure is trusted to resolve disputes about reality.
When that happens, persuasion slides toward coercion, legitimacy toward raw power, and politics toward factionalized truth regimes. Authoritarianism does not begin with propaganda. It begins when propaganda becomes the only story allowed—and when no one believes there is any neutral process left to decide what is real.
Further Reading (and I’ve got whole syllabi if you want ’em!)
Jacques Ellul, Propaganda: The Formation of Men’s Attitudes (1965).
A classic sociological and psychological theory of propaganda as a structural feature of mass society.
https://en.wikipedia.org/wiki/Propaganda:_The_Formation_of_Men%27s_Attitudes
Anthony Pratkanis & Elliot Aronson, Age of Propaganda (1992/2001).
A social-psychological deep dive into persuasion techniques used by advertisers, politicians, and demagogues.
https://archive.org/details/ageofpropagandae00prat
Edward Bernays, Propaganda (1928).
The founding text of modern public relations and elite opinion management.
https://en.wikipedia.org/wiki/Propaganda_(book)
Dan Kahan et al., “Motivated Numeracy and Enlightened Self-Government.”
Seminal work showing that cognitive sophistication can increase polarization under identity pressure.
https://culturalcognition.net
Brendan Nyhan & Jason Reifler, “When Corrections Fail” (Political Behavior, 2010).
Classic evidence on why factual corrections often fail to change misperceptions.
https://link.springer.com/article/10.1007/s11109-010-9112-2
Steve Stewart-Williams, Imaginary Enemies (Substack).
A readable synthesis on polarization, tribal misperception, and moralized politics.
ADDENDUM 2 FEB 26
A Note on Evidence and Overstatement
A fair critique of the propaganda/misinformation literature, raised by Ben Tappin in response to the earlier post, is that some early, attention-grabbing findings I cited have not replicated cleanly. That was sloppy on my part, so I want to correct the record.
The most prominent example is the so-called backfire effect, originally reported by Brendan Nyhan and Jason Reifler. In their early work, corrections sometimes appeared to increase misperceptions among strong partisans:
Brendan Nyhan and Jason Reifler, “When Corrections Fail: The Persistence of Political Misperceptions,” Political Behavior 32, no. 2 (2010): 303–330.
https://link.springer.com/article/10.1007/s11109-010-9112-2
Subsequent research, using larger samples and improved designs, has found that this effect is rare rather than typical. In many contexts, factual corrections do move beliefs in the direction of accuracy:
Thomas Wood and Ethan Porter, “The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence,” Political Behavior 41, no. 1 (2019): 135–163.
https://link.springer.com/article/10.1007/s11109-018-9443-y
Similarly, some of the strongest interpretations of Dan Kahan’s early work on numeracy and polarization have been refined over time. The core finding—that identity can shape how evidence is interpreted in politically charged contexts—remains well supported, but the most dramatic claims about cognitive sophistication increasing polarization are now understood to be highly conditional.
Dan M. Kahan, Ellen Peters, Erica Cantrell Dawson, and Paul Slovic, “Motivated Numeracy and Enlightened Self-Government,” Behavioural Public Policy 1, no. 1 (2017): 54–86.
https://www.cambridge.org/core/journals/behavioural-public-policy/article/motivated-numeracy-and-enlightened-selfgovernment/7E3C6E6E0D88FBBEC6A2AAB9E1E7E8A6
The broader takeaway is more modest—and more important. Political cognition is conditional. Sometimes accuracy dominates. Sometimes identity, incentives, and social context dominate. The question is not whether corrections ever work, but when they do, and when they fail to discipline belief.
That conditionality is exactly what makes modern propaganda and activist tactics so effective in high-salience, identity-laden environments.