
Mechanized Propaganda: The Automation of Information Operations and Implications for U.S. Defense Doctrine

02.02.2026 at 09:05pm

Mechanized Propaganda: The Automation of Information Operations and Implications for U.S. Defense Doctrine is by Brad N. on LinkedIn. This piece was featured in Salve Regina University’s Active Measures Newsletter No. 367.

Subscribe to Active Measures from the Pell Center at Salve Regina University. Active Measures is a weekly newsletter and podcast on political warfare, influence, and information campaigns. Visit the Active Measures archive to browse previous top articles of the week.

Below are the Author’s Note, Executive Abstract, and Abstract.


Author’s Note: Why I Wrote This

This article began with a return to an old manual: FM 3-05.301 / MCRP 3-40.6A – Psychological Operations Tactics, Techniques, and Procedures. Chapter 11, “Propaganda Analysis and Counterpropaganda,” opens with a 1939 warning from Lee and Lee that propaganda is “a new means for rendering a country defenseless before an invading army,” and goes on to state that future adversaries are more likely to subvert U.S. foreign policy through sophisticated propaganda—aimed at their own populace and at international audiences—than to confront the United States through traditional military means.

We are now living inside the environment it predicted. The tools have changed—AI systems, platform algorithms, and automated visibility regimes—but the core insight has not: propaganda is an instrument of aggression aimed at cognitive terrain long before physical contact.

SMMA: When the Firewall Dropped

In the Pentagon’s own language, these are irregular warfare effects. DoDI 3000.07 (2025) defines irregular warfare as indirect, non-attributable, asymmetric campaigns and places MISO and operations in the information environment inside that frame, with explicit guidance to integrate social and behavioral science and AI/ML-enabled decision tools.

I wrote this piece because much of our doctrine still treats propaganda as messaging, while the systems shaping it have become mechanized, predictive, and autonomous.

My aim here is not to argue politics or point fingers, but to map the architecture: how the classical five tasks of propaganda have evolved into an AI-enabled kill chain, how perimeter defenses have become interior governance, and how coherence and ontology—not just facts—are now contested terrain.

This article is written for practitioners across defense, policy, and technology who are already working in this space but may not yet have a shared language for what they’re seeing. If it does its job, it will give commanders, analysts, and builders a clearer mental model for the battlespace they are actually operating in—and a starting point for the doctrine that will have to follow.

Executive Abstract

Propaganda in 2025 has evolved from persuasive messaging into mechanized preconditioning: AI-enabled systems that shape what populations are likely to perceive, trust, and act upon before a message ever becomes “content.” While legacy doctrine treats propaganda as a contest of narratives and rebuttals, the modern environment is governed by predictive models, algorithmic visibility regimes, and autonomous dissemination architectures that operate at sub-second latency and at population scale. The result is a shift from influencing opinions to engineering the conditions of belief formation.

This paper reframes FM 3-05.301’s Chapter 11 “Propaganda Analysis and Counterpropaganda” for the AI-saturated battlespace by updating the classical five-task framework—Detection, Analysis, Vulnerability Assessment, Counterpropaganda Development, and Dissemination—into an AI-augmented operational chain. Detection becomes pre-viral interdiction and latent signal identification; analysis becomes behavioral forecasting rather than semantic parsing; vulnerability assessment becomes high-resolution psychosocial mapping using aggregated data; counterpropaganda shifts from reactive messaging to upstream narrative inoculation; and dissemination becomes a contest over visibility as terrain, where attention and repetition substitute for verification. These evolutions compress the time available for human deliberation and render traditional counter-messaging increasingly symbolic unless paired with architecture-level interventions.
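For readers who think in structures rather than prose, the sketch below restates that mapping as a simple data structure. It is purely illustrative: the five classical task names and their AI-augmented counterparts come from the paragraph above, while the `Stage` class and the example effects are hypothetical glosses added here, not artifacts of any real system, dataset, or codebase described in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Stage:
    classical_task: str       # task as named in FM 3-05.301, Chapter 11
    ai_augmented_form: str    # evolution described in the paragraph above
    illustrative_effect: str  # hypothetical example, not a real capability

# Illustrative rendering of the updated five-task chain described above.
FIVE_TASK_CHAIN = [
    Stage("Detection",
          "pre-viral interdiction and latent signal identification",
          "flag a narrative before it crosses a virality threshold"),
    Stage("Analysis",
          "behavioral forecasting rather than semantic parsing",
          "forecast likely audience response instead of only parsing content"),
    Stage("Vulnerability Assessment",
          "high-resolution psychosocial mapping from aggregated data",
          "segment audiences by susceptibility rather than broad demographics"),
    Stage("Counterpropaganda Development",
          "upstream narrative inoculation rather than reactive messaging",
          "pre-position framing before adversary content lands"),
    Stage("Dissemination",
          "visibility as terrain: attention and repetition over verification",
          "compete for algorithmic placement rather than column inches"),
]

if __name__ == "__main__":
    for stage in FIVE_TASK_CHAIN:
        print(f"{stage.classical_task} -> {stage.ai_augmented_form}")
```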

The paper argues that these dynamics drive a structural condition of perimeter–interior convergence: tools developed for foreign influence defense and information operations inevitably modulate domestic discourse due to shared platforms and channels. This produces a governance risk where resilience efforts can drift toward narrative control without explicit firewalls. To describe the strategic endpoint of mechanized propaganda, the paper introduces ontological warfare—competition over what is allowed to be believable—where coherence becomes more valuable than accuracy in saturated environments. In this context, influence campaigns increasingly target the “reality stack” (events → mediated perception → collective narrative), aiming to dominate the mid-layer of visibility and framing to determine what interpretations feel plausible.
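The “reality stack” mentioned above can be sketched the same way. The layer names are taken directly from the paragraph; the tuple structure and the single “contested” index are illustrative shorthand added here, not a model presented in the paper.

```python
# Minimal sketch of the "reality stack" described above. The middle layer
# (mediated perception) is the one the paper identifies as the primary target:
# dominating visibility and framing there determines which interpretations of
# events feel plausible at the narrative layer. Illustrative shorthand only.
REALITY_STACK = (
    "events",                # what actually happens
    "mediated perception",   # visibility and framing: the contested mid-layer
    "collective narrative",  # which interpretations come to feel plausible
)

CONTESTED_LAYER = REALITY_STACK[1]  # "mediated perception"
```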

The central doctrinal implication is the Coherence Imperative: defense leaders must treat cognitive terrain stability as an operational requirement, not a communications afterthought. The paper concludes with practical imperatives for practitioners across defense, policy, and technology: instrument doctrine for AI parity with explicit oversight constraints; treat visibility regimes and platform chokepoints as C4ISR-relevant terrain; adopt proactive inoculation models bounded by law and ethics; and prioritize coherence restoration as a resilience function to prevent mechanized fragmentation from becoming the default operating environment.

Abstract

In the digitized battlespace of 2025, propaganda transcends traditional messaging paradigms, manifesting as an autonomous, predictive apparatus engineered to precondition cognitive environments prior to kinetic engagement. Drawing on the prescient observations of Alfred McClung Lee and Elizabeth Briant Lee (1939), who characterized propaganda as “a new means for rendering a country defenseless before an invading army,” this analysis elucidates doctrinal imperatives for defense practitioners. The classical five tasks of propaganda analysis—Detection, Analysis, Vulnerability Assessment, Counterpropaganda Development, and Dissemination—persist as foundational constructs, yet their operationalization has been irrevocably altered by artificial intelligence (AI), algorithmic architectures, and large-scale data ecosystems. This paper interrogates these evolutions, foregrounding the convergence of foreign influence operations with domestic narrative governance, and posits ontological warfare—the contest over what is allowed to be believable—as the emergent paradigm. Ethical ramifications, including the algorithmic amplification of the spiral of silence, demand immediate integration into military planning. Recommendations emphasize adaptive doctrine to safeguard cognitive sovereignty.

About The Author

  • SWJ Staff searches the internet daily for articles and posts that we think are of great interest to our readers.

