Small Wars Journal

Information Operations as Force Protection

Fri, 06/29/2018 - 12:20am

Jaim Coddington and Casey Lamar

In the 20th century, the industrial revolution allowed states to leverage existing technology for brutally effective new tactics on the battlefield. Rapid innovations in rifled artillery, mechanized fighting vehicles, and chemical warfare caught commanders of the day woefully unprepared and led to horrific casualties and collateral damage. Today, actors like Russia and the Islamic State occupy this disturbing, pioneering role in the realm of information warfare (IW), leveraging the reach of global telecommunications to influence target audiences and support kinetic operations on the battlefield. Some observers contend that the West should co-opt these new methods to avoid falling further behind in the information domain.  The combined threat of adversary information capability and the growing academic discussion around information warfare puts US doctrine at a crossroads between adopting the methods of its adversaries or establishing different methods to dominate the information domain. While much of the defense community’s planning, wargaming, and emerging doctrine focuses on counter-propaganda and targeting adversary IW capabilities, this reactive strategy may have neutral or negative results.

A proactive strategy would be better: one that combines IW force protection and IW network engagement into an umbrella program this paper calls IW buffering. Rather than meet IW threats head on, an IW buffering strategy would mitigate the effects of threat IW while undermining the sustainability of threat IW capabilities. To illustrate the benefits of an IW buffering approach, this paper will analyze the structure and effects of recent ISIS and Russian IW campaigns and suggest how similar threats could be proactively mitigated in the future.

Case Study: Ukraine

Russian IW in Ukraine after Euromaidan was exceptionally effective. The combination of propaganda and misinformation stoked separatism and disrupted a coordinated response to Russia’s invasion of Crimea during the first stages of the civil war. At first, policymakers and analysts seemed blindsided by the sophistication and scope of Russia’s IW campaign, and since 2014 considerable research has been devoted to Russia’s new IW capabilities. Much of this research singles out the disruptive roles of Russian state media or troll factories, but much less accounts for the sociopolitical context which made Russian IW so successful in Ukraine. The true genesis of Russian IW in Ukraine consists of long-term, grassroots ideological shaping; the gradual penetration of Russia’s vertically integrated state media among receptive Ukrainian audiences; and most importantly, a coordinated trolling effort which caused provocative rumors and misinformation to go viral in Ukrainian social media communities.

Russia’s long-term ideological narrative is the common thread uniting its information operations and its Ukraine policy. The concept of Eurasianism positions Russia as the direct successor to the Soviet Union and patron of smaller Soviet descendants like Ukraine. Eurasianist philosophy views Ukraine as a branch off the Russian tree, a quasi-state with questionable territorial integrity. Like the Monroe Doctrine, Eurasianism also gives Russia the justification to reassert itself in its periphery and intervene when necessary to protect its Russian-speaking “compatriots abroad”.  This supposed need to protect a threatened Russian minority formed the main pretext for all Russian intervention in Ukraine. The grassroots spread of Eurasianism in Russian-speaking populations in Ukraine, a key factor in the effectiveness of Russian media’s follow-on messaging after Euromaidan, was a years-long process which originated in the writing of Alexander Dugin in the 1990s. The Eurasianist-inspired separatist movement in Ukraine’s eastern regions, unlike the Euromaidan protests, did not spring up overnight – it was deliberately cultivated for years.

The same is true for the Russian state media apparatus. The development of a vertically integrated, state-directed national media can also be traced to Putin’s 2000 Information Security Doctrine, which places special emphasis on domestic information superiority. Later documents like the 2008 Foreign Policy Concept state Russia’s clear intent to “develop its own effective means of information influence on public opinion abroad.” As in the Soviet era, Moscow views the information environment as a contested space where national media should serve the interests of the state. In Russia, it does. The vast majority of the population gets its information from a state media constellation which is owned, manipulated, or subsidized by the government. This infrastructure includes traditional news agencies like TASS and RIA Novosti, fake news provocateurs such as RT and Sputnik International, and a number of state-sponsored TV channels whose distinctive flavors and quantity offer a semblance of plurality. The Kremlin even tolerates the survival of dissident media outlets like Dozhd and Lenta, allowing the political opposition to “let off steam” without significantly influencing public opinion.

However, while Russian state media also had a significant presence in Ukraine before Euromaidan, the evidence falls short of supporting the widespread view that Russian media alone significantly influenced the behavior and attitudes of the average Ukrainian. In fact, the percentage of Ukrainians who trusted Russian mass media fell from 45.7% in March 2013 to just 12.7% in April 2014: the Ukrainian population’s natural response to the proliferation of falsehoods in Russian media was to stop trusting it. In 2013, television was the dominant source of news and information in Ukraine, with 96.8% of Ukrainians watching at least once per week. At the time, Rossiya 1 was the most popular Russian TV channel in Ukraine, but it ranked only 12th in overall popularity, with a 1.47% share of Ukraine’s national audience; the top-ranking Ukrainian channel, Inter, held a 14.11% share. The common perception that the average Ukrainian was brainwashed by pro-Russian propaganda in the early stages of the civil war is not supported by hard facts. While Russian media likely had a direct cognitive impact on Ukrainian populations with an existing pro-Russian bias, statistics show that it proved far more useful in swaying Russian public opinion in favor of Putin’s aggressive policies in Crimea and the Donbass.

Russian state media may have given Russian military forces and their proxies a temporary tactical advantage at the beginning of the conflict by clouding the information environment and confusing Ukrainian audiences. The fog of uncertainty created by conflicting Russian and Western narratives about events on the ground made it more difficult for Ukrainians to form opinions about the crisis and take decisive actions like joining pro-Western protests and civil disobedience, volunteering for military service, voting for pro-Western parties, and helping Ukrainian government authorities track separatist activity. Trolling tactics on social media appear to have significantly amplified the effectiveness of traditional Russian media in spreading doubt and uncertainty. Known troll factories (also called “troll farms”) like the St. Petersburg-based Internet Research Agency have been operating in Russia since early 2013. These entities employ hundreds of professional “trolls” to infiltrate target audiences on social media and other online forums. Trolling activities include spreading provocative rumors, undermining the voices and rhetoric of dissident users, and attacking the credibility of perceived anti-Russian news. Social media-based disinformation may have had a greater impact because Ukrainians tend to trust news from their immediate social circles – peers, friends, and family – much more than news from other sources. Therefore, even if most Ukrainians quickly recognized news from RT or Sputnik as garbage, the same information shared on social media without direct attribution to the Russian state gained credibility in the eyes of the audience.

Taken as a whole, Russian information operations in Ukraine have been very effective among certain populations and in finite circumstances. But the Russian campaign to influence Ukraine and bring it into Moscow’s orbit has been a strategic, long-term effort. From an information warfare perspective, it may be useful to think of this effort in three phases: first, the development of Eurasianism as an ideology and its popularization in Russia and Ukraine throughout the late 1990s and early 2000s; second, Russia’s consolidation of its national media under state control and its growing media presence in Ukraine from approximately 2004 through 2013; third, Russia’s development and implementation of troll factories and other clandestine IW tools from 2013 through the present day. This long-term campaign faced very little direct resistance from the West until 2013, when the European Union finally attempted to bring Ukraine closer to the European community. Russia had decades to establish its foothold in Ukraine’s cognitive sphere.

It is unlikely that Russia could replicate this kind of campaign anywhere in the world in the near future. The collective memory of the Kremlin’s falsehoods and aggression is still strong, and the ongoing appeal of Eurasianism in the former Soviet Union is not. Recent efforts like the IREX literacy campaign in Ukraine further inhibit the spread of disinformation. This program aims to train local journalists on reporting methods, ethical standards, and objectivity to promote fact-based news. It also helps educate citizens on information consumption techniques like fact-checking, seeking multiple sources, maintaining skepticism, and critical analysis of arguments. These new tools are like body armor and barbed wire for the cognitive sphere: they are force protection from IW threats. IREX’s overall campaign goes one step beyond force protection with IW buffering: it builds resilience against disinformation in a vulnerable population and also engages that population as a network, particularly through the training of journalists who serve as information arbiters in that network. When applied, this kind of deliberate, proactive IW buffering could cheaply and sustainably protect US interests where unprotected populations are targeted by adversary IO.

Case Study: ISIS

The United States (US) military and its coalition partners proved unable to counter or mitigate the information capabilities of the Islamic State in Iraq and Syria (ISIS) during its 2014 expansion into eastern Syria and western Iraq. Such shortcomings should force US military planners to evaluate and adapt doctrine and training. Security academia and Congressional hearings currently focus on understanding and countering ISIS propaganda with expanded Military Information Support Operations (MISO) coordinated with both state and non-state partners. This focus on MISO and counter-propaganda is a mistake if it is not linked to a stronger information warfare buffering system.

The power of the ISIS IW network lay in its combination of an advocate network that created tailored propaganda and the integration of that propaganda with military operations, making its message absorbing, comprehensive, and easily accessible to supporters. While ISIS vindicated, consolidated, informed, and encouraged its supporters, it simultaneously intimidated, agitated, confused, and polarized its opponents.

ISIS achieved these effects through several lines of effort. First, the ISIS IW network used propaganda in digital echo chambers to consolidate support, garner recruits, and develop an enterprise of jihadist activism. ISIS posted its content to forums and encouraged others to repost and modify it. This tactic magnified the reach of ISIS through crowdsourcing of the entire digital jihadist community. Second, the ISIS IW network gained international relevance through sensationalization of its violent conduct. One of the most notable examples was the video of Jordanian Air Force pilot First Lieutenant Al-Kasasbeh being burned alive in a cage. In a third line of effort, ISIS polarized its local opponents through targeted media that played on historic divisions and cultural biases. In particular, ISIS focused on the Shia-Sunni divide in Iraq and emphasized the need for Sunnis to unite. In combination, these lines of effort allowed the ISIS IW network to enable military expansion, promote consolidation of gains, and expand the political impact of military gains.

The ISIS IW system needed to be countered as a network. A network should be engaged on three levels: the structure of the network, the functions of the network and the sustainability of the network. In 2014, the ISIS IW network was structured in concentric circles. Content and strategy would be generated by core members. Beyond the core members, the ISIS IW network was an enterprise network of self-appointed activists and self-selecting recruits who would become a diaspora of advocates, content developers, content disseminators, recruiters and fighters. Those individuals at the center of the concentric circles often uploaded their content to digital forums where the content would take on a life of its own in the hands of self-appointed activists. Attempting to fracture this network would be relatively ineffective because it is a decentralized enterprise of anonymous users. Therefore, if the network can’t be fractured through targeting, it is vital to develop a strategy that looks to mitigate the immediate effects of IW functions while undermining the sustainability of the IW network.

The function of the 2014 ISIS IW network was focused on enabling expansion, consolidating gains and maintaining relevance as the preeminent jihadist organization. ISIS employed deception, intimidation and an enterprise of propaganda expansion to prepare the battle space for military expansion. A commonly cited example was the staged “defections” in Deir ez Zour Province, where ISIS infiltrated rival rebel groups to reinforce widespread rumors of ISIS defections. Such tactics were combined with parading the military capacity of ISIS prior to offensives. The combination led to the fracturing of the rival jihadist groups in the Deir ez Zour Province.

To consolidate gains, the ISIS IW network used a combination of intimidation and self-promotion. ISIS used public displays of force and brutal administration of justice to quickly intimidate populations under their control. This technique is documented in the testimonies of residents of Mosul who lived in fear of execution from roaming ISIS police squads. ISIS spun their behavior as a just application of Islamic Law. ISIS promoted their mercy, devoutness and strength on digital forums that became echo chambers of vindication that drew in new recruits and an enclave of digital advocates.

The growth of the ISIS support network was predicated on its need to maintain relevance. To maintain relevance, ISIS needed consistent military expansion and direct conflict with the US and its allies. This goal was fundamental to its survival because ISIS needed to outshine other jihadist organizations competing for both recruits and resources. Once ISIS caught the attention of the international media and drew the US into direct conflict, it made itself the center of anti-Western jihadism.

A concept of information buffering would first look to mitigate the effects of such functions on friendly and neutral networks. This begins with identifying the conditions or reactions that would be favorable to the opponent. The next step is to identify what critical factors in a network would have to shift in order to favor the threat network. These critical factors are associated with nodes in the friendly or neutral network that must be hardened against influence and disinformation. In the case of ISIS, force protection could follow “Salam Shabab” as a model of success. This program focused on mending divisions in Iraq by uniting youth on a reality TV show. The show followed a peacebuilding and education curriculum that eventually spawned an online community of peace advocates. This community was an active counterbalance to ISIS’s attempts to play on the Sunni-Shia divide. While the community was not decisive, it does demonstrate a model for developing “communities of trust” that could serve as an initial line of defense against threat IW networks. Such lines of defense could allow a counter-threat information campaign to move beyond damage control and focus on undermining the sustainability of the threat IW network.
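The node-hardening step described above implies some way of ranking which members of a friendly or neutral network matter most. As a minimal sketch, assuming a hypothetical edge list of who shares news with whom (every name below is invented for illustration and appears nowhere in the case studies), simple degree centrality is one crude proxy for identifying the information arbiters to prioritize:

```python
from collections import defaultdict

# Hypothetical, illustrative edge list for a friendly information network:
# an undirected "who shares news with whom" graph. These nodes and ties
# are invented; they exist only to demonstrate the method.
edges = [
    ("journalist_a", "community_1"), ("journalist_a", "community_2"),
    ("journalist_a", "elder_b"),     ("elder_b", "community_3"),
    ("community_1", "community_2"),  ("teacher_c", "community_3"),
]

def degree_centrality(edge_list):
    """Count distinct neighbors per node; higher degree means more reach."""
    neighbors = defaultdict(set)
    for u, v in edge_list:
        neighbors[u].add(v)
        neighbors[v].add(u)
    return {node: len(adj) for node, adj in neighbors.items()}

# Rank nodes so force-protection efforts (media-literacy training,
# journalist mentoring) can be prioritized toward the best-connected ones.
ranked = sorted(degree_centrality(edges).items(), key=lambda kv: -kv[1])
```

Real network engagement would use richer measures (betweenness, trust-weighted ties) and data that is rarely this clean; the point is only that “critical nodes” can be made operational and ranked rather than identified by intuition alone.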

The ISIS IW network drew sustainability from the combination of its identity and network of supporters. ISIS’s core identity was to establish a Salafist utopian caliphate. The ISIS IW network functioned as the herald of this utopia. The enterprise of ISIS advocates took on this role and created a positive feedback loop where expanded support and relevance begot more support and relevance. This expansion was compounded by military expansion and consolidation around a tangible caliphate. ISIS became a symbol for a promised future where God’s kingdom on Earth triumphed over Islam’s perceived enemies.

Initially, the US and its allies attempted to undermine the credibility of this narrative by illustrating the hypocrisy and brutality of ISIS through a direct counter-messaging campaign. For example, the State Department’s “Think Again Turn Away” campaign entered open Twitter fights with ISIS advocates over which side was truly justified. The campaign gave relevance to otherwise fringe social media users, a mistake that played directly into the information strategy of ISIS: the group maintained its relevance by displaying its anti-establishment position on the global stage and sustained itself through juxtaposition against the United States. A better method of undermining the sustainability of ISIS could be Google’s “Redirect” campaign, which uses search heuristics to lead those who search for jihadist content to firsthand accounts that undermine ISIS’s Salafist utopian identity. The “Redirect” method recognized that the messenger is just as important as the message, and while it may be too early to tell if the campaign is successful, it offers a new opportunity for success where direct counter-propaganda failed.
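The public description of this approach, matching searches for extremist content and surfacing counter-narrative material rather than blocking the search, can be sketched as a simple keyword heuristic. The indicator terms and the counter-content identifier below are invented for illustration; they are not the actual terms or playlists used by the real campaign:

```python
# Minimal sketch of a "redirect"-style intervention. The indicator terms
# and the counter-content identifier are hypothetical placeholders, not
# the actual terms or playlists used by the real Redirect campaign.
INDICATOR_TERMS = {"caliphate recruitment", "join the fighters"}  # hypothetical
COUNTER_CONTENT = "playlist://defector-testimonies"               # hypothetical

def redirect(query):
    """Return counter-narrative content for matching queries, else None."""
    q = query.lower()
    if any(term in q for term in INDICATOR_TERMS):
        return COUNTER_CONTENT
    return None  # ordinary queries pass through untouched
```

The design point is that the searcher is never censored: the query still resolves normally, but the first material offered comes from a more credible messenger.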

The 2014 ISIS IW breakout teaches the importance of IW buffering as a mitigating tool to buy time to identify, assess, isolate, and undermine IW threats. The US and its partners against ISIS needed to build up lines of defense against disinformation and influence through efforts like “Salam Shabab,” hardening key nodes with “communities of trust.” These measures would have bought time to understand what sustained the threat IW network. Once the sustaining factors were identified, they could have been undermined clandestinely through methods like the “Redirect” campaign, which used Boolean logic heuristics to feed content undermining the credibility of ISIS’s Salafist utopia.


The common Western solution in Ukraine, the Middle East, and countless other IW battlegrounds was to try to prevent disinformation from reaching target audiences. This is a short-term measure at best; at worst, it amounts to systematic censorship and a betrayal of public trust. In a hyper-connected information environment, trying to intercept or block all disinformation is like building a dam in a monsoon. Instead, Western policymakers should treat IW as a force protection and network engagement issue.

IW buffering combines in-depth IW force protection, which mitigates the effects of threat IW and undermines the sustainability of the IW effort, with a network engagement plan that mitigates the effects and undermines the sustainability of the threat IW network relative to neutral or friendly networks. Because our adversaries’ information operations are continuous and persistent, IW buffering must be sustainable and long-term. We must identify and foster initiatives to make IW target populations, including US citizens, more resistant to faulty logic, conspiracy theories, uncritical thought, groupthink, and cognitive dissonance. This proactive approach will make it more difficult for adversaries to gain a foothold in the cognitive sphere.

The opinions expressed in this article are the authors’ and not necessarily those of the U.S. Department of Defense or U.S. Marine Corps.

Categories: cyber warfare

About the Author(s)

Casey A. LaMar is a Marine Corps Ground Intelligence Officer. His research interests include information warfare, human network analysis, decision theory, and history of US foreign policy.

Jaim H. Coddington is a Marine Corps Air Intelligence Officer and Weapons and Tactics Instructor. His research interests include information and cyber warfare, global water policy, and all things Russia.


