Small Wars Journal

Servicemembers Must Navigate the Sea of Disinformation

Sun, 05/29/2022 - 11:19am

By Lieutenant Nicholas H. Kacavas

What do malign foreign influence campaigns and sea turtle barnacles have in common? Just as some barnacles exploit the wounds of sea turtles and threaten their survival, disinformation campaigns exploit societal divisions, or 'wounds,' to advance strategic interests.

Disinformation campaigns psychologically target civilian and military audiences to produce disruptive effects such as subversion and incitement to violence. From a national security standpoint, this is disconcerting, especially since nearly one in five of the January 6th Capitol rioters had a military background, roughly three times the rate at which servicemembers and veterans are represented in American society. In that sense, parasitic barnacles aptly symbolize the destructive nature of disinformation campaigns that specifically target United States servicemembers and veterans. While much media attention has focused on Chinese and Russian state-sponsored disinformation campaigns targeting United States military members, Iran is also expanding its influence.

Disinformation Campaigns Targeting Servicemembers

In 2021, Iranian state actors tracked by Facebook under the name Tortoiseshell carried out a series of attacks against members of the Department of Defense, using “sophisticated, fake online personas…in order to infect their computers with malware and extract information.” These attacks were attributed to Iran’s Islamic Revolutionary Guard Corps because the malware was developed by Mahak Rayan Afraz, a Tehran-based company affiliated with the Corps. Overall, American officials have been more preoccupied with Russian President Vladimir Putin’s interference in U.S. elections than with Iran’s deliberate attacks against the United States military.

Legislative Action

In response to Russian interference in the 2016 U.S. presidential election, Senator Amy Klobuchar sponsored a bill called the Honest Ads Act. The bill was designed to extend the advertising regulations already imposed on television and radio to social media. Under the bill, paid internet and digital communications qualify as public communication and are thus subject to the same regulatory requirements.

Additionally, the California state legislature introduced Senate Bill 830, which requires the California Department of Education “to make available to school districts on its Internet Web site a list of resources and instructional materials on media literacy, as defined, including media literacy professional development programs for teachers.” The decision to pursue this legislation was informed by a Stanford University study that found teenage students struggled to differentiate sponsored content from legitimate news stories. While misinformation lacks the malicious intent of disinformation, the larger controversy surrounds the idea of any entity moderating content on behalf of information consumers.

Struggling Against the Rising Tide

Disinformation is hard to isolate and control because of its design and lifecycle: it typically begins with a source that mixes in kernels of truth, circulates through a medium, and is then delivered to the recipient audience.

The United States is not alone, however, in struggling against a rising tide of disinformation. The European Union is also exploring countermeasures. The European Commission has produced a series of initiatives, such as the Code of Practice on Disinformation, which established a universal set of regulatory standards. The Commission has also invested time and resources in the European Digital Media Observatory and the European Democracy Action Plan, which outlines the “obligations and accountability of online platforms in the fight against disinformation.”

In 2021, the U.S. Congressional Research Service published a report on Content Moderation Issues for Congress, which ultimately concluded that no legislative action should be taken. Stakeholder analysis shows that proposals to amend Section 230 of the Communications Decency Act raise persistent concerns about censorship and free speech. Moderating this emerging problem will require a delicate balance between avoiding censorship and distinguishing misinformation from disinformation.

Looking ahead, Elon Musk’s likely purchase of Twitter and his evolving vision for the platform’s terms of service could have reverberating effects on the digital ecosystem of disinformation targeting servicemembers. Because information exists in an ecosystem, much like that of the sea turtles and the parasitic barnacles, it is important for servicemembers to remain vigilant and guard against malign foreign influence and disinformation campaigns.


The views expressed are those of the author and do not necessarily reflect the official policy or position of the Department of Defense, U.S. Army, West Point, or the U.S. government.

About the Author(s)

Nicholas H. Kacavas is a Second Lieutenant in the United States Army. He earned a BA in American politics and French from the United States Military Academy at West Point.