A Bodyguard of Lies

As Churchill drolly observed, in wartime the truth is so precious that it should always be protected by a bodyguard of lies.
In the summer of 1943, in a nod to this maxim, Operation Bodyguard was initiated to deceive German intelligence into believing a false narrative for the Allied invasion of Northwest Europe. Although historians disagree about the impact of the deception operations conducted under the Bodyguard umbrella, it is worth considering how the United States might employ similar techniques in a future large-scale combat operations (LSCO) environment.
Ideally, deception operations should be constructed around central narratives so that, through their connective tissue, certain falsehoods can gain the sheen of truth by way of repetition. In Bodyguard's case, this framework ensured that even as some ruses were uncovered or discounted by the enemy, a confirmation bias was inculcated in German intelligence and leadership that would degrade their response to the actual Operation Overlord plan. The core fiction of a July invasion at Pas-de-Calais was still being successfully sold to the Germans more than a month after the landings at Normandy.
If the US is to leverage deception operations in a future LSCO environment, it will need to create similarly unified narratives. Successful deception requires not only the ability to mislead the enemy but also the ability to direct what he believes and, as far as possible, how he reacts to it. Giving the enemy multiple competing narratives risks having him latch onto the wrong one. Additionally, per the Central Intelligence Agency's 'Deception Maxims', deception operations need to be postured as either ambiguity-reducing or ambiguity-enhancing, and it is important to recognize the difference between the two. Ambiguity-reducing deceptions create certainty about the reliability and accuracy of a belief, encouraging the enemy to confidently follow the wrong course of action. Conversely, ambiguity-enhancing deceptions sow doubt in the mind of the enemy, undermining his trust in real information or inducing inaction through uncertainty, timidity, or the need to expend time or other resources to gain better clarity.
As the US came out of World War II, there were under 30 million telephones in the country, all of them landlines. Today there are more than ten times that number, with the majority carried on the person and capable of transmitting audio, video, and vast quantities of data across the globe in seconds. A feature of this new global information environment is the colossal amount of information available, often with little to no delay or filtering. As anyone who has completed their annual Operations Security (OPSEC) training can attest, the risk this presents is significant. From another perspective, however, this information spillage can also be leveraged in the service of deception operations.
In 1943, the British Fourth Army, headquartered at Edinburgh Castle and comprising II Corps and VII Corps, was tasked with the invasion of Norway (Operation Tindall); it would be tasked with the same mission again in 1944 (Operation Fortitude). Although the British Fourth Army had a celebrated history in the Great War, its combat role in World War II would be greatly reduced, owing to its existing in a purely notional sense as part of Bodyguard. Beyond a small staff, comprised primarily of radio operators who created a credible amount of radio traffic and members of the Twenty Committee (a sly nod to the Roman numeral XX, or 'double cross') who ran the British double agents, Fourth Army never existed. As vital as the false reports and radio chatter were, equally important to maintaining the deception were the social media messages of the day. Local newspapers carried wedding banns for soldiers and their sweethearts, results of soccer games between teams formed by the soldiers were reported over the radio, and local events supported by army units were duly covered, all producing evidence of the Army's existence that could be sent back to the Germans to pore over and from which to draw conclusions about troop locations and sizes. Although Operation Tindall was a failure (Germany correctly assessed that the Allies lacked the capability for the invasion), the existence of the Fourth Army was accepted as real, leading to German intelligence's assessment that the Allies had significantly more divisions in the British Isles than was the case. This may have played into Germany's decision, in response to Operation Fortitude, to station 13 divisions in Norway, greatly impacting its ability to provide sufficient coverage throughout France and the Benelux countries and limiting its ability to counterattack the Operation Overlord landings.
Ironically, a modern equivalent of this ploy has already found its place in the current conflict in Ukraine. In early 2023, eight decades after the daily lives of Britain's Fourth Army soldiers had been shared through the media of the day, the Ukrainian Army's 88th Mechanized and 13th Jager Brigades were posting photos on Facebook of their brave soldiers and the equipment with which they would push back the Russian invaders. Suspicious of their unconventional numbering, journalists reached out to the Ukrainian Army's General Staff, who quickly discounted the existence of these units. Theories posited for the basis of the Facebook pages included psychological operations, financial scams, or lower-echelon elements unofficially embracing self-appointed names and emblems for their own reasons. Whatever the case, the low barrier to creating such notional elements, and photographic evidence of their existence, underlines the potential for building ghost forces in the cyber domain that create friction for enemy intelligence gatherers, who must then direct real-world assets to confirm or deny each element's existence and hold combat forces in reserve until it is disproven. A carefully constructed deception operation that embedded agents as Public Affairs Officers with real-world friendly forces could, with suitable patches and vehicle-marking overlays, create 'evidence' of larger forces than are actually employed in a given theater without the need for artificial intelligence (AI) or photo manipulation. With location spoofing and suitable background selection, these ghost forces could potentially be positioned anywhere in the world. Additionally, real-world forces could be stood up using the insignia of these ghost units once the enemy is known to have dismissed them as fake, allowing confirmation bias to diminish the enemy's trust in any evidence spillage that might give them away.
On the home front, assuming our enemies in future wars will monitor online media for any potential information sources, the US must be alert to the need to prepare the information battlefield well before hostilities occur. In today's world, we should anticipate that, much like the British Fourth Army's wedding banns and soccer results, the minutiae will be used to extrapolate potential intelligence. A social media feed purporting to be that of a military spouse or child will be given far less credence if it only started posting shortly before war is declared (and will face a harder time getting noticed by potential enemy agents); ideally, it should have a long enough pedigree (with perhaps a different focus and only oblique references to military family life) that enemy agents will be aware of it, and following it, long before it is needed for deception operations. Creating such accounts with a view to sowing disinformation in a future war has the potential to be controversial. Already, a request by the Joint Special Operations Command (JSOC) for funding to create "convincing online personas for social media platforms, social network sites, and other online content" has caused concern, with one commentator asserting that such technology usage has "no legitimate use case besides deception". We are happy to agree with their assessment, even as we disagree with their opposition. Sowing the fields of social media with numerous feeds that will be of interest to our enemies now will provide a ripe crop of avenues to spread misinformation in times of hostility. Additionally, such a strategy will provide a measure of protection for the real accounts of family members that might be targeted. Ceding victory in deception to our adversaries to accommodate sensibilities is akin to former Secretary of State Henry Stimson declaring that "Gentlemen do not read each other's mail" and withdrawing the State Department's funding of the Cipher Bureau in 1929.
Open Source Intelligence (OSINT) as an intelligence discipline is new enough that it is still refining its guiding policies and building its capacity. It collects information in the public domain, such as social media posts or published media like the British Fourth Army's soccer scores, for use as intelligence. Its collection has both passive and active components, making it a valuable component of deception operations. It collaborates with cyber operations and is the modern equivalent of sending a spy behind enemy lines, though this spy is made of electrons and the cost of discovery is more likely to be embarrassment than the life of the spy. There are reasonable concerns with OSINT and active cyber operations, but their value as a force multiplier is worth enabling, particularly as adversary nations compete in that space.
The purpose of spies (whether made of flesh or of electrons) is to provide information that confirms or denies planning assumptions and, from there, informs the strategic leadership's decisions about possible courses of action. Such an undertaking has two critical components: positioning suitable agents (real or virtual) so that they can harvest the information, and determining the relevance and accuracy of the intelligence gathered.
One of the critical successes of British World War II espionage (at least as important as cracking the Enigma machine) was the early identification of agents sent by Germany to spy upon the nation. Recognizing that simply neutralizing these agents would only encourage the Germans to send more, and possibly more capable, agents, British intelligence instead elected, where possible, to turn the spies and have them operate as double agents in the service of the Crown.
Assuming our nation's enemies already have agents in place, there is value in identifying any that are capable of being employed for our own ends. This may involve inducing them to actively work as double agents for us, but it could also be as simple as feeding them accurate yet minimally useful intelligence through apparently poor OPSEC, in preparation for one day using them as part of a deception operation. In a reverse of 'catfishing', it may even be helpful to create a fictional disgruntled, careless, or compromisable service member, or a vulnerable family member, for them to befriend or entrap online. Such a contact should not be too easy to build a relationship with, and the information should never be too easily available. A perverse trait of human nature is to equate exertion with reward; the greater the cost in effort or resources that the enemy has had to expend in obtaining the false information, the greater their incentive to champion its veracity (to themselves and to their handlers) in order to justify their hard work.
In 1941, a critical agent for German intelligence in the UK was a mid-level Spanish official codenamed "Alaric". Alaric continually and spectacularly proved his worth, building a large network of valuable agents and informants and providing highly respected intelligence. Unbeknownst to German intelligence, however, "Alaric" was in truth one Juan Pujol Garcia, a fervent anti-fascist, and his network of spies was entirely fictional. Among the techniques Pujol and his handlers utilized was providing accurate troop information that arrived too late to be acted upon, alongside the false narrative that the 'real' invasion would be undertaken the following month. So successful was the ruse that subsequent studies of German intelligence reports found that they praised Pujol's work as highly important and accurate, contributing to the Germans holding over twenty divisions in reserve to counter this second invasion and providing the Allies with the time to solidify their hold on the beachhead at Normandy.
Pujol's concocted network of spies was a boon to the Allies' counterintelligence operations. Most obviously, it offered a conduit for feeding the false narrative to the enemy, one already highly trusted by their intelligence services, while also siphoning resources away from the Germans, who rewarded Alaric and his network with several hundred thousand dollars in payments and expense reimbursements. Just as critically, however, it filled key needs (multiple well-placed agents and robust intelligence gathering) that the Germans would otherwise have expended the necessary effort to create for themselves.
A stimulating exercise for anyone interested in deception operations is to consider how the US, absent a self-starter like Pujol, might set conditions for creating a similar fake spy network to take control of the enemy's intelligence-gathering operations in a future LSCO environment (or even during operations below the threshold of war), as a counter to the enemy creating their own. Perhaps if Aldrich Ames had expended his energy creating such a network (using it to feed false or minimally useful intelligence to his Russian handler), he might have been able to secure, through payments for all his agents, even greater remuneration in the service of his nation rather than in its betrayal.
Like nature, intelligence gathering abhors a vacuum. Had the British focused solely on preventing their German counterparts from obtaining valuable information, it might only have encouraged them to try harder. Instead, by deciding what information, both real and false, the double agents provided, and by weaving it into a greater narrative supported by planted information the Germans collected by other means, the British spymasters behind Operation Bodyguard were able to manipulate the German commanders' decision-making for the defense of the European mainland. This was particularly effective when the false information fed into what the German planners already believed. By spring 1944, German intelligence strongly believed that there were far larger forces stationed in the British Isles preparing for the invasion than was actually the case. Further, they were confident that the main invasion was planned to take place at Pas-de-Calais in July of that year. Any information that confirmed this belief was given greater credence than that which went against it.
In deception operations, this is referred to as Magruder's Principle, which posits that it is easier to use deception to reinforce pre-existing notions and beliefs than to change them. If we are to successfully employ this principle in future wars, there is value in identifying the broader narratives we want to have in place now, both to determine how we might feed pre-existing erroneous notions about the US and its capabilities and to set conditions, if needed, to create and nurture new ones.
Jones' Dilemma suggests that the more sources a deception target can access to confirm or deny the veracity of information, the harder it will be to deceive them. Turning this around, one can, through the creation or manipulation of multiple sources, create a web of falsehoods that reinforce each other in support of the desired narrative. With multiple avenues for feeding untruths to German intelligence, the British were able to create false narratives, such as the date and location of the invasion, that made any accurate information appear to be the deception. The mounting intelligence collected via different channels reinforced the credibility of the deception. As a result, and with conditions set through confirmation bias, even as German units stationed in Normandy provided accurate and trustworthy reports about the size and disposition of Allied forces during and after Overlord, it was still possible to sell the narrative that the landings were a feint.
This may seem to present a far greater challenge in the current information-rich environment. The British Fourth Army's Scottish base of operations was sufficiently remote that the Germans were unable to properly assess its nature, and so they relied upon the radio traffic, newspaper stories, and (false) reports from their compromised agents in the country. As a result, it was possible to craft a plausible fiction of multiple additional divisions standing ready to invade the European mainland. Further south, decoy and dummy units, such as fake tanks, provided visual evidence of Allied troop levels and locations that only needed to withstand the scrutiny of enemy aircraft. For modern US planners seeking to deceive the enemy, such a scenario seems a luxury; a decoy tank is far less effective if people are posting 'selfies' with it. Additionally, as an open and accountable democracy, the US cannot summon notional legions of steely-eyed killers ready to defend its shores to deter its enemies without also lying to its citizens. As referenced earlier, even the creation of false online social media accounts by the Pentagon as a means to monitor or counter enemy falsehoods raises concerns, so any deliberate release of false information in support of future deception operations would, if uncovered, cause an uproar.
This creates an interesting conundrum. Assuming that the optimal situation is to avoid wars with peer or near-peer threats, actively dissuading them from conducting any action that would risk our retaliation is desirable. To this end, clearly known and acknowledged overmatch is a strong deterrent. A false narrative that our combat power is significantly greater than it actually is would be both more effective and less expensive than the forces we might otherwise be able to field, but it would entail a far greater deception by the government or military than the US electorate would accept.
Lying is an integral part of the battle for hearts, minds, and morale in a time of conflict, armed or otherwise. As part of the British campaign to encourage the US to enter World War II, their intelligence services planted false stories in the American media, infiltrated pressure groups advocating against joining the war, and created a fake map of Hitler's plan to invade South America. The last of these they ensured landed on FDR's desk, leading him to denounce Germany's designs on the Americas and against the US, which in turn may have precipitated Germany's declaration of war. In essence, a foreign power used deception operations against the American people and government in their own homeland in order to pull the country into an armed conflict. If we do not hold it against 'Perfidious Albion' that they did this, should we not afford our own intelligence community the right to use similar means to set conditions for the nation to be best postured against peer threats now and in the future?
Given that any future large-scale combat operation is likely to be preceded by an extended period of competition below the threshold of armed conflict, the question of when the US can begin laying the foundations of its wartime deception narrative is critical. If we wait until war is declared or bullets are flying, we are probably too late. With the nation already on the periphery of existing conflicts around the globe, the US would be wise to set conditions to create or maintain advantages in the information dimension sooner rather than later. The bodyguards of lies need to start getting into position to protect America's precious truth.
(Disclaimer: The work presented here reflects the opinions of the authors and does not reflect an official position of the Department of Defense, the Department of the Army, or any other agency of the United States Government.)