Small Wars Journal

“The Only Problem with the Operations Planning Process is that We Don’t Use It!”: Why this Argument is Invalid

Tue, 05/28/2019 - 1:37pm

Aaron P. Jackson

Over the years I have heard, time and again, the argument that “the only problem with the operations planning process is that we don’t use it!” Often this argument is made at a staff college by some “greybeard” or other, although I have also heard it made by other less venerable staff and on occasion even by students of such institutions.[1] It is usually offered as a reason why no reform, change, or update is required to whichever operations planning process is under discussion. The label “operations planning process” may therefore be replaced with “Joint Planning Process”, “Military Decision Making Process”, “Joint Military Appreciation Process”, etc. The list goes on, and the exact process under discussion is irrelevant to this particular line of argument.[2] Yet this argument, while often made, is invalid. The simple intent of this article is to explain why this is the case.

The Burden of Proof

The first reason this argument is invalid is that it has never been proven. As a result, it is based on opinion rather than fact. At the time of writing, I have spent half a decade treating this assertion as a hypothesis and seeking to test it. When someone makes it, my first retort is generally: “how do you know that?” The almost invariable answer is “based on my experience…”. In other words, the assertion rests exclusively on the anecdotal observations of the person making it. This is not research! As the saying goes, the plural of anecdote is not data![3] Frankly, my anecdotal evidence is as good as theirs, and I’ve seen planning processes used sometimes and other times not.

This leads one to ask: what would it take to prove or disprove this argument? To paraphrase Clausewitz, the answer is simple, yet very difficult to achieve.[4] Simply put, it would involve measuring how often the operations planning process is used relative to the number of times it could (or should) be used but is not. Beyond that is where things get difficult.
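
In notional terms, such a measurement amounts to nothing more than a usage rate. The notation below is purely illustrative, and the difficulty lies entirely in defining and counting the denominator:

\[
r = \frac{n_{\text{used}}}{n_{\text{used}} + n_{\text{not used}}}
\]

where \(n_{\text{used}}\) is the number of planning occasions on which the process was applied and \(n_{\text{not used}}\) is the number of occasions on which it could (or should) have been applied but was not.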

First, there is the matter of what counts as using the process. Does it count if the planning process is used in a staff college exercise where students’ planning outcomes are assessed against the doctrine? Does it count if the planning process is used on an exercise where the “enemy” is another part of one’s own force that is “playing red”? Does it count for contingency planning that may never get beyond a theoretical exercise? Or, as those I’ve heard making this argument often imply, does it only count if one is (or is not) using the planning process to develop a plan for an “actual” operation that is then executed in the “real” world? How one determines which circumstances do and do not count when seeking to verify this argument would no doubt lead to very different findings.

Assuming, as is often implied and occasionally stated, that only the last of the above criteria counts brings us to the second difficulty in trying to prove the argument. That is, planning for actual operations usually occurs in operational or tactical headquarters, which almost invariably have restricted access. The plans themselves are usually classified. As a result, it becomes either very difficult or, more often, outright impossible for a researcher to access enough headquarters to conduct sufficient research to either prove or disprove this argument. And even if such access could be gained, the research findings would most likely be classified and therefore could not be distributed widely enough to have any kind of noticeable impact.

As an alternative to accessing headquarters, researchers could instead interview or survey personnel who have served on headquarters planning staffs, asking them how often they had opportunities to plan and how frequently they did or did not use the operations planning process to do so. While more feasible than direct data collection inside different headquarters, this option is problematic because there would be no way to verify interviewees’ statements without encountering the classification problem mentioned earlier. Such a research project could therefore provide, at best, an indication of the argument’s likely validity.

This limitation notwithstanding, there remains another practical problem: time and expense. That is, the author has yet to find anyone interested enough in the outcomes of such a research project to fund it.[5]

Once all of these factors are taken together, it can be deduced that it is unlikely that any data will be available to prove or disprove this argument any time soon. Until there is proof in support of it, however, the benefit of the doubt ought to fall in the negative. In other words, until the argument can be proven by substantive research, it ought not to be couched as the authoritative statement it is so often purported to be.

The Problem of Logic

The second reason why the argument that “the only problem with the operations planning process is that we don’t use it” is invalid rests not on whether or not it has been proven, but on the logic underlying it. Central to this reason is the answer to the question, why is this argument made at all? Generally, it is made to defend the operations planning process (or whichever equivalent process is under discussion) against criticisms that it is either suboptimal or outright unsuitable to achieving the purpose for which it was designed. Such criticisms are often accompanied by proposals about how the planning process ought to be improved, or more rarely by proposed alternatives to the process itself.[6] Making the argument that “the only problem with the operations planning process is that we don’t use it” allows these criticisms to be summarily dismissed as both misguided and irrelevant.

Let us ignore for a minute that the argument is unproven and instead assume that it is correct. If “the only problem with the operations planning process is that we don’t use it”, then one must ask why it is not being used. The answer must be that it is not fit for purpose – that is, the operations planning process would be used to conduct actual planning for operations in the real world if it were suitable for this purpose. That it is not used must be because it is not suitable for use in this way, or in this circumstance. One could offer some hypotheses as to why this might be so: for example, the process may be too lengthy to be used within the time available to conduct such actual planning; or the process may be too complicated, causing military planners to find it inaccessible; or military planners may simply be too lazy to bother using it. Although there is some evidence to indicate the validity of the first two of these three hypotheses,[7] they nevertheless require further investigation before they can be substantively proven or disproven.

Regardless of the accuracy of these hypotheses, for the purposes of this article what is important is that if the operations planning process is not being used, then this is due to some problem or another with the process itself. Ergo, the argument that “the only problem with the operations planning process is that we don’t use it” cannot be valid. If this were the only problem, then there would be no problem and the process would be used. But if the process is not used, then there must be a reason why it is not used, which means that there is another problem besides it not being used. One cannot help but feel that Joseph Heller would be proud of such an argument, and of those who make it.[8] In summary, this argument is subject to the circular reasoning fallacy as well as to causal reductionism, and must be discounted due to these logical errors.[9]
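
For those who prefer to see the structure of this reductio laid bare, it can be sketched in propositional shorthand. Let \(U\) stand for “the process is used for actual planning” and \(D\) for “the process has some defect other than non-use”; the symbols are illustrative only, not drawn from any doctrine:

\[
\text{Claim: } \neg U \wedge \neg D; \qquad \text{implicit premise: } \neg D \rightarrow U; \qquad \text{from } \neg U \text{ and } \neg D \rightarrow U \text{, modus tollens gives } D \text{, contradicting } \neg D.
\]

The claim can therefore be true only if its own implicit premise, namely that a suitable process would in fact be used, is false.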

Let us now play semantics. I am sure some would respond to my identification of this logical error by asserting that, “this is not the only problem. It is merely one problem, and therefore this article is wrong because the logical flaw it identifies does not apply”. My answer to those who may be thinking this is that, if this is true, then the argument admits the possibility of other problems with the operations planning process and therefore ought not to be used as a means to stifle or silence criticism of this process. Whether or not one uses the word “only”, if this argument is made in such a manner that it stops others from questioning the operations planning process or from highlighting its possible flaws, then the effect is the same. The precise wording of the argument becomes unimportant. Its use as a means to stop military practitioners from critically thinking about something that is at the core of their profession is what matters. If the argument results in this, then it has implied that a lack of use is the only problem with the operations planning process regardless of whether or not the word “only” has been explicitly stated.

Conclusion

This article prompts one to raise yet another question: why have the flaws in the argument that “the only problem with the operations planning process is that we don’t use it” been missed by those who make it? The simplest possibility is that they are focused exclusively on their own experiences, and merely have not taken the time to critically reflect on their own beliefs.

One suspects another possibility is more likely, however, which relates to the broader nature of military training and culture. An operations planning process is a tool, used to make sense of and to attempt to shape the operational environment. This metaphor was made famous by Karl Weick, who used the example of firefighters who perished while fleeing an approaching wildfire because they did not drop the heavy tools that would have enabled them to run faster, as an allegory for organizational failure to adapt to changing circumstances.[10] The implication of this allegory is that knowing the limitations of one’s intellectual tools, and when to drop them in light of changing circumstances, is as important to organizational success as knowing how to use them.

Professional military education tends to focus on how to use the tools, not on why they are used or when it might be useful to drop them. It has been suggested elsewhere that this tendency is due to the same cultural aspects that also constitute the strengths of military organizations, namely hierarchy, discipline, order and reliability under stress. To reinforce these highly valued organizational traits, militaries focus on the use of the tools, not on why the tools are used. Euphemisms such as “the enemy has a vote” and “stay in your lane” have been identified as mechanisms that militaries use to discourage personnel from questioning their tools.[11]

In light of this situation, the argument that “the only problem with the operations planning process is that we don’t use it” can be seen as just one of many axioms used to discourage military practitioners from questioning the limitations of their tools. Yet the operations planning process, like any other tool, was designed with a purpose in mind and has its limitations and flaws as well as its uses and strengths.

Allowing the operations planning process, and why it is used, to be critically examined will at the very least result in those who use it developing a better understanding of it. Such critical examination may also lead to improvements in the process, or to its replacement by an innovative new tool that better achieves the same function. For military organizations that cherish their operations planning process as one of their favorite tools, this critical reflection can be a scary or threatening prospect. Yet it is worth confronting this fear because of the longer-term benefits that critical thinking about such cherished tools can yield. Already there has been some experimentation with alternative tools, with mixed but often positive results. Positive examples include outcomes of the application of various military design thinking methodologies, the details of which are summarized elsewhere.[12]

Regardless of the possibilities offered by other tools, those who seek to defend the operations planning process could do better than to rely on the argument that “the only problem with the operations planning process is that we don’t use it!” As this article has demonstrated, this argument is both unproven and logically flawed. It is time to stop making it.

This article has been written in the author’s capacity as Distinguished Visiting Professor at Canadian Forces College. The views expressed herein are exclusively his own and do not reflect those of any organization with which he is, or has previously been, affiliated.

End Notes


[1] “Greybeard” is defined in the Urban Dictionary as: “An older man, usually someone with seniority in a business, a labor union or other community. Often used disrespectfully by younger members when referring to preferential treatment or outdated behavior”. The term’s usage in professional military education is similar. Here, “greybeard” usually refers to a retired senior officer who is employed (often as a contractor for a course or component thereof) to impart to students the wisdom gained from their extensive, but often dated, personal experience. Attributed to PrimalChaos, ‘Grey Beard’ (2nd definition), Urban Dictionary, 14 August 2010. Online: https://urbandictionary.com/define.php?term=grey+beard, accessed 29 April 2019.

[2] For examples of such processes, see: Department of Defense, Joint Publication 5-0—Joint Planning (Washington DC: US Government Printing Office, 16 June 2017), chap. 5; US Army, Army Doctrine Reference Publication 5-0—The Operations Process (Washington DC: Headquarters, Department of the Army, May 2012); US Army, Army Tactics, Techniques and Procedures 5-0.1—Commander and Staff Officer Guide (Washington DC: Headquarters, Department of the Army, September 2011), chap. 4; Australian Defence Force, Australian Defence Force Publication 5.0.1—Joint Military Appreciation Process, 2nd ed. (Canberra, Defence Publishing Service, Amendment List 2, 2016); Strategic Joint Staff, Canadian Forces Joint Publication 5.0—The Canadian Forces Operational Planning Process (Ottawa: Department of National Defence, Change 2, 2008).

[3] I use this saying knowing fully that I am on shaky ground. A quick Google search reveals that the opposite saying, i.e. “the plural of anecdote is data”, is at least as prevalent and has been around for longer. Which of these two opposite sayings is correct depends entirely on what one considers to be legitimate research; this in turn often depends upon one’s field of study, and upon the paradigms or methodologies being employed. While it would therefore be accurate to say that “the plural of anecdote can sometimes be data”, in this case it is not because the anecdotal evidence offered during the conversations I have had comes from only one person’s unverified observations. As no particular method has been used to confirm the validity of these observations, they may be subject to factors such as confirmation bias, selective memory, etc. These factors are likely to rule out the sum total of these anecdotes as useful data, even in an exclusively empirical sense, unless such data is presented with several caveats regarding the extreme limits of its reliability. Ergo, even if one accepts that the plural of anecdote is data, these particular anecdotes still do not qualify as legitimate sources of data for the purposes of discussion in this article. For summaries of the history of both versions of this saying, see: Quote Investigator, ‘The Plural of Anecdote is Not Data’, Quoteinvestigator.com, 27 December 2017. Online: https://quoteinvestigator.com/2017/12/27/plural/, accessed 29 April 2019; Quote Investigator, ‘The Plural of Anecdote is Data’, Quoteinvestigator.com, 25 December 2017. Online: https://quoteinvestigator.com/2017/12/25/data/, accessed 29 April 2019.

[4] The original quote has been translated as: “Everything in war is very simple, but the simplest thing is difficult”. Carl von Clausewitz, On War, edited and translated by Michael Howard and Peter Paret (Princeton NJ: Princeton University Press, 1976), p. 119.

[5] If you are reading this article and you are interested in funding such a research project, please contact the author to discuss the matter further. I would be happy to undertake the project!

[6] For a broad-ranging critique that encompasses both of these aspects, see: Chris Paparone, The Sociology of Military Science: Prospects for Postinstitutional Military Design (New York: Bloomsbury, 2013), esp. pp. 90-97. Paparone construes planning as one “frame”, or lens, for viewing the world. His book reflects on the strengths and limitations of different frames commonly used by militaries to generate institutional understanding, before going on to propose new frames to enable new understandings of seemingly intractable military challenges.

[7] For example, Grant Martin discussed observations made during his time as an instructor on the US Special Forces Qualification Course. Observing 72 different planning teams, he found that those using the Military Decision Making Process spent more time completing procedural tasks related to the process and less time thinking about the problem than other teams using either Army Design Methodology or an unstructured approach. He also found that members of teams using the Military Decision Making Process were more likely to provide post-exercise feedback that they did not see what value several parts of the Process added to the planning outcome. Grant M. Martin, ‘Deniers of “The Truth”: Why an Agnostic Approach to Warfare is Key’, Military Review, January-February 2015, pp. 42-51. Online: https://usacac.army.mil/CAC2/MilitaryReview/Archives/English/MilitaryReview_20150228_art011.pdf, accessed 30 April 2019.

[8] This argument’s logic can be compared to many passages in Heller’s famous book Catch-22, which is full of examples of circular logic and self-reinforcing paradoxical or contradictory rules and situations. For example, the book’s titular “catch” is that: “Orr was crazy and could be grounded. All he had to do was ask; and as soon as he did, he would no longer be crazy and would have to fly more missions. Orr would be crazy to fly more missions and sane if he didn't, but if he was sane, he had to fly them. If he flew them, he was crazy and didn't have to; but if he didn't want to, he was sane and had to”. Joseph Heller, Catch-22: 50th Anniversary Edition (New York, Simon & Schuster, 2011), p. 46.

[9] For concise explanations of these fallacies, see: Bradley Dowden, ‘Circular Reasoning’, The Internet Encyclopedia of Philosophy, undated. Online: https://www.iep.utm.edu/fallacy/#CircularReasoning, accessed 13 May 2019; Robert Bennett, ‘Causal Reductionism’, Logically Fallacious [blog], undated but circa 2013. Online: https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies/64/Causal-Reductionism, accessed 13 May 2019.

[10] Karl E. Weick, ‘Drop your Tools: An Allegory for Organizational Studies’, Administrative Science Quarterly, Vol. 41, No. 2 (June 1996), pp. 301-313.

[11] Ben Zweibelson, ‘“The Enemy has a Vote” and Other Dangers in Military Sense-Making’, Military Operations, Vol. 2, No. 2 (Spring 2014), pp. 20-24. Online: https://www.tjomo.com/, accessed 30 April 2019 (login required).

[12] For example, see: Philippe Beaulieu-B & Paul T. Mitchell, ‘Challenge-Driven: Canadian Forces College’s Agnostic Approach to Design Thinking Education’, The Archipelago of Design: Researching Reflexive Military Practices [blog], 13 January 2019. Online: http://militaryepistemology.com/challenge-driven/, accessed 6 May 2019; the volume of essays contained in: Journal of Military and Strategic Studies, Special Issue: Reflexive Military Practitioners: Design Thinking and Beyond, Vol. 17, No. 4 (2017). Online: https://jmss.org/issue/view/4498, accessed 6 May 2019; Aaron P. Jackson, ‘Towards a Multi-paradigmatic Methodology for Military Planning: An Initial Toolkit’, The Archipelago of Design: Researching Reflexive Military Practices [blog], 4 March 2018. Online: http://militaryepistemology.com/multiparadigm2018/, accessed 6 May 2019; Ben Zweibelson, ‘Change Agents for the SOF Enterprise: Design Considerations for SOF Leadership Confronting Complex Environments’, Special Operations Journal, Vol. 3, No. 2 (2017), pp. 127-140.

 

About the Author(s)

Dr. Aaron P. Jackson is Academic Year 2018-19 Distinguished Visiting Professor at Canadian Forces College in Toronto, Ontario, where he instructs on the Advanced Joint Warfare Studies Course. A career Australian Department of Defence public servant, he is currently appointed Senior Researcher—Joint Planning and Design in Defence Science and Technology (DST) Group. His previous civilian appointments include Joint Operations Planning Specialist in DST and doctrine developer at the Australian Defence Force Joint Doctrine Centre. In addition to these civilian appointments, he has served in the Australian Army Reserve since 2002. He has deployed as a civilian on Operation Accordion (Middle East region) and as an Army officer on Operations Astute (Timor Leste) and Resolute (Australian border security).