The SciOps Conundrum: A Case Study on Applied Analytics
By Maj. Gen. Patrick B. Roberson, Maj. Stuart Gallagher, and Maj. Kurtis Gruters, PhD
The year is 2030. The troops stand in precise rank and file, ready for inspection. Their pristine metallic forms reflect the bright lights of the staging hangar where they await orders for their next mission. After days of patient waiting, the elite T-1000s receive their much-anticipated download. Green light: Return to 1995 – engage and eliminate John Connor.
Hollywood loves its military use of artificial intelligence (AI),[1] but reality may comfort movie buffs and doomsday seers alike: AI and other advanced analytics have far more mundane uses in routine military processes. From managing and shaping human talent to predictive maintenance of vehicles and equipment,[2] data and analytics remain pragmatic and practical. That pragmatism is due in part to what is realistically possible from the scientific perspective, and in part to what is actually useful from the operational perspective. Fully autonomous hunter-killer robots are scientifically possible, but they raise a host of ethical, moral, and operational issues that make them highly controversial and operationally useless in modern warfare. Meanwhile, time travel might be operationally valuable, but is not scientifically possible…yet. For those concerned about Mr. Connor, take heart: neither is likely to change in the near future.
Technology has become more routine in everyday life, driven largely by what we like to refer to as tiny, rapid, incremental gains (TRIGs) within the civilian sector. This is happening at a time when the world is entering a new phase of strategic competition between major military states — particularly the US, China, and Russia — perhaps driven by the proliferation of these same TRIGs across competitive global civilian markets.
As technology proliferates, it will become increasingly important to effectively integrate hard science into operational practice. This does not mean that operational units should be looking to build sophisticated moon-shot labs; they still have a job to do. Emphasizing scientific advancement, as opposed to utilization, would distract from the mission. Instead, it means that scientists must become organic assets to operational units in support of each unit’s core mission. This sets up a conundrum: how does a unit allow its scientists to think, dream, and push the boundaries of implementing scientific and technological capabilities while also focusing and constraining that work based on operational requirements and realities?
This article discusses a case study of just such a process at the United States Army John F. Kennedy Special Warfare Center and School (SWCS), the primary training center for Army Special Operations Forces (ARSOF). The intent is to capture and convey our lessons learned in building this organic integration of operations and science, thus demonstrating how SWCS began to navigate the SciOps conundrum.
Background and History
SWCS’s official mission is to “assess, select, train, and educate the U.S. Army Civil Affairs, Psychological Operations, and Special Forces by providing superior training and education, relevant doctrine, effective career management, and an integrated force-development capability.” Put another way, SWCS is directly responsible for talent acquisition through assessment, selection, and initial training of new ARSOF soldiers, as well as advanced training of the operational force. Additionally, SWCS supports US Army Recruiting Command’s Special Operations Recruiting Battalion (SORB) and 1st Special Forces Command (1SFC) with recruitment and retention, respectively. In all, SWCS handles a student load similar to that of a mid-sized college, educating over 20,000 students annually.
SWCS’s processes have historically been well grounded in science and data, owing many of its training principles and processes to the Office of Strategic Services (OSS). During World War II, the OSS was charged with generating a force to conduct unconventional warfare and intelligence collection behind enemy lines, under extremely dangerous conditions and with a high degree of uncertainty. To do this, it developed a scientifically rigorous approach to selecting and training personnel that has evolved into the modern processes for generating ARSOF.
Until recently, this process generally used data to address a specific question at a specific time. This transactional approach to data tends to be reactive and leaves the major decision making to human knowledge and intuition. While this has served the leadership — and the nation — well historically, modern-day demands and challenges in talent production and management are becoming more complex, requiring more comprehensive yet nuanced solutions.
These new requirements have driven the demand for a switch from transactional to transformational data use – an approach that proactively anticipates and solves problems in advance to supplement human knowledge and inform decision making. It is the approach of continuing to find value in data well after the data has served its initial transactional purpose. This transition is a significant cultural and administrative shift for such a large organization, and only recently has technology reached the point where the change is both possible and beneficial.
The solution to this transformation was first conceived in 2020 in the newly formed G35, or Plans and Analysis, section (a subsection of SWCS G3 Operations). The command tasked the shop with modernizing data access and use across and throughout the organization. After analyzing the problem for approximately four months, the team recognized that SWCS had many information management systems, but no way for decision makers to efficiently and effectively consume or analyze that information. What SWCS needed was an information system designed specifically for process management. This led to the development of SOFMIS — the Special Operations Forces Management Information System — a data architecture on the surface, but more importantly a mindset and approach to consolidating and processing data to inform decision making.
Development of a Data Strategy
The development of SOFMIS began by considering how to more effectively and efficiently collect and store data at an enterprise level. The prevailing paradigm of transactional data use supported this step without significant challenge to the culture. However, it soon became apparent that this approach was not sufficient for the desired end-state of actually putting the data to use at scale.
To address this concern, the Plans and Analysis section acquired an information scientist who had previously been working on talent management data analytics projects at the United States Army Special Operations Command (USASOC). He was able to provide the scientific expertise necessary to complement the operational capability already on the team. In turn, this allowed the team to develop a comprehensive data strategy that would soon pave the way for a cultural transition to transformational data use.
The ensuing discussions about how to frame a comprehensive data strategy set a precedent that would prove highly effective in building SOFMIS both digitally and conceptually. Indeed, the team has felt it valuable enough to capture explicitly for posterity. On the surface, it is the common-sense application of separate but complementary roles: allow operations to guide scientific development, but allow scientists to lead the science. In theory this sounds simple; in practice, however, it is a far more nuanced challenge that requires significant trust and understanding between operational and scientific experts and has profound implications for command structure and its interactions.
The core SOFMIS strategy can be summarized as a form of Lean Six Sigma approach that reduces cost while increasing quality and optimizing output (not increasing output for the sake of increasing output, as has sometimes been assumed in the past). Again, on the surface this is not particularly profound; one can find this strategy in any production system, whether focused on goods or human capital. For a data strategy, however, it needs to be operationalized in a meaningful and feasible way, particularly when it requires deep cultural change at the same time.
To develop this strategy, the Plans and Analysis section interfaced closely with leadership, staff, and academics alike to understand and frame the operational intent. Meanwhile, the information scientist considered how to turn this intent into feasibly testable hypotheses. Importantly, projects were limited to those with operationally useful outcomes, avoiding questions that might be interesting but were not explicitly meaningful at that particular time.
Given the state of the organization, this last requirement limited the scope of the “sexier” machine learning or artificial intelligence problems, which the culture was not ready to embrace. Instead, the emphasis in early projects was on more traditional statistical methods and process simulation. The ongoing goal was, and continues to be, to build trust and organizational capability, allowing for more complex analyses over time — driven, as always, by operational need and feasibility.
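To make “process simulation” concrete, the following is a minimal sketch of the kind of Monte Carlo throughput model such an early project might build. The pipeline stages and pass rates are invented placeholders, not actual SWCS figures.

```python
import random

# Hypothetical training pipeline: (stage name, pass rate).
# All figures are illustrative placeholders, not SWCS data.
STAGES = [
    ("Assessment & Selection", 0.45),
    ("Initial Training", 0.80),
    ("Advanced Training", 0.90),
]

def simulate_cohort(n_candidates, seed=None):
    """Monte Carlo pass/fail simulation of cohort throughput."""
    rng = random.Random(seed)
    remaining = n_candidates
    for name, pass_rate in STAGES:
        # Each remaining candidate independently passes with pass_rate.
        remaining = sum(rng.random() < pass_rate for _ in range(remaining))
        print(f"{name}: {remaining} candidates continue")
    return remaining

graduates = simulate_cohort(1000, seed=42)
print(f"Graduates: {graduates} of 1,000 starters")
```

Even a toy model like this lets planners ask where a change in a single stage’s pass rate has the largest downstream effect on throughput, without touching the more ambitious machine learning questions the culture was not yet ready for.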
Ends-Ways-Means in Data Strategy
In order to develop the data strategy, the SOFMIS team adopted the Ends-Ways-Means strategic framework. Although the Ends-Ways-Means methodology is usually utilized for strategic military planning, it proved quite useful in building a data strategy as well. Briefly, the Ends are the ultimate goal, defined by the strategic needs and intent; the Ways are the actions taken to reach that goal; and the Means are the resources available to achieve said Ends. In the data domain, these last two are all things analytics and all things data, respectively.
Quality, Throughput, and Efficiency: The Ends
The explicit end-state of the SOFMIS strategy is to reduce cost while raising quality and optimizing output at the schoolhouse. Individual projects feeding that end-state generally fall into three categories: 1) articulating the risk that some process or decision will not result in forward progress toward the end-state; 2) balancing the availability of information to all parties involved in the process, including recruits, students, operational soldiers, and commanders; and 3) ensuring appropriate data and resource availability. Without these Ends, the Ways and Means would not have a focused purpose.
Analytics: The Ways
Analytics is what turns data into actionable knowledge in support of the strategic ends. It is in these Ways that the interaction between scientists and operational leaders becomes so important. The operational leader should provide intent, then act as the gravitational force that keeps the scientist focused on that intent; however, the leader should also avoid asking leading questions. The scientist, meanwhile, should always be looking for the intent behind a question and should be willing and allowed to challenge the premise of the question. This two-way trust is essential to framing and answering the correct questions rather than answering the wrong question(s) perfectly.
As a simple example, we might consider a commander asking what the GT score threshold should be for a recruit (that is, the General Technical scale of the Armed Services Vocational Aptitude Battery, or ASVAB, a test administered by the United States Military Entrance Processing Command to determine qualification for enlistment). This is not a meaningful scientific question, nor is it operationally relevant as stated, yet questions of this nature are extremely common. Consider the intent of the GT threshold: to ensure that those who are recruited are sufficiently intelligent to do the job. We know this is not a hard line, and that those with low GT scores may have a higher risk of failure, but there is no singular threshold where those above will succeed and those below will fail.
The question grounded in operational intent, therefore, should be phrased as, “How can GT show me risk on a recruit?” Scientifically, this can be articulated with the testable question: “What is the relationship between GT and risk of mission (or training) failure?” One might also ask, “What is the cost of accepting greater risk on the GT to bring someone into training versus not bringing the person to training at all?” Either question (or both) addresses the operational relevance of the problem and allows the commander to consider where and how to accept risk rather than remain beholden to some arbitrary line in the sand.
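To illustrate how a scientist might operationalize that reframing, the sketch below fits a logistic regression of training-failure risk against GT score. The data and the assumed GT-to-risk relationship are synthetic, invented purely for illustration; the point is that the output is a graded risk curve rather than a single cut line.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic historical records, for illustration only:
# GT scores and training outcomes (True = failed to complete).
rng = np.random.default_rng(0)
gt = rng.normal(110, 10, 500).round()
p_fail = 1 / (1 + np.exp(0.08 * (gt - 100)))  # assumed relationship
failed = rng.random(500) < p_fail

# Fit failure risk as a smooth function of GT score.
model = LogisticRegression().fit(gt.reshape(-1, 1), failed)

# A risk curve the commander can weigh, rather than one threshold.
for score in (90, 100, 110, 120):
    risk = model.predict_proba([[score]])[0, 1]
    print(f"GT {score}: estimated failure risk {risk:.0%}")
```

Paired with cost figures for a training seat, the same curve feeds the second question directly: the expected cost of accepting a higher-risk recruit can be weighed against the cost of not bringing the person to training at all.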
Consider a second example with respect to balancing information. A potential recruit is going to have some opinion of ARSOF, good, bad, or otherwise. In this case, it is important to make sure the recruit is informed about ARSOF, and ARSOF about the recruit, before the two decide to invest in each other. The operational intent, therefore, is to provide a balanced and accurate portrayal of the organization to the candidate and vice versa. There is a tendency to emphasize collecting data on the candidate, but analyzing, understanding, and marketing the organizational culture is equally important, particularly in a tough recruiting environment.
Both examples can easily misrepresent the underlying problem. The operational leader can ask a well-intentioned, but misleading question, while the scientist can easily chase irrelevant threads. Through trust and open discussion, the two can leverage their respective expertise to keep the solution focused on the strategic (or operational) end state while keeping it scientifically feasible and testable.
For leadership, the following specific recommendations have been useful with regards to interactions between operations and science:
- Ensure that you have at least one true scientist with a deep and comprehensive research background on staff. This is different from having someone with an interest in numbers and a knack for Excel; it means a PhD with demonstrable capability and a desire to support the mission. Consider the difference between hiring a medic versus a surgeon; investing in this level of talent is an investment in an organization’s future readiness.
- Give intent and boundaries, not specific questions. This allows the scientist to develop the problem into testable hypotheses, articulate the requirements, and help prioritize problems based on resourcing.
- Expect results backed by sound science. Hold the scientific team accountable and set expectations. If necessary, allow the scientist to go into detail so that he/she can articulate the logic, even if leadership does not care for that level of specificity.
- Understand that every request takes time and resources. Do not expect miracles, but when something just shy of a miracle is needed, be willing to put the resourcing behind it.
Likewise, the following points are recommended for scientists:
- Communicate, do not lecture. Avoid giving a deeply detailed academic brief. Trust your operational counterpart(s) to help you understand what level of detail is important from an outsider’s perspective, even if more detail feels necessary (often, it is not).
- Build trust. This can mean many things: get wins that leadership cares about; be candid with feedback and honest with results; be ethical in developing research protocols.
- Practice within the scope of your training. Be humble and honest about your background and do not fall into the trap of overpromising and underdelivering. Not only does that degrade trust in the research team, but it also destroys trust in the field and the process as a whole.
- Practice down and into the organization, network up and out. Unless you are part of a major research organization, you are not likely to have a large team working on revolutionary problem sets that will change the world. Focus on your organization’s data and problems, and build relationships with others who are working on relevant projects outside of your organization.
These common-sense recommendations have proven vital for SWCS and set the guiding principles for its work. Success has been greatest when the organization, across the board, has stuck to these points.
Data: The Means
Data should be treated as a strategic resource, not just a nicety. Collecting too little data results in insufficient ability to build and test models and hypotheses. Failing to collect data now means it will not be available to use in the future. In general, storing data is cheap. As the old Army saying goes, “it is better to have and not need, than need and not have.”
However, saying “collect all of the data” is also not a useful approach. Too much data can become burdensome: at best, it is redundant, but more likely, it distracts efforts and takes resources away from collecting, processing, and analyzing relevant data. Focus on measuring what matters. Of course, this depends on available resourcing. More resources allow the team to collect and process more data, even if, past a certain point, more data provides diminishing returns for greater cost. Data collected inappropriately is just as unusable as data never collected at all, except that it has also cost time and other resources. Before initiating a line of data collection, be sure sufficient capability and resourcing are available to process and make use of the data, and that the data will serve the strategic ends of the organization.
The question of how much is the right amount again comes back to a discussion between operational and scientific leadership. It is easy for operational leaders to advocate for some wearable device or for monitoring one more variable in routine data collection without understanding the resourcing needed to do so. The scientist should be willing and able to articulate what that will involve and whether it is likely to provide enough value to validate the investment. The team may decide to pilot a data collection method to see whether the required resourcing and expected return can scale. Ultimately, the data demands are driven by the current posture of the organization’s culture and resourcing. What is infeasible today might be well worth the effort tomorrow, but moving too soon might prevent the organization from ever getting there at all.
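One way to ground such a pilot decision is a learning-curve check: fit a model on growing subsets of the pilot data and watch where validation performance flattens. The sketch below does this on synthetic data; the dataset, model, and sample sizes are all stand-ins for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a pilot dataset (5 measured variables).
rng = np.random.default_rng(1)
X = rng.normal(size=(4000, 5))
signal = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.0])
y = signal + rng.normal(size=4000) > 0

X_tr, X_val, y_tr, y_val = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit on growing subsets to see where added data stops paying off.
for n in (100, 300, 1000, 3000):
    model = LogisticRegression().fit(X_tr[:n], y_tr[:n])
    acc = accuracy_score(y_val, model.predict(X_val))
    print(f"n={n:>4}: validation accuracy {acc:.3f}")
```

If performance plateaus well before the full pilot sample, scaling the collection further is unlikely to justify its cost; if it is still climbing, the investment case is stronger.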
Closing Thoughts
The world is changing fast, and tiny, rapid, incremental gains in the digital domain are at the heart of this change. Although we may not (and should not) be at a point where we can field anything resembling the T-1000, the impact of technology on national defense is no less the stuff of Hollywood’s finest techno-thrillers. As civilian technological development bleeds more and more into strategic posturing and pre-kinetic conflict, it is a sure bet that the same technology will become increasingly relevant in readiness for, and execution of, kinetic operations. The ability to efficiently collect and get value out of data is becoming an operational imperative, requiring that operational leaders learn how to incorporate scientific practice into routine processes and that scientists learn to work seamlessly alongside operational leaders. There will always be room for world-changing military research at dedicated labs, but trickle-down technology can no longer keep up with the pace of operational needs. SWCS has come a long way in the two years since starting this journey, but there is still a long way to go in this fast-paced and ever-changing world. Hopefully SWCS’s lessons learned can continue to speed the transformative process while simultaneously leading the way for other government organizations to do the same.