Small Wars Journal

The Primacy of Information Intelligence


Thomas A. Drohan

Our need to scrutinize information becomes more acute as automation outstrips human understanding. If machine-learnt becomes machine-taught, we may lose the power to make responsible decisions. At the same time, our drive for technological advantage creates dependence on complex networks such as the “collaborative sensing grid.” Uncertainty persists. Interactions among humans and self-aware chatbots produce unpredictable effects that Russian scientists describe as “a fundamentally different reality.” Clearly, we need information intelligence to evaluate the integrity of content, processes, and sources.

Integrity is a vital variable in cooperative as well as confrontational relationships. Whether it’s keeping one’s word or securing internet protocols, the question of trust is a fundamental concern. Military deception (9) uses perceptions of trust to mislead adversaries. So we need to assess integrity to anticipate manipulations of expectations and data. The unprecedented expansion of venues to exploit implicit trust presents inexhaustible possibilities for weaponizing information. Uncertainty multiplies as actors (including machines) seek more data to place into more contexts, and assess more information with potential to influence more outcomes. How do we know what and whom to trust as we thrive on open exchanges of ideas, markets, and technologies?

We answer this question by exploring a concept for information intelligence. We define it, place it into a contested context, critique related doctrinal shortfalls for information (Joint Publication 3-13) and intelligence (Joint Publication 2-0), and discuss how to assess it.  

What is I2?

Information intelligence (I2) is about what and whom to trust. Let’s break down the definitions of information and intelligence separately, then combine them into a more holistic definition.

If we imagine something that is not information, what would we think about? Meaningless data is not yet information. The key is context. Data has potential to mean something if used in a context. This acquired meaning includes false data, such as the flood of disinformation spread by Russian bots and trolls. Information is ubiquitous (see Gleick’s The Information), with innumerable contexts and definitions. Perhaps that is why we see circular definitions of information (definitions that use “information” to define “information”) in the DoD Dictionary of Military and Associated Terms. The reference lists types of information and concepts that relate to it: activities; environment; info power; info-related capabilities; info warfare; etc.

One broad yet non-circular academic definition of information is: values of characteristics in the output of processes. That is, processes produce outputs with characteristics on which we can place a quantitative and/or qualitative value. We use the following definition of information that accommodates technical and social processes and meanings: the characteristics of data put into context. The values of characteristics apply when we assess information in terms of qualities or quantities.

Unlike information, the term intelligence does have a disprovable (we can test what it is not) DoD dictionary definition, and it includes products, activities, and organizations. Intelligence is: the product resulting from the collection, processing, integration, evaluation, analysis, and interpretation of available information concerning foreign nations, hostile or potentially hostile forces or elements, or areas of actual or potential operations; activities that result in the product; and organizations engaged in such activities (115).

Combining the preceding two definitions yields the following: information is data in context, while intelligence is collecting and thoughtfully processing information. We are interested in how to evaluate information with respect to trust, so we refine this rendition. Information intelligence is the integrity of contextualized data and processed information. Now we place this definition into a relevant context: the highly interactive contest among alternative data, information, and intelligence.
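To make the combined definition concrete, here is a minimal sketch in Python. The field names and the aggregation rule are illustrative assumptions, not doctrine: information is treated as data plus the context that gives it meaning, and information intelligence as a judgment about the integrity of content, process, and source.

```python
from dataclasses import dataclass

@dataclass
class Information:
    """Data acquires meaning only when placed into a context."""
    data: str     # raw observation, report, signal, etc.
    context: str  # the frame that gives the data meaning
    source: str   # where the data came from

@dataclass
class InformationIntelligence:
    """A judgment about what and whom to trust."""
    item: Information
    source_integrity: float   # 0.0 (untrusted) to 1.0 (fully trusted)
    process_integrity: float  # trust in how the data was processed
    content_integrity: float  # trust in the content itself

    def overall_integrity(self) -> float:
        # A simple, hypothetical aggregate: the weakest link dominates.
        return min(self.source_integrity,
                   self.process_integrity,
                   self.content_integrity)
```

The “weakest link” aggregation simply expresses the idea that trust in content, process, and source must all hold before the information itself can be trusted.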

The Competitive Context of I2

I2 is applicable to all types of intelligence, but is particularly meaningful for strategic intelligence and anticipatory intelligence. The US National Intelligence Strategy (8, 9) defines those types as:

Strategic intelligence: process and product of developing the context, knowledge, and understanding of the strategic environment required to support U.S. national security policy and planning decisions.

Anticipatory intelligence: collecting and analyzing information to identify new, emerging trends, changing conditions, and undervalued developments, which challenge long-standing assumptions and encourage new perspectives, as well as identify new opportunities and warn of threats to U.S. interests.

The question of what and whom to trust applies to all situations because uncertainty is pervasive. In the information environment (IE), the overriding context of trust is that it’s contested. Actors fight for the kind of information and sources they need to compete and prevail. If we consider four contested purposes of strategic and anticipatory analysis, four types of competition emerge:

  • Acquire and place vast amounts of data into relevant contexts. This collection competition (see example of a broad approach) concerns capabilities to acquire data that can connect motives, patterns, technologies, and infrastructures.
  • Assess meaning in that information. This tradecraft competition (see example of structured analyses) includes analytic techniques, frameworks, models and experiences.
  • Focus that intelligence on desired and anticipated outcomes amenable to being influenced. This design competition (see example of a business perspective) seeks to imagine, explore, test and operationalize effects and assessments.
  • Develop proactive leaders at every level of the information and intelligence enterprise. This leadership competition (see example of a service-specific handbook) is about envisioning goals and motivating people to achieve them.

These four fights interact with one another, regardless of whether we execute them that way (such as a joint force) or not. As we collect, conduct tradecraft, design operations and develop leaders, the integration of these processes is a competition to out-think and out-execute rivals. I2 is important because contextualized data and processed information enable intelligence to achieve goals. Without information intelligence, operations are more prone to produce inconsequential effects, generate unanticipated consequences, and fail to anticipate strategic surprise. Let’s see what information and operations doctrine says about information and intelligence.

Information Should Control Operations

Information operations doctrine has been expanding to include a greater variety of information-related capabilities so that the joint force commander can have all appropriate tools available. Joint IO doctrine describes information effects rather broadly as influence, as well as a narrower set of effects—disrupt, corrupt or usurp (adversaries) and protect (our own). The Joint Weapons School teaches effects in terms of operational fires that are direct, indirect, cumulative, cascading, unintended, lethal and non-lethal.

Informed by the Joint Concept for Operations in the Information Environment, information-related capabilities have been updated to include “Tools, techniques, or activities using data, information, or knowledge to create effects and operationally desirable conditions within the physical, informational, and cognitive dimensions of the information environment” (I-4). Executing this conceptual approach to information is limited more by attitudes than by technologies. An entrenched mindset bars the way: information is subordinate to operations.

As long as information is subordinated to “operations,” we will fail to realize a full range of effects. This mindset is fatal to competitive strategy because nuances in operations can create considerable differences in information effects. Consider advisor missions that teach campaign-level command and control of capabilities, compared to missions that instruct maneuvers and marksmanship. Both sets of skills need to be mastered, but C2 is needed to create strategic advantages. Why? Information should control operations. Decisions about how to adjust operations for desired effects depend on information. In this important sense, information can, and ought to, be supported by operations. How?

Asking what we, and relevant competitors, want to cause and prevent leads to the kind of information we want as desired effects. Forces need to survive and be able to inflict kinetic destruction, but what are the next-order effects? What behaviors do we need to prevent and/or cause? That is the information needed to craft combinations of consequential operations. Routinized organizations are slow to recognize the need to change operations to be more effective. Indeed, it has been nearly 20 years since now-Lt Gen (ret) Dave Deptula argued that the Information Age precipitated no less than a change in the nature of warfare. We have made phenomenal improvements in rapid precision fires, but we often fail at the primary consideration in strategy: specifying feasible strategic effects.

Quantum and Information Age technologies will increasingly punish such indecisiveness, if not incompetence. At the same time, opportunities to control an adversary’s ability to act will broaden. Unless human nature changes, threats will adopt any available means to achieve desired effects. Indeed, the US National Security Strategy (NSS) calls for information effects. All of the objectives under NSS Goal 4—Advance American influence—seek favorable information conditions. The other three national goals are also informational effects to be operationalized: protecting the homeland, promoting prosperity, and establishing peace through integrated power.

More leaders are conceding that “information is operations” as it becomes obvious that information creates effects. Leadership needs to advocate innovation, not just follow it. In 1948, Bell Labs announced that its new piece of hardware—a tiny electronic semiconductor device marketed as a “transistor” to replace vacuum tubes—“may have far reaching significance in electronics and electrical communication.” That same year, Claude Shannon published a theory of communication that defined the bit (binary digit, 0 or 1). This concept provided a quantifiable measurement of information which, combined with the transistor, revolutionized electronics and expanded human awareness with reams of information.
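For readers who want the arithmetic behind the bit, a minimal sketch follows; the example probabilities are illustrative. Shannon measured information by surprise: an event with probability p carries -log2(p) bits.

```python
import math

def information_bits(p: float) -> float:
    """Shannon information content of an event with probability p, in bits."""
    return -math.log2(p)

print(information_bits(0.5))   # a fair coin flip carries exactly 1 bit
print(information_bits(0.01))  # a rare event (~6.64 bits) is far more informative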

As we sense more, we seek and are vulnerable to more avenues of influence. It follows that we need broad operational constructs to create information appropriate to relevant audiences. The concept should include a wide variety of operations such as: conveying local events on a global scale; broadening the scope of diplomatic issues; increasing public awareness of economic competition; exposing groups to different cultural practices; exploiting the bandwidth of electromagnetic waves; countering and spreading internet memes; and incentivizing humanitarian norms in social behavior.

Whether the above influence operations are conducted by humans and/or machines, the information generated provides new opportunities, poses new threats, and changes how we think about operations. Meta-cognition, big data, uncertainty, and the hopes and fears of artificial intelligence, for instance, involve arranging, processing and assigning meaning to data. To compete well, we need to understand the many ways information is created, believed, and applied.

Information Should Broaden Intelligence

Joint intelligence doctrine defines intelligence as new understanding of information, the purpose of which is to provide assessments to commanders (JP 2-0, ix). This narrow definition fits within the DoD dictionary definition. Granted, doctrine is supposed to be authoritative rather than prescriptive, so why not have a more focused definition? The problem is that in practice, many misconstrue doctrine as a place to end rather than begin thinking. Narrow doctrine gets institutionalized and constrains thinking, unless we invite critique.

For instance, JP 2-0 suffers from an organizational definition of the strategic, operational, and tactical levels of war (I-24). That is, strategic intelligence is what is provided to senior leaders on matters that could impact national security interests; operational intelligence is what is provided to combatant and joint force commanders; and tactical intelligence is what is provided to commanders, planners, and operators for battles, engagements, and missions. How is this a constraint?

In the Information Age, any organizational level can create information and operations that impact national security. The validity of this assertion becomes clear when we take a perspective of measuring effects and not just measuring performance.

A measure of performance (MoP) assesses the degree to which an activity or task is being conducted to relevant standards. In the IE, this measurement can be for any actor, not limited to “friendly” forces. A quantitative example given in a Joint Chiefs of Staff Insights and Best Practices Paper (4) is “Number of IEDs discovered,” while a qualitative example is “Integration with supporting commanders.”

A measure of effectiveness (MoE) assesses desired changes in behavior, capability or environment based on mission objectives. The best practice quantitative example related to the preceding MOP is, “Number of IED discovered vs number of IED effective attacks,” while the qualitative example is, “Sentiments of Host Nation leaders and populace on the security situation.” In both sets of examples, the MoE adds more information that relates to desired changes in support of goals.
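A toy illustration of the difference, using made-up monthly figures rather than data from the Insights and Best Practices Paper: the MoP simply counts the activity, while the MoE relates discoveries to the attacks that still succeeded.

```python
# Hypothetical monthly counts, for illustration only.
ieds_discovered = [12, 15, 18, 22]   # MoP: are we performing the task?
effective_attacks = [9, 8, 6, 4]     # needed for the MoE: is behavior changing?

# MoP: raw output of the activity.
total_discovered = sum(ieds_discovered)

# MoE: share of IEDs found before they worked, month by month.
moe_ratio = [d / (d + a) for d, a in zip(ieds_discovered, effective_attacks)]

print(f"MoP - total IEDs discovered: {total_discovered}")
print(f"MoE - share found before they worked, by month: "
      f"{[round(r, 2) for r in moe_ratio]}")
```

The rising MoE ratio, not the raw count, is what tells the commander whether the desired change in the environment is actually occurring.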

The MoE perspective on assessing desired change permits any organizational level to be strategic with respect to goals. Technological advances in distributed capabilities enable more actors to generate strategic effects. Many of the same technologies also enable commanders to micro-direct rather than distribute control. When this happens, who is thinking about third and fourth order effects? Our answer must not be “no one.”

As doctrine describes an over-rigid “nature of intelligence” (JP 2-0 Chapter 1, which includes “levels of war”), it also prescribes a general relationship among data, information and intelligence (I-2) that modern sensors have compressed.

[Figure: JP 2-0’s depiction of the relationship among data, information, and intelligence (I-2)]

The ability to do on-board collection and processing, as well as networked intelligence functions, means that some sensors can provide data in context. Moreover, the functions of collecting, processing and exploiting, and analyzing and producing can happen more quickly and systematically. As multi-domain operations create networks of these processes, we need to be responsible for determining the adequacy of the information.

This task is more than machines filtering and fusing intelligence. The process requires trusting machine data and human-machine interfaces—what some refer to as “explainable intelligence.” In this important effort, the human-machine interface is crucial to understanding machine-processed information. As in human intelligence, and particularly as AI becomes more complex, humans can add context and judgment.
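A minimal sketch of what “data in context” from an on-board sensor might look like, and how a human-machine interface could flag low-confidence, machine-processed records for human judgment. Every field name and the confidence threshold below are assumptions for illustration, not features of any fielded system.

```python
from dataclasses import dataclass

@dataclass
class SensorRecord:
    measurement: float         # the raw value the sensor reports
    signature_match: str       # context added on board, e.g. a known emitter type
    machine_confidence: float  # 0.0-1.0 confidence of the on-board classifier
    provenance: str            # which sensor or network produced the record

def needs_human_review(record: SensorRecord, threshold: float = 0.8) -> bool:
    """Flag machine-processed records whose confidence is too low to trust alone."""
    return record.machine_confidence < threshold

record = SensorRecord(measurement=9.6e9,
                      signature_match="known radar type",
                      machine_confidence=0.62,
                      provenance="airborne collection node")
print(needs_human_review(record))  # True: route to an analyst for context and judgment
```

The point of the sketch is the division of labor: the machine supplies contextualized data quickly, while the human adds judgment where the machine’s own confidence, or the stakes, warrant it.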

Assessing I2

Competitors with long views are contextualizing data and processing information for strategic effects. Russian reflexive control and Chinese informationized warfare create malign influence we have not adequately countered. The race for AI and quantum applications is on as well. We can assess tangible destruction (Russia in Ukraine) and construction (China in disputed territory of the South China Sea), but how do we assess the information intelligence of advanced hybrid operations?

If we understand how actors contextualize data (which can be false data, too) and process that into information (narratives), we can develop strategies to contest and preempt effects. This effort may include conflict short of, or combined with, armed force. There is an expanding literature on narratology (section 3.5) and narrative warfare about the many contexts, intellectual and emotional processing, and strategies of narratives. Our transmission-centric joint doctrine appears to be in a slow turn to embrace such techniques.

How should we approach assessing broad information intelligence in a contested environment of distributed operations struggling for control? Let’s start by considering how to assess trust with respect to information in terms of validity and context.

Validity. There are many measures of statistical validity, which may be translated as reliability. For our purposes, validity may be thought of as reliability and relevance: is the information a reliable indicator of what it purports to represent (relevant), or is it being selectively misused? There is so much data available that information is inevitably based on a limited selection of data put into context (or on no data at all).

This judgment of reliability and relevance closely conforms to a broad interpretation of “construct validity” — the ability to generalize. A good way to visualize this, because it’s easy to relate to lines of effort, is the following depiction. When we infer from a specific program or activity to a more general result or effect, we have constructed a cause and effect argument. Note the key question: “can we generalize to the constructs?”

[Figure: construct validity as the ability to generalize from a specific program and its observations to cause-and-effect constructs (source: https://socialresearchmethods.net/kb/considea.php)]

Context. There are many contexts into which we can selectively place data, to construct information. Contextualizing data requires expertise. Placing electronic emissions into the context of known signatures takes a different expertise than interpreting source code or recognizing cultural nuances in spoken language. To illustrate a context, let’s assume that the following is important to a commander: the impact of the data, its cost, and its trustworthiness as information. To become information, the data may be machine-processed or human expertise-derived.

Next, consider impact and cost, matters of interest in most assessments.

  • Impact. We relate the information to our strategy that links activities to effects to objectives to goals. That is, how important an impact will the information likely have?
  • Cost. We estimate what is relevant to the desired effect and the environment. Costs are DIMES-wide and vary by relative importance to impact, time frame, and actions or forgone opportunities.

The following depiction illustrates that impact, cost, and trust (validity and context) can be judged together along a low-to-high range of quantitative or qualitative values. The three factors can be weighted for their relative importance and a composite score computed to help judge what the relevant information intelligence is for the situation at hand. The same method applies to individuals and groups. The key questions are: how trustworthy is their information or intent, what impact will it have, and what are the costs involved?

[Figure: impact, cost, and trust judged along a low-to-high range of values and combined into a weighted composite score]
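One simple way to compute such a composite is a weighted sum; the 0-to-1 scales and the weights below are illustrative assumptions, not a doctrinal formula. Cost is inverted so that cheaper information raises the score.

```python
def composite_score(impact: float, cost: float, trust: float,
                    weights=(0.4, 0.2, 0.4)) -> float:
    """Combine impact, cost, and trust (each rated 0.0-1.0) into one score.

    Weights reflect relative importance and must sum to 1.0.
    Cost is inverted: cheaper information contributes a higher score.
    """
    w_impact, w_cost, w_trust = weights
    return w_impact * impact + w_cost * (1.0 - cost) + w_trust * trust

# Example: high-impact, moderately costly, fairly trustworthy information.
print(round(composite_score(impact=0.9, cost=0.5, trust=0.7), 2))  # 0.74
```

The weights themselves are a command judgment: a commander who cares most about trust would simply shift weight toward that factor before comparing items of information.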

Conclusion

The age-old question of how we can trust information and sources will become more problematic as automated technologies synthesize new solutions, thereby challenging human judgment. We answered this question by providing a definition for information intelligence in contested contexts, advocating information-controlled operations with broad intelligence, and discussing a way to visualize assessment.

Beginning with the integrity of contextualized data and processed information, we described the competitive context of collection, tradecraft, design and leadership. Information control over operations effectively weaponizes information to persuade, dissuade, deceive, induce, deter, and achieve other effects, both directly and indirectly. Operations should embrace such expansive effects as information broadens intelligence and machine processing capabilities proliferate. The integrity of information is a basic source of influence and security.

Information intelligence requires re-interpreting current doctrine in light of new technologies. The need to update guidance based on experience may be a best practice, but it is not a sufficient practice. Doctrine lags new contexts during times of rapid change. While doctrine is supposed to be authoritative and not prescriptive, most analysts, planners, operators and commanders embrace its models to begin thinking about problems.

To ensure that doctrine does not end new thinking, we should expand how we understand information and how we permit ourselves to develop intelligence for multi-domain, all-effects strategies. Continuing to integrate separate efforts will produce less proactive effects and develop less anticipatory intelligence. On top of that, if control is over-centralized rather than distributed, information flow will become constricted. This predicament leads to less self-awareness and is an easily exploitable strategic vulnerability.

By blending information and intelligence into a unified concept, we may also overcome two popular premises that destroy initiative. The first is that information is relevant when it supports operations. The second is that intelligence is about collecting data then analyzing it as a product for operations. Both assumptions limit how we can wage and win complex warfare.

In addition to mustering specialized information-related capabilities for desired effects, we should create interactions of information-related effects. Information-related effects focus attention on the fight for the superior purposes of strategy: effects, objectives, priorities, and goals. This competition can broaden our options, such as kinetic capabilities supporting information effects rather than presuming it’s the other way around. Courses of action would integrate various combinations of effects, not just capabilities.

Information intelligence is a basic requirement to compete at this level of warfare.

About the Author(s)

Dr. Tom Drohan, Director of JMark Services Inc.  International Center for Security and Leadership, is a retired U.S. Air Force brigadier general and professor emeritus of military and strategic studies, USAF Academy. His 38-year career as a pilot and permanent professor included operational campaigns and commands, undergraduate and graduate-level teaching, and educational leadership. His academic experience includes B.S. in national security studies (USAF Academy), M.A. in political science (University of Hawaii), Ph.D. in politics (Princeton University), Council on Foreign Relations fellowship in Japan, mentor at the National Military Academy of Afghanistan, visiting scholar at the Reischauer Center for East Asian Studies, and dean of the United Arab Emirates National Defense College. He is the author of American-Japanese Security Agreements (McFarland & Co., 2007), A New Strategy for Complex Warfare (Cambria Press, 2016), and various publications on security and strategy.
