Small Wars Journal

Towards a Broader Definition and Understanding of the Human Dimension: Part 2

Wed, 09/04/2013 - 5:48am


Michael L. Haxton

This is the second in a series of three articles discussing analytics of the human dimension of conflict. Much is being written on the human dimension of conflict, but much of it relies on differing notions of what is important; the first article therefore focused on defining the concepts for analyzing the human dimension of conflict. Next, there is a growing reliance on analytics to understand the world around us, but caution must be exercised when applying analytics, especially in complex environments like the human dimension of conflict. This second article proposes a set of important principles for guiding analytics of the human dimension. Lastly, the human dimension involves a wide range of disparate types, formats, and sources of data, which poses significant and complex challenges for efforts to capture and organize it. The third article discusses the types and sources of data that must be accounted for in the human dimension. Part one can be found here.

Principles of Conducting Analytics of the Human Dimension

While analytics is a powerful capability, it can also be misapplied or contorted to fit problems it does not suit.  There are three critical principles for analysts to keep in mind when designing or implementing analytics of the human dimension.

  1. Analysts must explicitly recognize the assumptions behind the analysis and match the analytic techniques to the problem and assumptions.
  2. Analytics is not just about large amounts of data; small N (qualitative) analysis techniques are highly valuable and valid forms of analytics.[i]
  3. Multiple perspectives, levels of analysis, and data sources are critical to obtaining a holistic understanding of a system.  For this reason, Mixed Method research designs are preferable whenever feasible.

On the first principle, in the design and conduct of analytics, the analyst must recognize that the data is the product of (at least) two processes: (1) the process of the underlying system of interest (e.g., the stock market, voters in New Hampshire, insurgent groups operating in Kandahar), and (2) the data collection process, in which elements of the system of interest reveal themselves to the analyst through some medium and the analyst records observations through that medium (e.g., stock market indices or trade data, opinion polls of likely voters in New Hampshire, intelligence on who is doing what in Kandahar). To provide valid and accurate analytics, the analyst must account for both the data collection process and the system of interest.  In effect, the analytics must include assumptions about how both of these operate (or include hypotheses to be tested).  These assumptions are crucial to selecting the most appropriate techniques and to understanding the limits of the inferences that are possible.  One of the most pronounced faults in the general use of analytics is over-extending findings or failing to recognize their limits.
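
To make this concrete, here is a minimal sketch in Python, with entirely invented numbers, that simulates the two processes: a hypothetical system of interest (a population with a fixed rate of support for some movement) and a biased collection process (supporters are more likely to be observed). The point is that a naive estimate reflects both processes at once:

```python
import random

random.seed(42)

# Process 1: the underlying system of interest (hypothetical numbers).
# A population of 10,000 people, 20% of whom support a movement.
population = [random.random() < 0.20 for _ in range(10_000)]

# Process 2: the data collection process (also hypothetical).
# Supporters are assumed to be three times as likely to appear in
# our observed sample as non-supporters (e.g., they are more vocal).
def observed(supports: bool) -> bool:
    p_observe = 0.30 if supports else 0.10
    return random.random() < p_observe

sample = [p for p in population if observed(p)]

true_rate = sum(population) / len(population)
naive_rate = sum(sample) / len(sample)

print(f"True support rate:     {true_rate:.2%}")   # ~20%
print(f"Naive sample estimate: {naive_rate:.2%}")  # ~43%, badly inflated
```

Without an explicit assumption about the collection process, the inflated estimate would be mistaken for a property of the system itself.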

For example, consider the problem of trying to determine whether a particular social movement is beginning to support the employment of extremist violence to achieve its objectives.  The movement does not have a roster of members, its adherents do not use hashtags that identify them as members, and they do not dress in a specific way that identifies them as belonging.  In short, we do not know who they are, but we need to understand their attitudes somehow.  One possible data source is monitoring exchanges on social media platforms associated with the movement.  In doing so, however, the analyst must define assumptions about how the movement works (self-organizing, centrally directed, franchise-like, etc.), how adherents operate on the selected platforms (do they practice operational security, spread propaganda, etc.), and how the users of those platforms relate to the movement's leadership and rank-and-file membership.  Much of this may not be knowable, but using this data source to answer the underlying questions requires that the analyst make these assumptions explicit.  We must remember that we are not interested in the social media data for its own sake; it is of interest only insofar as it gives us insight into the attitudes of the movement (somehow defined).
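
As a purely illustrative sketch of encoding such assumptions, the snippet below flags posts using hypothetical affiliation markers and a crude phrase list for violent rhetoric. None of these markers, accounts, or posts come from any real platform; each list is a stand-in for an assumption the analyst must state and defend:

```python
# Hypothetical markers standing in for explicit analytic assumptions.
movement_markers = {"the-cause", "rise-up-now"}       # assumed affiliation tags
violence_phrases = ["armed struggle", "strike back"]  # assumed violent rhetoric

posts = [  # invented example data
    {"author": "user1", "tags": {"the-cause"}, "text": "peaceful march saturday"},
    {"author": "user2", "tags": {"rise-up-now"}, "text": "time for armed struggle"},
]

def classify(post):
    # Assumption: tag overlap marks affiliation; phrase matching marks
    # support for violence. Both are crude, explicitly stated proxies.
    affiliated = bool(post["tags"] & movement_markers)
    violent = any(phrase in post["text"] for phrase in violence_phrases)
    return affiliated, violent

affiliated = [p for p in posts if classify(p)[0]]
violent_share = sum(classify(p)[1] for p in affiliated) / len(affiliated)
print(f"Affiliated posts endorsing violence: {violent_share:.0%}")  # 50%
```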

The final point above is a key one to reiterate.  Some purveyors of analytics suggest, generically, that "data analytics" is the study of data.  While that definition may be valid in and of itself, the value proposition it conveys is extremely limited.  Analysts want to understand their data, but studying data, be it web traffic, social media behavior, opinion polls, or stock markets, is only useful if it leads us to understand what the data represents and how it represents a system of interest.  A population survey is great data, but generally speaking its utility is limited to the extent that it can be used to understand the population in which the survey was conducted.  There are many algorithms, measures, and procedures we can run on the data, but many of them are useful only for understanding the character of the data, not the population the data is meant to represent.  We must understand how the data was collected in order to use it to understand the population from which it was drawn.  This principle holds in any field of analytics, but especially so in analytics of the human dimension.
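
One standard way to account for the collection process is post-stratification weighting, sketched below with invented survey and census figures. The naive estimate describes the sample; only the weighted estimate, which rests on an explicit assumption about the population's composition, describes the population:

```python
# Invented figures: the survey over-represents urban respondents.
survey = {              # stratum: (respondents, share answering "yes")
    "urban": (700, 0.60),
    "rural": (300, 0.30),
}
population_share = {"urban": 0.40, "rural": 0.60}  # assumed census shares

n = sum(count for count, _ in survey.values())
naive = sum(count * yes for count, yes in survey.values()) / n
weighted = sum(population_share[s] * yes for s, (_, yes) in survey.items())

print(f"Naive estimate:    {naive:.2%}")    # 51.00% -- describes the sample
print(f"Weighted estimate: {weighted:.2%}") # 42.00% -- describes the population
```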

In general, this first principle of analytics involves three key aspects.  First, the analyst must explicitly develop and account for their assumptions about how the underlying system of interest functions (which may be to make no assumptions at all).  These assumptions inform and constrain the choice of data sources and analytic techniques, and form the bottom layer of the analytic framework.  Second, the analyst must devise the means to observe the system of interest, either directly or indirectly, and then develop and account for assumptions about how that data is to be, or has been, captured.  There is an abundance of choices in the nature of the observation, a facet that will be discussed in detail in the third article; this is the middle layer of the framework.  Third, the analyst must select the most appropriate analytic method given the problem, the assumptions regarding the system and data, and, not least, the time and resource constraints for conducting the analysis; this is the top layer.  As these analyses are conducted in the real world on real problems, time and resource constraints are absolutely crucial.  Employing a slow, highly detailed, high-fidelity solution to a problem with a limited budget, sparse data, and a short turnaround time is useless.  Equally important, the analytic method must be paired to the problem and data.  For instance, using a theory-driven Bayesian network modeling approach to understand the basic construct of a phenomenon for which there is no readily available source of knowledge or data is ill-conceived.  An exploratory, qualitative, field-research approach that collects observations and develops an initial understanding of how the phenomenon occurs would be far better aligned to the problem and the data.
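
As a deliberately oversimplified illustration, the toy function below caricatures this third step by mapping a problem's theory maturity, data availability, and time constraints to a broad class of method. The categories and thresholds are invented; real method selection is a judgment call, not a lookup:

```python
def choose_method(theory_mature: bool, data_rich: bool, days_available: int) -> str:
    """Toy pairing of method to problem, assumptions, and constraints."""
    if not theory_mature and not data_rich:
        return "exploratory qualitative field research"
    if theory_mature and data_rich and days_available >= 30:
        return "theory-driven model (e.g., Bayesian network)"
    if data_rich:
        return "descriptive statistics and data exploration"
    return "structured expert elicitation / small N comparison"

# The Bayesian-network misstep described above: immature theory, no data.
print(choose_method(theory_mature=False, data_rich=False, days_available=14))
# -> exploratory qualitative field research
```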

This brings the discussion to a critical juncture and to our second principle.  Many, if not most, purveyors of analytics operate almost exclusively on quantitative data.  Everything, from large data stores to small datasets, is analyzed through a variety of powerful techniques.  However, we often lose sight of a fundamental truth: one must collect the right data for the analytic objective, and quantitative data is not always the right data for the purpose, especially in the realm of the human dimension.  When analyzing known human systems with well-founded variables to measure, quantitative data collection should be the central approach.  However, the power of qualitative data cannot be overlooked: it allows the analyst to explore the human system without assumptions about what is important; it allows the analyst to discover.  Using only quantitative data restricts the analysis to the variables the analyst either chooses to collect (assuming, in effect, what matters) or happens to have access to in the selected data sources (constraining the analysis arbitrarily).  Analytics of the human dimension should include qualitative analytics: the logical analysis of data using small N analysis methods.
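
Qualitative does not mean unsystematic. The sketch below, using an invented codebook and invented interview fragments, illustrates one simple small N technique, thematic coding, in which each case is tagged against explicit themes so that a handful of cases can be compared rigorously:

```python
# Invented codebook: themes and the terms assumed to signal them.
codebook = {
    "grievance":   ["unfair", "neglected", "abandoned"],
    "opportunity": ["jobs", "training", "school"],
}

interviews = {  # invented field-note fragments (small N: two cases)
    "case_1": "we feel neglected and there are no jobs here",
    "case_2": "the school closed and people feel abandoned",
}

# Tag each case with every theme whose terms appear in its notes.
coded = {
    case: [theme for theme, terms in codebook.items()
           if any(term in text for term in terms)]
    for case, text in interviews.items()
}
print(coded)  # both cases raise grievance and opportunity themes
```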

Recognizing the importance of assumptions about the system and the data collection, and the value of multiple sources of data, we arrive at the third principle.  Multiple perspectives are critical to obtaining a holistic and contextually relevant understanding of a system and of potential solutions to problems.  In the social sciences, there has been a groundswell of support for mixed method approaches to research.  A wealth of books and articles published in the last ten years advocates bringing mixed method approaches to bear on a wide range of social science and public policy problems.[ii]  Further, in the field of monitoring and evaluation (M&E) of programs and policies, government and international agencies regularly call for mixed method research designs.  This attention to multiple types of data and multiple types of analytics develops a more holistic understanding of a problem, which leads to greater confidence in findings and to more effective and enduring solutions.  The same approach lends substantial value to analytics of the human dimension for Intelligence and Defense requirements.

For instance, in M&E, surveys are seen as valuable for measuring whether a program is meeting the known goals of the sponsoring government agency and for providing the agency proof to show oversight bodies, but qualitative data collection and analytic methods enable analysts to explore the causes of the program's success or failure and to explore possible improvements.  This exploration is more difficult with quantitative data alone.  Furthermore, building a multi-level mixed method approach can lend substantially greater confidence.  Consider an evaluation of an information campaign that develops expectations for change at three levels: official statements from relevant organizations, the sentiment of social media influencers, and the sentiment of the general population in the relevant region.  Because each of these layers influences the others, such a design provides the opportunity to gather reinforcing evidence of the campaign's impacts (or lack thereof) at each level of analysis.  Even in this simplistic example, a multi-level mixed method design offers the potential to measure impacts, identify and diagnose shortfalls, and develop improvements to meet them.  Well-designed mixed method approaches offer substantial advantages over simpler, more narrowly focused designs.
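
A minimal sketch of this triangulation logic follows, with invented before-and-after scores for each level of analysis. Agreement in the direction of change across the levels is what supplies the reinforcing evidence described above; divergence is itself a finding to diagnose:

```python
# Invented before/after scores for each level of the mixed design.
levels = {
    "organizational statements": (-0.2, 0.1),
    "influencer sentiment":      (-0.1, 0.2),
    "population sentiment":      ( 0.0, 0.1),
}

changes = {level: after - before for level, (before, after) in levels.items()}
agree = all(c > 0 for c in changes.values()) or all(c < 0 for c in changes.values())

for level, change in changes.items():
    print(f"{level:>26}: {change:+.2f}")
print("Levels reinforce one another." if agree else "Levels diverge; diagnose why.")
```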

The challenge is to design mixed method approaches that fit within the time and resource constraints for the analysis.  There is no magic bullet here.  The only solution is to maximize the diversity of approaches and data sources within those constraints: widely varying approaches and data sources provide substantially more value than similar approaches or closely related data sources.  Combining large N surveys with field research or social media data, for example, provides perspectives that complement each other.  The key is to gain multiple perspectives and to enable analysts to resolve the differences among them.

Summary

The power of analytics of the human dimension for Defense and Intelligence is profound, but we must clearly define what it is and ensure the correct principles are applied when conducting it.  Analytics is the logical analysis of data for understanding socio-cultural behavior, relationships, and dynamics.  It is built on the principle that careful and rigorous observation is, in many instances, our most powerful tool for understanding systems.  A few additional principles are critical to the effective development and employment of analytics in solving problems.

  • Analytics must explicitly develop and recognize the assumptions about (1) the system of interest and (2) the data collection process.  The act of observation and data collection is a process in and of itself that must be accounted for in designing the analysis.  Finally, the analyst must pair the analytic technique with the problem and with the assumptions regarding the system and data in order to produce accurate results.
  • We must also keep in mind that analytics is not about the data.  Analytics of the human dimension is about understanding the socio-cultural system of interest; it is not a 'self-licking ice cream cone'.
  • Analytics of the human dimension can consist of quantitative and qualitative analysis techniques.  Both sides of this coin are useful and appropriate in different situations.  Observation provides the means of gathering evidence and insight into the system of interest, regardless of whether it is based on large numbers of cases (large N), or only a few cases or even a single case (small N).
  • Analytics of the human dimension should, whenever feasible, include multiple perspectives to provide more holistic and contextually relevant insights.  Towards this end, mixed method designs offer powerful advantages.

Following these few simple principles can enable most analysts to conduct effective analytics for Defense and Intelligence requirements in any given situation.  These principles will help to ensure that solutions account for hidden factors and make the most of resource and time constraints.  In the end, the objective is to provide enduring solutions to the most pressing problems.

References

Morse, Janice M., and Linda Niehaus. Mixed Methods: Principles and Procedures. Left Coast Press, 2009.

Calfee, Robert C., and Melanie Sperling. On Mixed Methods: Approaches to Language and Literacy. Teachers College Press, 2010.

Tarrow, Sidney. "Inside Insurgency: Politics and Violence in an Age of Civil War." Perspectives on Politics, Vol. 5, No. 3 (September 2007), pp. 587-600.

End Notes

[i] To this point, I highly recommend a book review essay by Sidney Tarrow of Cornell University, "Inside Insurgency: Politics and Violence in an Age of Civil War," Perspectives on Politics, September 2007, Vol. 5, No. 3, pp. 587-600.  In addition to reviewing four books on insurgency, Tarrow provides an excellent review of the evolution of political science research in this area, covering both qualitative and quantitative approaches.  In fact, most social science methodologists no longer discuss qualitative versus quantitative approaches, preferring to frame the choice as one between small N and large N methodological approaches.

[ii] For instance, see Mixed Methods: Principles and Procedures by Janice M. Morse and Linda Niehaus (Left Coast Press, 2009), as well as On Mixed Methods: Approaches to Language and Literacy by Robert C. Calfee and Melanie Sperling (Teachers College Press, 2010).

 

About the Author(s)

Michael L. Haxton, PhD, has worked in the defense and intelligence communities for 17 years advancing methods and tools to analyze the human dimension of conflict. He first joined the Joint Warfare Analysis Center in 1996 and then went to work for Booz Allen Hamilton in 2003, where he runs the firm’s Socio-Cultural Development Center and has supported a variety of Defense, Intelligence, and Civil sector agencies with social science-based analytics.

Comments

Ned McDonnell III

Wed, 09/04/2013 - 5:34pm

Michael,

As a non-quant, liberal arts type of guy, I find the humility and wisdom underlying your article to be refreshing. The challenge in this world of ours is to find people who are willing to think 'big' but to do so within the context of increasingly plentiful and granular (i.e., often overwhelming) data. Whether one likes it or not, big data are here to stay.

There is a dilemma here. Many of the qualitative types (a/k/a, the 'fuzzies'; e.g., me) are variously afraid of, and / or too lazy to bother with, all those data. On the other side, one encounters the data-dinks -- those sovereign idiots who know all the data, have all the answers and impart none of the insights.

In your second essay, you address very well this underlying and constant push-pull of inductive versus deductive reasoning by breaking the exercise of intelligence down into four basic levels of data-mongering and interpretation, while taking care to make explicit assumptions many do not realize they are making.

Trying hard to strike that difficult and precarious balance is, I believe, that oldest of all vocations: seeking wisdom. Perhaps that balance is the lost grail of disruptive thinking.