Small Wars Journal

ABORT, RETRY, FAIL? Fixing Army Software

Fri, 11/01/2013 - 10:44am

Crispin Burke, James King, and Niel Smith

During an otherwise uneventful Congressional hearing this past April, US Army Chief of Staff General Ray Odierno lambasted Representative Duncan Hunter (R-CA) over the congressman's criticism of the US Army's intelligence-processing software, known as DCGS-A (Distributed Common Ground System-Army).

General Odierno’s outburst was the latest salvo in an ongoing, bitter, and convoluted debate over the US Army’s suite of Army Battle Command Systems (ABCS)—a slew of computer systems designed to help the US military better communicate, coordinate, and consolidate information. 

On one hand, the Army's senior leaders have vehemently defended DCGS-A, a program that has already cost taxpayers over two billion dollars.  During his Congressional testimony, Gen. Odierno even went so far as to claim that DCGS-A gives today's captains on the ground access to greater intelligence than he had as a division commander just ten years ago.

But Rep. Hunter has taken a very different view, one which echoes the growing frustration from troops in the field over DCGS-A.  For Soldiers in Afghanistan, DCGS-A is slow, cumbersome, and awkward.  As a result, many units have become so frustrated with DCGS-A that they’ve turned to an off-the-shelf solution from a Silicon Valley-based software company, Palantir.

In fairness, both Rep. Hunter and Gen. Odierno are right.  Palantir has proven to be a simple, effective tool for tracking insurgent activity in Afghanistan, so effective that Army leaders have considered incorporating it into DCGS-A.  But while Palantir could supplement DCGS-A, it could never fully replace it: for all its unwieldy design, DCGS-A is still far more capable than Palantir ever could be.

So why the vitriol?  Much of the debate arises from the generational divide between the generals who promote and purchase these systems, and the troops in the field who actually use them.

Like many service members of his generation, Rep. Hunter—a Marine who served in both Iraq and Afghanistan—is a “digital native”, having paid his way through college creating websites and databases for high-tech companies.  By contrast, most generals would have been in their twenties by the time they encountered their first personal computers.

Many assume that today's tech-savvy troops would be comfortable with the Army's Battle Command Systems.  But while programs like DCGS-A, Command Post of the Future (CPOF), and the Movement Tracking System (MTS) may be capable, troops generally deride their atrocious user interfaces and their poor, almost non-existent interoperability.

Of course, no general wants Soldiers to struggle with a poor product.  But senior officers generally have little involvement in developing these systems, and few have personally vetted their ease of use, either through hands-on trial or through in-depth briefings with the Soldiers who will operate them.  As a result, these programs move forward solely on the end capabilities they deliver, regardless of the difficulty troops face in actually employing them.

What does this mean for the modern battlefield?  In practice, we have found that some Soldiers are so frustrated that they choose not to use these programs at all, and in some cases have created their own workarounds.  In other words, DCGS-A's interface is so poor that troops often elect to fight without one of their most valuable tools.

Interface: KISS

Poor user interfaces cost the Army time and money.  Contractors offer training courses to certify operators on Battle Command Systems (for a fee, of course).  Depending on the system, these courses require anywhere from 40 to 120 hours of classroom instruction just to cover basic functions.

The steepest part of the learning curve lies in simply fighting the user interface.  Most Battle Command programs are counter-intuitive to Soldiers who grew up with Microsoft Windows.  CPOF, for example, which is found in nearly every tactical operations center, is based on UNIX, meaning that most functions run completely counter to Windows conventions.  Worse yet, there is little commonality among systems: the skills required to master CPOF are of little use when navigating DCGS-A or MTS.

Compounding the problem, keeping trained operators on these systems is a Sisyphean task for many commanders.  The high personnel turnover inherent in Army units constantly leaves them short of qualified operators.  The hours invested in training a competent operator often encourage commanders to hold trained Soldiers in position longer than is prudent for their professional development, with negative consequences for promotion.

In sum, the poor usability of ABCS means that Soldiers simply cannot extract the full tactical advantage from these billion-dollar systems.  Navigating awkward and often backward commands and menus hinders their ability to provide commanders with timely and relevant information.  Soldiers spend an unacceptable amount of time fighting the very tools designed to produce the information commanders need to fight and win the battle.

A system of systems, each at odds with one another

Worse yet, many operators find it difficult, if not impossible, to get the Army's various Battle Command Systems to communicate with one another.  Intelligence analysts often have to be creative when trying to send and transfer data, particularly since the DOD has disabled USB ports and restricted the ability to burn CD-ROMs.  The difficulties have so thoroughly vexed some intelligence analysts that many have resorted to manually inputting and analyzing data in Excel and PowerPoint in order to more easily distribute intelligence, bypassing the Army's billion-dollar DCGS-A program altogether.

Fortunately, there is light at the end of the tunnel.  During a recent training exercise at the National Training Center (NTC), a Colorado-based Army brigade successfully disseminated graphics and intelligence from DCGS-A to CPOF, the Advanced Field Artillery Tactical Data System (AFATDS), and Force XXI Battle Command Brigade and Below (FBCB2), a first for any unit at NTC.  The brigade, the 2nd Brigade Combat Team of the 4th Infantry Division, credited its success to months of coordination between its intelligence and communications sections.

But while the 2nd BCT's success is certainly commendable, its Soldiers should not have had to work so diligently just to get their various software programs to share data.  The sheer inefficiency of these systems is unfathomable in 2013, when most teenagers have little difficulty cross-posting among competing services such as Twitter, Facebook, and Tumblr.

Recommendations

Why does the Army pick such bad systems?  It certainly isn't malicious intent on the part of the Army or its supporting contractors.  Rather, the generational divide between generals and troops in the field can lead to conflicting ideas of what makes a good software program.

The current practice of spelling out software requirements without associated usability metrics contributes greatly to the problem.  As long as the system performs the functions required in the contract deliverables, the vendor has fulfilled its contract; usability itself is rarely part of the mix.  To that end, we suggest the Army's proponents consider the following:

Establish a common Army user interface.  One of the most frustrating aspects for Soldiers is the lack of standardization between programs.  Most commercial desktop software uses some version of Microsoft's standard interface, which places common commands in the same places regardless of program.  Similar schemes are present in Google's Android and Apple's iPhone software, allowing users to become productive with new programs and hardware after only a gentle learning curve.  The Army should adopt and mandate a common user interface built around commercial software conventions.  When possible, the Army should also license and modify commonly used off-the-shelf commercial programs rather than design new interfaces.  We have seen success with this approach in programs like the very popular TIGR software, whose intuitive interface was rapidly accepted by Soldiers.
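
As a rough illustration of what a mandated convention might look like in practice, the sketch below (all names hypothetical, not drawn from any fielded program) defines a small shared command map that every compliant application would import, so that saving, searching, and help always live in the same menu under the same shortcut, and a simple audit can flag programs that drift from the standard.

```python
# Hypothetical sketch of a shared interface-convention module that every
# Army program would import, keeping common actions in the same menu with
# the same shortcut across systems. All names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class Command:
    action: str    # canonical action name
    menu: str      # top-level menu it must appear under
    shortcut: str  # mandated keyboard shortcut


# The common vocabulary every compliant program must expose.
COMMON_COMMANDS = {
    "save":   Command("save",   menu="File", shortcut="Ctrl+S"),
    "search": Command("search", menu="Edit", shortcut="Ctrl+F"),
    "print":  Command("print",  menu="File", shortcut="Ctrl+P"),
    "help":   Command("help",   menu="Help", shortcut="F1"),
}


def audit(program_commands: dict) -> list:
    """Return the common commands a program fails to implement as specified."""
    return [name for name, cmd in COMMON_COMMANDS.items()
            if program_commands.get(name) != cmd]


if __name__ == "__main__":
    # A fictional program that moved 'search' off its mandated menu and shortcut.
    nonstandard = dict(COMMON_COMMANDS,
                       search=Command("search", "Tools", "Ctrl+Shift+F"))
    print(audit(nonstandard))  # -> ['search']
```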

Require usability metrics as part of the contract deliverables.  Future program revisions should include clauses that mandate the use of common interface heuristics.  Metrics should assess the number of dialogues, steps, or clicks required to perform basic tasks, receive help, or conduct file operations.  Contractors must conduct extensive beta testing with Soldiers, and Soldier feedback should be incorporated into the finished product.
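
A minimal sketch of how such a metric might be scored appears below; the log format, task names, and the 15-click ceiling are invented for illustration and are not drawn from any actual contract.

```python
# Hypothetical sketch: score a clicks-per-task usability metric from
# instrumented beta-test sessions and flag tasks that exceed a contract
# ceiling. Log format, tasks, and threshold are illustrative only.
from collections import defaultdict
from statistics import mean

# Each record: (task name, clicks a tester needed to complete it)
session_log = [
    ("post graphic overlay", 14),
    ("post graphic overlay", 11),
    ("export intel summary", 32),
    ("export intel summary", 29),
]

MAX_CLICKS_PER_TASK = 15  # example ceiling written into the deliverable

clicks_by_task = defaultdict(list)
for task, clicks in session_log:
    clicks_by_task[task].append(clicks)

for task, clicks in sorted(clicks_by_task.items()):
    avg = mean(clicks)
    status = "PASS" if avg <= MAX_CLICKS_PER_TASK else "FAIL"
    print(f"{task}: {avg:.1f} avg clicks -> {status}")
```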

Code the software using open standards, and design it for user modding.  Army software is often locked into vendors' proprietary specifications.  The commercial software and gaming industries, by contrast, often encourage add-ins and “mods” that enhance the user experience, and in many cases these mods are incorporated into later versions of the programs.  The Army should leverage the ingenuity of its tech-savvy Soldiers to constantly improve and refine the systems they operate.
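
The sketch below shows one way a mod-friendly design could work: the host program publishes a single small extension point, and Soldier-written modules dropped into a designated package are discovered and loaded at startup. The interface, package name, and discovery mechanism are assumptions for illustration, not features of any fielded system.

```python
# Hypothetical sketch of a mod-friendly extension point: the host program
# defines a small, stable interface, and user-written mods placed in a
# package are discovered and registered at startup. Names are illustrative.
import importlib
import pkgutil
from typing import Protocol


class ReportPlugin(Protocol):
    """The only contract a mod must satisfy."""
    name: str

    def render(self, events: list) -> str: ...


def load_plugins(package_name: str = "user_mods") -> list:
    """Import every module in the mods package and collect its PLUGIN object."""
    plugins = []
    try:
        package = importlib.import_module(package_name)
    except ModuleNotFoundError:
        return plugins  # no mods installed
    for info in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(f"{package_name}.{info.name}")
        plugin = getattr(module, "PLUGIN", None)
        if plugin is not None:
            plugins.append(plugin)
    return plugins


if __name__ == "__main__":
    for plugin in load_plugins():
        print("loaded mod:", plugin.name)
```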

Design software to export information into Microsoft Office formats.  Love it or hate it, the Army runs on MS Office, especially PowerPoint.  Information from battle command systems must export seamlessly for presentation to senior officials.  Today, Soldiers spend countless hours manually transcribing information out of ABCS systems and into the presentation-friendly formats that senior commanders demand.
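
As a trivial illustration of the difference between screenshots and reusable data, the sketch below (field names and records invented) writes tracked events to a CSV file that Excel opens directly and that can feed a PowerPoint chart, the kind of one-step export that would replace hours of manual transcription.

```python
# Hypothetical sketch: export tracked events to a CSV file that Excel can
# open directly, instead of forcing analysts to retype the data by hand.
# Field names and records are invented for illustration.
import csv

events = [
    {"date": "2013-06-01", "grid": "42S VB 12345 67890", "type": "IED find"},
    {"date": "2013-06-02", "grid": "42S VB 23456 78901", "type": "small-arms fire"},
]

with open("sigacts_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "grid", "type"])
    writer.writeheader()
    writer.writerows(events)

print("wrote", len(events), "rows to sigacts_export.csv")
```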

Conclusion

The DCGS-A/Palantir debate unintentionally highlighted the problem of usability in major DoD software systems.  Soldiers overwhelmingly reject equipment and systems that are difficult to employ.  The Army must refocus on usability if it truly wants to maximize the capabilities of ABCS and its successors.  Program managers must hold software designers accountable and provide clear incentives to build usability into each platform from the ground up, rather than as an afterthought.  Failure to do so will mean increased costs, unnecessary training hours, and frustration for those tasked to use these systems every day.

About the Author(s)

Major Crispin Burke, Major James King, and Lieutenant Colonel Niel Smith are serving Army Officers with extensive experience using Army information systems in garrison and combat.  The opinions in this article are their own and do not reflect the position of the Department of Defense or the U.S. Government.

Comments

kofinsoyameye

Sun, 11/03/2013 - 6:00pm

I am extremely surprised that the Army should have such technical problems with the DCGS-A, in today's Service-Oriented Architecture (SOA)-Cloud Based Systems-of-Systems (SoS). What happened to the DOD's Net-Centricity concepts which were supposed to fix all the DOD's stove-piped SoS? I have already published such an architecture in the International Test and Evaluation Association's (ITEA's) journal, which the Army can follow to fix the design and implementation of such a system -- DCGS-A -- as part of the overall DOD's Net-Centric SoS! The title of the paper is "Methodology for Designing and Evaluating Reliability of the DoD Net-Centric Ecosystem." If the Army follows the "Net-Centric Warfare theory" as the technical basis for designing all its future SoS, such problems with DCGS-A should never occur! The Warfighter, not the contractor, should define the functional requirements of every DOD SoS! More importantly, the Warfighter should be part of the design, testing and evaluation of any new Army SoS, before the SoS is finally deployed on the battlefield! Such a new thinking of the Warfighter's involvement throughout the design, testing and evaluation of any new SoS is the main reason the Office of the Secretary of Defense (OSD) has embraced ITEA's new model!

USMC MAGTFerist

Mon, 11/04/2013 - 10:28pm

In reply to by longlostfromnh

longlostnh,

Thanks for your comments as well. You're obviously knowledgeable on the roots of CPOF...they had a concept they advertised called topsight or something, which referred to the ability to visualize a tremendous amount of information. UI design is critical and it often fails in providing the user some simple "paradigms" to view information. The display must be rapidly changeable and provide the user good cues on where he is and where he came from...I liked OWF because of the concept it enabled: essentially you had a way to deliver applications from various programs into a somewhat unified whole, while providing some tools to deliver new applications and collect metrics on application use. I'd like to hear your views on how we achieve those goals using a better approach than OWF. Thanks again.

longlostfromnh

Sun, 11/03/2013 - 8:51pm

In reply to by USMC MAGTFerist

True, there was only ever a "Windows" version (of its client and server), but the vendor could easily have made the client available as a Java WebStart (CPOF is implemented in Java) - available to run on any OS; however the government chose not to fund this, and Maya Viz was then bought by GD, which wouldn't write a line of code without getting money from the government. The authors, in their assertion that CPOF is based on "Unix", I believe were (poorly) trying to depict its user interface/user experience, which (mostly) lacks menus and provides a new user-centric paradigm, as you state in your comment. If one were to say what CPOF was based on, the only answer would be the work of Dr Steve Roth, a cognitive psychologist who taught at CMU.

Thank you for depicting everything I was thinking as I read this article, except for your promotion of OWF. Even if it is open source, it's a pretty horrible technology.

USMC MAGTFerist

Sun, 11/03/2013 - 3:47pm

I've got to take issue with some of this article. First, CPOF was not introduced as a program of record. It was developed by a small company named Maya Viz, which was purchased by GD in 2005. It is also not "based on Unix." There was never a Unix/Linux or any version of it besides Windows. CPOF was universally acclaimed due to its powerful visualization tools and intuitive user interface. When I ran training for CPOF, we could teach a basic user in less than a day, an advanced user in two days, and a sysadmin in a week. This was a huge improvement over systems like MCS or the Marine Corps' C2PC application. CPOF was in many ways the "insurgent application" that upset the apple cart for the programs of record as Palantir is now. The authors were probably still in high school, college or OBC back then.

I recall we did extensive testing of ABCS (version 6, I think) during Millennium Challenge 2003. At that time, the Army was trying to create a back-end component called the Joint Common Database (JCDB), which certainly was neither joint nor common. That was a failure. It was obvious even then that the days of monolithic applications like MCS and ASAS were numbered as new approaches to application development and UI design (fostered by the Wild West atmosphere of the Internet) were gaining popularity and notice. What this led to was a fairly broad divergence in UI design, but also many improvements in what I call the user paradigm. CPOF was, at the time, the best example of a new, highly effective user paradigm, much as perhaps Palantir is now.

The authors' suggestions are old ideas and they would have little impact on the identified challenges. The idea of establishing a common Army user interface is laughable...how about a common NATO interface or a common DoD interface or even a common Army-Marine Corps interface? It sounds great, but it is impossible to accomplish and it would be a disaster if we tried. Forcing every software application to conform to a rigid set of UI design guidelines would simply leave us where we were in the mid-90s...with horrible UIs that were difficult to train. Better to allow sufficient freedom so that "insurgent applications" aren't blocked due to lack of conformance with the rigid UI standard. Somehow all the current generation of young soldiers and Marines manages to use a wide variety of UIs in their digital lives, yet no one has made a dime trying to teach Twitter or Facebook to them (because they don't really need it).

As far as UI metrics, that's a real challenge. Much of it depends on the complexity of the underlying application and particularly on the processes it supports. Take AFATDS for example - it provides technical and tactical fire direction, which is a very complex set of functions that artillerymen spend years becoming proficient at. How do you put that into a "Twitter"-like UI? So UI metrics would be a huge challenge. Better to foster competition among application developers and let the cream rise to the top.

I've been on both sides of this issue as a user and systems engineer on various C2 programs, and I've seen where the vision has become disconnected from the end result. As frustrating as that is, there are some common reasons that it occurs. A big reason is that the teams developing applications are simply too large, but with too few high performers on them (in UI design, SW engineering, systems engineering, testing, and program management). There is no way our current acquisition process will allow us to select high-performing teams and consistently weed out the average or below average teams. Another reason is that programs are often underfunded, which means that program managers must make cuts in areas such as UI design to ensure that the program delivers something to the users. When your job is on the line, you're going to deliver what is absolutely required first before you worry about the "bells and whistles" (which is where UI design often falls).

As far as designing apps to export into MS office apps, it was a nice idea 10 years ago, but we're really beyond that now. In fact, CPOF was praised because we could brief a BUB using it without needing to dump screenshots into PowerPoint. We want data to be reusable as data...once you dump something into PowerPoint, then it is just pixels on the screen. Nothing you do in PowerPoint or Word can really be reused as data in any other application. Excel is the one exception in that it is a good way to visualize data, but how do you bring your value-added work back into an application without some control of the transformation process? Now think about where we're headed in this decade--towards mobile devices. Do you want the MS Office paradigm to follow us onto mobile devices? I know that I don't. You're trying to fix the problems of the last decade, not this one.

Designing for "user modding." That's a tough one...it sounds great, but how do you ensure that the user mods don't affect the underlying security of the application or the data it hosts? There has been a lot of effort to make applications more extensible, but the role of well-trained application developers, testers, and systems engineers can't be discounted. Where we could really improve is in requirements management. So many programs are crippled out of the starting gate due to lots of inexperienced people with lots of bright ideas generating requirements that add cost and do little to provide the functionality needed by the majority of users. CPOF has had this struggle ever since it became a POR.

With the exception of the DCGS UI, I'll add that the Army is actually doing a pretty good job. Even the DCGS backend is using well-known open source applications like Red Hat Linux, Hadoop and Apache Tomcat. The Army's effort to deliver apps through the Ozone Widget Framework is also a big step forward. But understanding how important that is requires a good understanding of the challenges facing PMs at PEO C3T and other like activities in the other services. If they were equipped, supported, and incentivized to deliver better UIs, then we'd probably get better UIs. In other words, PEO C3T is getting a lot right in spite of the fiscally bizarre environment we're operating in.

Bottom line: the article presents well-known problems and recommends old ideas that really don't work. Time to catch up and think about recreating the competitive environment of the commercial application development ecosphere, so that we can recognize and reward high-performing companies and application development teams. We also need to toss MS Windows in the dustbin and use as many open-source components as possible at every layer.

Tim Wilson

Sun, 11/03/2013 - 2:45pm

Nailed it... DCGS has significant issues dating back to the days of ASAS (All-Source Analysis System). In fact, when DCGS came out, it initially operated off old ASAS software renamed and repackaged as DCGS.

The support tail/maintenance is another issue; the system is designed to require contractor support. Responsiveness by the PM to get bugs fixed or software added is slow or next to non-existent. There are too many contract companies involved, to the point where there is proprietary bickering.

The complexity of DCGS is overwhelming. Individual toolsets on the system are excellent but require a PhD to operate. Interoperability between the different software on the system and other ABCS systems is spotty at best, possibly a result of multiple contract companies handling various parts of the system. There is an enormous disconnect between the program engineers and the users about what actually works in the field as opposed to what works in a canned environment.

There is no way ahead on how the Army as a whole wants to employ this system. When it is fielded there is no standard; I've seen three different fieldings and all were different. But one thing is consistent: you will hear the party line "based on your unit SOP," which in code means "we don't know jack about applying the system and can only teach you what each button does."

Cavguy

Fri, 11/01/2013 - 12:03pm

In reply to by Gian P Gentile

Sir,

Will shoot you a note. Been too long.

Gian P Gentile

Fri, 11/01/2013 - 11:49am

Fellas:

Nice piece, it explained a lot of things that I was unclear on after following the disagreement between General O and Rep Hunter.

And wait a minute, Niel Smith is now Lieutenant Colonel? (congratulations!) Boy, where does the time go?

gian