Sensor overload

By Mr. Michael Pellicano and Ms. Danielle Duff, CERDEC
January 31, 2018

GATHERING INTEL—THEN WHAT?
A Soldier with the Regimental Engineer Squadron, 2nd Cavalry Regiment assembles an RQ-11 Raven unmanned aerial vehicle during a surveillance mission in May during Saber Junction 17 at the Hohenfels Training Area, Germany. Saber Junction is designed t... (Photo Credit: U.S. Army)

GETTING THE BIG PICTURE
Spc. Clayton P. McInnis, a human intelligence analyst with 1st Battalion, 155th Infantry Regiment of the Mississippi Army National Guard, reviews reports in the unit's tactical operations center in June, at the National Training Center, Fort Irwin, C... (Photo Credit: U.S. Army)

ALL TOGETHER NOW
An AH-64 Apache attack helicopter provides security while CH-47 Chinooks drop off supplies to Soldiers with Task Force Iron at Bost Airfield, Afghanistan, in June 2017. The Soldiers' mission is to provide accurate fires capabilities in support of Tas... (Photo Credit: U.S. Army)

An S&T objective looks to 'ExPED'ite and improve processing and exploitation of the avalanche of raw intelligence data.

At the tactical level, a commander depends on an accurate and timely depiction of battlefield situational understanding on the common operating picture (COP) to support decision-making. This picture is shaped directly by intelligence analysts working through an institutionalized workflow, known as tasking, collection, processing, exploitation and dissemination (TCPED), that turns raw intelligence data into actionable intelligence, which is then fed into the COP.
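
The stages of that workflow can be sketched as a simple pipeline. The sketch below is purely illustrative; the stage names follow the article, but the data structures and functions are hypothetical, not drawn from any fielded system.

```python
from dataclasses import dataclass

@dataclass
class Report:
    sensor: str
    raw: str
    processed: bool = False
    exploited: bool = False

def process(report: Report) -> Report:
    # Processing: convert raw collection into an analyzable form.
    report.processed = True
    return report

def exploit(report: Report) -> Report:
    # Exploitation: an analyst (or analytic) extracts meaning.
    report.exploited = True
    return report

def disseminate(report: Report, cop: list) -> None:
    # Dissemination: only fully exploited intelligence reaches the COP.
    if report.processed and report.exploited:
        cop.append(report)

cop = []
for raw in ["radar track", "video clip"]:   # collection (tasked earlier)
    disseminate(exploit(process(Report("uav", raw))), cop)

print(len(cop))  # prints 2: both reports reach the COP
```

The bottleneck the article describes appears when collection outpaces the `process` and `exploit` stages, leaving reports that never reach `disseminate`.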

Over the past decade, driven by the demands of war and by technological advances, significant improvements in sensors and collection platforms have produced collection systems that generate extraordinarily large amounts of data, with the potential to provide a richer and more accurate understanding of the battlefield. Unfortunately, this wealth of data overwhelms analysts' ability to turn it into actionable intelligence. To put the problem in perspective, William M. Arkin writes in his book "Unmanned: Drones, Data, and the Illusion of Perfect Warfare" that "the amount of visual data collected each day [is] five seasons' worth of every professional football game played - thousands upon thousands of hours."

But drones are just one of the many sensors on the battlefield. Arkin notes that "the next generation of wide-area motion imagery sensors will be capable of collecting 2.2 petabytes per day, bringing 450 percent more data into the network than all of Facebook adds on a typical day." As a result, data is left unprocessed, unexploited and unavailable for future analysis. This inefficiency leads to gaps in situational awareness and sometimes duplicative collections.

The Defense Science Board in February 2011 came to a similar conclusion, stating: "[T]he rapid increase of collected data will not be operationally useful without the ability to store, process, exploit, and disseminate this data. ... Current collection generates data that greatly exceeds the ability to organize, store, and process it." There are not, and never will be, enough analysts to review the massive amount of raw intelligence collected on the battlefield.

To complicate this already difficult problem, the Army is consolidating analytic personnel, setting up centralized sites outside of conflict zones where specialized Soldiers can support operations by focusing on exploiting sensor data. However, legacy systems were not designed to move this amount of data across the network or support the collaborative analyst workflows needed to support decentralized processing, exploitation and dissemination (PED).

The Intelligence and Information Warfare Directorate of the U.S. Army Communications-Electronics Research, Development and Engineering Center (CERDEC), a subordinate organization of U.S. Army Materiel Command's Research, Development and Engineering Command, initiated the Extensible Processing Exploitation and Dissemination (ExPED) Science and Technology Objective (STO) in October 2016 to improve the process of converting raw sensor data into usable situational understanding. An STO is a three- to five-year critical science and technology (S&T) project with direct oversight from the Warfighter Technical Council, a one-star-level governing body that addresses strategic program topics and recommends and reviews major new S&T investment efforts. An STO comprises several research focus areas combined under one program to work collaboratively on high-priority Army capability gaps; for ExPED, that gap is "developing situational understanding."

The program title, ExPED or Extensible PED, refers to the desired capability to adapt Army PED operations based on mission needs and available resources such as sensors, computers and human analysts. Under optimal conditions, tactically deployed intelligence analysts will develop and refine the intelligence COP by combining data from multiple organic and strategic sensors with the help of advanced processing resources and subject matter experts who may be distributed around the world. The tools used to perform PED must support these distributed workflows and also adapt to more constrained conditions where networks or limited timelines don't allow for an enterprise solution.

The ExPED effort began with an intensive study of the PED process: observing and interviewing analysts to determine what architectures, systems and sensors exist in the tactical environment and how those capabilities are used to create intelligence products. The program then built a PED model and ran it through different scenarios to see where breakdowns might occur. Along the way, this analysis identified three top-level problems: the lack of automated processes to support multiple-sensor and multiple-intelligence (multi-INT) data; the high probability of missing significant events as data volume increases; and the lack of awareness of sensor collection plans.

With these findings, the ExPED team launched an extensive systems engineering process that identified basic PED use cases and then developed sequence diagrams to define how current PED processes function and to identify areas where applying S&T resources could yield a high payoff in PED workflows. The program designated three focus areas: PED architectures; data processing and analytics; and collaboration and visualization.

OPEN ARCHITECTURES IMPROVE INTEROPERABILITY

Recent combat operations forced an intent focus on immediate PED needs: narrowly targeted, evolutionary improvements made without appreciation for broader capability alignment, integration into the intelligence, surveillance and reconnaissance (ISR) enterprise, or life cycle cost. For the sake of speed, new sensor systems were developed and fielded as stovepiped systems, each with a dedicated processing system and a dedicated analyst. This allowed for faster design, development and testing, because the engineers controlled all aspects of the system. In addition, sensors and PED systems are stovepiped within security boundaries because of the classification of the systems or of the data they collect. Yet valuable information could be shared across those boundaries if the proper processes were in place.

These stovepipes hinder the ability to conduct multi-INT analysis, to hand off targets between sensors (cross-cueing) or to share data with other systems. Stovepiped systems also impose unscalable and unsustainable costs across the doctrine, organization, training, materiel, leadership and education, personnel and facilities aspects of maintaining the ISR enterprise.

Instead, sensor solutions need to use industry standards, be scalable (capable of handling a growing amount of work) and be built on open architectures designed to support rapid integration of new capabilities by making it easy to add, upgrade and swap components. These architectures should adapt to the echelon in which they operate, provide a framework for distributed PED and facilitate integration with other systems.

Data services, an essential architectural component, must manage data and deliver it to the right user, including enabling access to joint, interagency, multinational, NATO, allied and national operations. Some currently fielded sensor architectures provide sensor data and status. However, these architectures are not tailored for tactical environments with limited communications, cannot be easily reconfigured during missions and are not designed to support multi-INT fusion: the process of comparing and correlating data from multiple sources and disparate types, including human inputs, collected signals, measurements and imagery, and then generating more useful observations.

The ExPED program is investigating and developing sensor architecture prototypes that will dynamically tie together PED resources (sensors, analytics and analysts) across the tactical space. This will provide the ability to reconfigure resources in changing conditions and make better use of constrained tactical bandwidth, thus increasing awareness and discovery of significant events.
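
One way to picture the kind of resource brokering such prototypes aim at: decide where each analytic runs based on whether the tactical link can carry the raw feed. The function, names and thresholds below are invented for illustration and are not drawn from the ExPED design.

```python
def place_analytic(feed_rate_mbps: float, link_rate_mbps: float) -> str:
    """Return where to run the analytic for one sensor feed."""
    if feed_rate_mbps > link_rate_mbps:
        return "at-sensor"       # process forward, send only detections
    return "ground-station"      # raw feed fits the link; exploit in the rear

# A wide-area imagery feed overwhelms a constrained tactical link...
assert place_analytic(feed_rate_mbps=800.0, link_rate_mbps=5.0) == "at-sensor"
# ...while a narrowband signals feed can be shipped back whole.
assert place_analytic(feed_rate_mbps=0.5, link_rate_mbps=5.0) == "ground-station"
```

In a real architecture this decision would be revisited continuously as links degrade or recover, which is what "reconfigure resources in changing conditions" implies.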

REDUCING ANALYST WORKLOADS

The Army continues to add sensors capable of collecting ever greater volumes of data, but we can't afford to move all of that data around our networks, and we don't have enough analysts to look at all of it. Analytics provide process automation, smart logic, computation and threat trending that expose nuggets of relevant information to analysts.

Taskable automated and semiautomated multi-INT analytics and processing, whereby one user (or several simultaneously) can seek and detect particular features for a particular mission or at a specific time (for example, a red truck, or people in white shirts), are needed closer to the sensor to increase the Army's ability to manage and exploit the breadth and scale of collected data. Distributed data processing, which uses multiple computers across different locations to divide the processing load, can reduce network traffic by filtering and compressing data as it moves through the network, improving system performance in bandwidth-limited environments. These capabilities will also create opportunities to leverage remotely stored data to glean new insights.
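
A minimal sketch of a taskable filter running near the sensor, in the spirit of the paragraph above: an analyst tasks a feature, and only matching frames move across the tactical link. The "red truck" test is a stub standing in for a real detector; all names and record formats are illustrative.

```python
def make_task(feature: str):
    """Build a predicate that keeps only frames matching a tasked feature."""
    def matches(frame: dict) -> bool:
        return feature in frame.get("detections", [])
    return matches

def edge_filter(frames, task):
    """Forward only matching frames, cutting traffic on the tactical link."""
    return [f for f in frames if task(f)]

frames = [
    {"id": 1, "detections": ["red truck"]},
    {"id": 2, "detections": ["blue car"]},
    {"id": 3, "detections": ["red truck", "person"]},
]

forwarded = edge_filter(frames, make_task("red truck"))
print([f["id"] for f in forwarded])  # prints [1, 3]: only tasked detections move
```

Here two of three frames cross the network; at operational data volumes, that kind of reduction is the difference between a usable link and a saturated one.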

ExPED is investing in the development of prediction, fusion, correlation and alerting capabilities that are critical to managing the big data challenge and are necessary to reduce an analyst's workload. ExPED is working with Army and industry stakeholders to define standards for analytic interoperability so that more sophisticated mission-specific solutions can be built from existing analytic toolsets.

To validate these standards, the ExPED program is developing multi-INT analytics that merge radar tracks, full-motion video and electronic signals to provide greater confidence in the data and to shorten the time to alert on significant events. The analytics also need to be scalable and extensible so that users can execute them wherever it makes sense across the tactical space. For example, an analytic could run on a multisensor platform, at a ground station or in sanctuary, depending on the mission's concept of operations and communications links.
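
The core of such merging is correlation in time and space: observations from different INTs that fall close together likely describe the same event, and agreement across INTs raises confidence. The sketch below is a hedged illustration; the thresholds, record formats and greedy grouping are invented, not the ExPED analytics.

```python
import math

def close(a, b, max_km=1.0, max_s=30.0):
    """True when two observations agree in space and time."""
    dist_km = math.dist((a["x"], a["y"]), (b["x"], b["y"]))
    return dist_km <= max_km and abs(a["t"] - b["t"]) <= max_s

def correlate(observations):
    """Group mutually close observations; more INTs in a group -> higher confidence."""
    groups = []
    for obs in observations:
        for g in groups:
            if all(close(obs, other) for other in g):
                g.append(obs)
                break
        else:
            groups.append([obs])
    return groups

obs = [
    {"int": "radar",  "x": 0.0,  "y": 0.0,  "t": 0},
    {"int": "video",  "x": 0.2,  "y": 0.1,  "t": 10},
    {"int": "sigint", "x": 0.1,  "y": 0.3,  "t": 20},
    {"int": "radar",  "x": 50.0, "y": 50.0, "t": 0},   # unrelated track
]

groups = correlate(obs)
print(sorted(len(g) for g in groups))  # prints [1, 3]: three close hits fuse into one group
```

A group backed by radar, video and signals would alert with far more confidence than any single-INT hit, which is the point of fusing the feeds.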

COLLABORATION AND VISUALIZATION

As the Army moves toward centralized PED sites, collaboration will become even more important. The Army has been realigning how it organizes and employs its human analysts as part of the PED process. One approach is to set up centralized sites outside of conflict zones where specialized Soldiers can focus on exploiting sensor data and feeding situational awareness back to theater. However, bandwidth constraints will limit the scalability of this solution, and analysts who are not on the ground lack the mission context to fully exploit the data.

Reliance on the current patchwork of countless chat windows for collaboration is inefficient and does not scale. The Army therefore requires a solution that allows PED operations to move seamlessly between tactical and remote PED analysts.

Usability and software interface design are critical for handling, filtering and understanding the data and analytics, as well as providing an environment for analysis and user collaboration. Development and integration of techniques for big data visualization, collaboration and workflow management are essential for common understanding. These tools will enable management of tasks across echelons, provide mission context to facilitate situational understanding and reduce cognitive burden on analysts.

The ExPED program is developing a sensor COP to support all parts of the PED process, from tasking sensors to exploiting data to using the resulting intelligence. This includes developing an interface that is tailorable to every user in the PED process, including mission managers, exploitation analysts and analysts at every echelon. The ExPED sensor COP is a shared collaborative environment where all parties can interact and carry out their respective tasks and workflows, in real time if communications allow.

The ExPED sensor COP is extensible, allowing applications to be built into it. This will let data move from one phase of PED to the next, with collaboration along the way, and will automate tasking and other processes to reduce analyst workload.
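
One common way to read "extensible, allowing applications to be built into it" is a plug-in registry, where each PED application hooks into a shared environment and sees the same events. The registry pattern below is our illustration of that idea, not the ExPED design; all names are hypothetical.

```python
class SensorCOP:
    """A shared environment that PED applications plug into."""

    def __init__(self):
        self.apps = {}
        self.events = []

    def register(self, name, handler):
        """Plug a PED application into the shared environment."""
        self.apps[name] = handler

    def publish(self, event):
        """Push an event through every registered application."""
        self.events.append(event)
        return {name: handler(event) for name, handler in self.apps.items()}

cop = SensorCOP()
cop.register("tasking",      lambda e: f"retask toward {e['area']}")
cop.register("exploitation", lambda e: f"queue {e['kind']} for review")

result = cop.publish({"kind": "video", "area": "NAI-3"})
print(result["tasking"])  # prints: retask toward NAI-3
```

Because every application receives the same event, tasking, exploitation and dissemination tools stay synchronized on one picture instead of each maintaining its own.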

CONCLUSION

Current Army PED operations are not extracting the maximum amount of intelligence from existing sensors. The Army can get additional value by better leveraging the opportunity for multi-INT processing and exploitation, cross-cueing between sensors, forensic analysis and increased awareness and use of available resources.

The S&T community has the opportunity and imperative to work outside the narrow bounds of acquisition programs of record in order to design and demonstrate standards-based interoperable systems. By implementing a common framework of interoperable PED components, such as those being developed and demonstrated under the ExPED STO, Army PED operations will realize improvements in efficiency and capability such as:

- Moving processing closer to sensors to improve the timeliness of actionable intelligence and reduce the bandwidth needed to transmit raw data.

- Automating or semiautomating cross-cueing of sensors for faster target acquisition and tracking.

- Using advanced analytics to increase the speed and effectiveness of extracting intelligence from high-volume, high-speed sensor feeds.

- Better leveraging distributed sensors, processing systems and analysts to execute ISR missions.

Commanders rely on situational understanding to make timely decisions, but more data does not equal situational understanding. Understanding will be accomplished only by providing analysts with the tools to process, exploit and disseminate the extensive amount of sensor data collected across the battlefield.

For more information or to contact the authors, go to www.cerdec.army.mil.

MR. MICHAEL PELLICANO is a lead engineer in the CERDEC Intelligence Systems and Processing Division. He holds an M.S. in electrical engineering from Stevens Institute of Technology, and an M.S. in business administration and a B.S. in electrical engineering from Penn State University. He is Level III certified in engineering and is a member of the Army Acquisition Corps (AAC).

MS. DANIELLE DUFF is a senior engineer who oversees the research portfolio for CERDEC's Intelligence and Information Warfare Directorate, Intelligence Systems and Processing Division. She holds a master of electrical engineering from the University of Delaware and a B.S. in electrical engineering from Virginia Tech. She is Level III certified in engineering and in test and evaluation, and is a member of the AAC.

This article is published in the January - March 2018 issue of Army AL&T magazine.

Related Links:

U.S. Army Acquisition Support Center

Army AL&T Magazine

U.S. Army Communications-Electronics Research, Development and Engineering Center (CERDEC)