Wide Area Monitoring and Alerting: Visualization for Real Time ISR Sensemaking and Situational Awareness


TECHNOLOGY AREA(S): Human Systems 

OBJECTIVE: Accelerate utilization of autonomous tools in analytic environments by designing and building a prototype decision support system that enhances Sensemaking and Situational Awareness (SA) for real time wide area Intelligence, Surveillance, and Reconnaissance (ISR) missions. 

DESCRIPTION: Sensors and sensor networks are increasingly built on the assumption that the Intelligence Community will “collect everything and make sense of it later”, an approach that contrasts sharply with classic ISR methods, in which collection is focused on specific targets of known or suspected intelligence value. The resulting Big Data generated by modern collection methods are often beyond the capacity of individual analysts or analyst teams to comprehensively monitor or make sense of using traditional analysis methods, particularly in real time or near real time. In such operational environments, efficiently managing the analyst’s limited attentional resources is critical to enabling future operational constructs, an issue consistently overlooked by present-day ISR capability developers. This effort aims to provide explicit decision support for attention direction and management by designing analysis-focused displays that enhance the prioritization, triage, and sensemaking activities required to establish and maintain SA, at scale, for wide area monitoring-focused ISR missions.

This approach builds on an increasingly widespread recognition that effective data representation and visualization – and not simply making more data available – plays a critical role in mission success: “…performance can be improved by providing displays that allow the observer to utilize more efficient process of perception and pattern recognition instead of requiring the observer to utilize the cognitively intensive processes of memory, integration, and inference.” [1] “Analysts must pair their intuition and expertise with data science to give decision-makers the best possible intelligence.” [2]

This proposal seeks to develop a next-generation wide-area monitoring concept. A successful ISR decision support capability will:

(1) Represent fused data in “operationally meaningful” frames of reference defined with respect to mission objectives. “Universal” representations such as maps, timelines, and lists serve as supporting views rather than primary user interfaces.

(2) Support initial analysis at the metadata level. Basic units of data should be interlinked data elements such as events, patterns, or narratives. Data at the level of “eaches” (e.g., an image, a signal) are available by inspection rather than directly represented.

(3) Integrate analytics that amplify analyst sensemaking by pre-processing data beyond what an unaided analyst can perform. Robust human-machine teaming principles should be employed when integrating these technologies, including support for the observability and directability of automated system components.

(4) Communicate active coverage, as well as coverage gaps, via visual encodings.

(5) Provide operators with a mechanism to change or refine parameters of meaningfulness over time; systems should learn and adapt performance based on this user feedback.

Continued progress toward rich human-machine collaboration that enhances the performance of the overall ISR system depends on deepening the research base for how to effectively team analysts with advanced decision support analytics; this proposal for research represents a concrete response to that need. The resulting technologies should support ISR missions across the Air Force by enabling Airmen to actively exploit increasingly large and dynamic data streams. The developed concepts and capabilities should also have applicability to industry environments where real time monitoring and analysis of wide area data streams are critical.

No government furnished materials, equipment, data, or facilities will be provided. 
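To make requirement (5) concrete, the sketch below shows one minimal way a triage display could refine its “parameters of meaningfulness” from analyst feedback: events are scored by a weighted sum of features, and a relevance judgment nudges the weights. All names, features, and the update rule are illustrative assumptions, not part of this topic.

```python
# Toy model of feedback-driven event triage (illustrative only; the feature
# names, weights, and update rule are assumptions made for this sketch).
from dataclasses import dataclass, field

@dataclass
class TriageModel:
    # Weight per feature; a higher weight makes that feature count more
    # toward an event's display priority.
    weights: dict = field(default_factory=lambda: {
        "proximity": 1.0, "novelty": 1.0, "signal_strength": 1.0})
    learning_rate: float = 0.2

    def score(self, event: dict) -> float:
        """Priority score: weighted sum of feature values in [0, 1]."""
        return sum(self.weights[k] * event.get(k, 0.0) for k in self.weights)

    def feedback(self, event: dict, relevant: bool) -> None:
        """Analyst marks an event (ir)relevant; nudge weights toward or
        away from the features that drove that event's score."""
        sign = 1.0 if relevant else -1.0
        for k in self.weights:
            self.weights[k] = max(
                0.0,
                self.weights[k] + sign * self.learning_rate * event.get(k, 0.0))

model = TriageModel()
event = {"proximity": 0.9, "novelty": 0.1, "signal_strength": 0.5}
before = model.score(event)
model.feedback(event, relevant=False)  # analyst dismisses events like this
after = model.score(event)
assert after < before  # similar future events are now de-prioritized
```

A fielded system would use a far richer model, but the loop is the same: the operator's judgment, not a fixed threshold, determines what the display surfaces next.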

PHASE I: Develop wide area monitoring display concepts that mitigate the limitations of contemporary data visualization and fusion techniques, which (1) fail to capitalize on data-driven reorientation opportunities enabled by wide area collection; (2) handle scale poorly and disrupt primary workflows; and (3) are difficult to direct or re-task as the mission environment changes. Phase I shall also include an experimental evaluation of Sensemaking, SA, and related measures. 

PHASE II: Focus on rapid design and demonstration of a human-machine teaming concept that overcomes, rather than simply mitigates, wide area monitoring data overload challenges. Promising concepts from Phase I will be identified based on results of Phase I assessment as well as expert analyst feedback. These concepts will be developed into a software prototype that visualizes data from one of three specific domains of interest: SIGINT, Motion Intelligence (WAMI, GMTI), and/or OPIR. Phase II shall include a mission-focused evaluation of capabilities based on representative mission data. 

PHASE III: Extend successful Phase II decision support capabilities to integrate data from multiple wide area sources. This multi-INT prototype shall monitor and organize background data with limited direct tasking from analysts. In addition, interface and interaction concepts should support exploitation of data types beyond those of the Phase II prototype. 


1: Bennett, K. B., & Flach, J. M. (1992). Graphical displays: Implications for divided attention, focused attention, and problem solving. Human Factors, 34(5), 513-533.

2: Brown, J. M. (2017). The Data-Driven Transformation of Intelligence. The National Interest. Retrieved February 25, 2017.

3: Woods, D. D., & Hollnagel, E. (2006). Joint Cognitive Systems: Patterns in Cognitive Systems Engineering. CRC Press.

4: Air Force ISR 2023: Delivering Decision Advantage.

KEYWORDS: Decision Aiding, Decision Support System, Situation Awareness, Sensemaking, Monitoring, Alerting 


Taylor Murphy, Ph.D. 

(937) 255-8814 
