
Automated Assessment of Urban Environment Degradation for Disaster Relief and Reconstruction



OBJECTIVE: Develop algorithms that fuse observables from overflight operations (which could include panchromatic, IR, SAR, and MSI data) with data collected from ground sources (such as video from vehicle and drone cameras, LiDAR, and other sensors) to automatically estimate the degradation of an urban environment due to battle or natural damage.

DESCRIPTION: Assessing urban damage due to wars, shelling, or natural disasters is currently a manual, labor-intensive process. It yields degradation estimates that are incomplete and low-confidence, and it typically operates at the scale of an entire city or village. Other approaches, such as bomb damage assessment (BDA), examine single targets to determine whether a single object has been sufficiently degraded. Neither approach addresses the time-dominant assessment required for disaster relief and emergency reconstruction. The increasing availability of commercial satellite and aerial sources (such as unmanned aerial vehicles deployed by news organizations) could permit rapid access to the information needed for automated damage assessment. Likewise, existing ground sensors and ground-based collection from first responders can provide rapid-response information; these sources can supplement overhead sources and support more accurate assessments. It is desirable to exploit all of these data sources as rapidly as possible. Among the outputs of interest are assessments of whether roads and passages are traversable, and whether individual buildings and residences are safe enough for entry or habitation. Automated analysis of fused observables from overhead and ground data would enable 3D assessment, leading to more accurate, higher-confidence estimates at the level of individual buildings, sections of roads and passageways, as well as blocks, villages, and cities.
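One way to picture the fusion step is as a confidence-weighted combination of per-building damage scores from each source. The sketch below is purely illustrative, not a prescribed method: it assumes each source (overhead or ground) has already been reduced to a damage score in [0, 1] with an associated confidence weight, and it penalizes the fused confidence when sources disagree.

```python
# Minimal sketch of confidence-weighted fusion of per-building damage scores.
# Assumptions (not part of the topic): each observation is a (score, confidence)
# pair with score in [0, 1]; the weighting and disagreement penalty are
# hypothetical placeholders for a real fusion algorithm.

def fuse_damage_scores(observations):
    """Combine (score, confidence) pairs into one fused damage estimate
    plus a fused confidence that drops when sources disagree."""
    total_w = sum(w for _, w in observations)
    if total_w == 0:
        return None, 0.0
    # Confidence-weighted mean of the damage scores.
    fused = sum(s * w for s, w in observations) / total_w
    # Weighted variance of sources around the fused score (disagreement).
    spread = sum(w * (s - fused) ** 2 for s, w in observations) / total_w
    # Average source confidence, discounted by disagreement.
    confidence = (total_w / len(observations)) / (1.0 + spread)
    return fused, confidence

# Example: overhead SAR suggests moderate damage, ground FMV suggests heavy.
score, conf = fuse_damage_scores([(0.4, 0.6), (0.8, 0.9)])
```

Because the ground observation carries more weight in this example, the fused score lands closer to the ground estimate than a plain average would.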

PHASE I: Create a proof of concept demonstrating the feasibility of fusing observables from overhead imagery (panchromatic, IR, SAR, and multispectral/MSI) and ground collection (full-motion video (FMV), LiDAR, still imagery) to generate an estimate of urban environment degradation based on 3D renderings of damage to structures.

PHASE II: Develop a prototype system that demonstrates the improved accuracy of automated 3D assessments of urban environment degradation through fusion of multiple data sources. The prototype should quantify the difference in assessment accuracy between fused overhead-and-ground observables and either source alone, and show how adding data sources improves accuracy relative to single-source assessments. The prototype should also be able to provide different levels of 3D assessment, since the real-world observables available from overhead and ground collects may vary.
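The accuracy comparison described above can be framed as a simple evaluation harness: score overhead-only, ground-only, and fused estimates against reference damage labels. The sketch below uses hypothetical data and a naive averaging fusion solely to illustrate the measurement protocol, not any particular algorithm.

```python
# Illustrative Phase II-style evaluation: compare estimate error against
# reference ("ground truth") per-building damage labels for overhead-only,
# ground-only, and fused outputs. All values here are made-up placeholders.

def mean_abs_error(estimates, truth):
    """Mean absolute error between estimated and reference damage scores."""
    return sum(abs(e - t) for e, t in zip(estimates, truth)) / len(truth)

truth    = [0.9, 0.1, 0.5, 0.7]   # reference per-building damage labels
overhead = [0.7, 0.2, 0.3, 0.6]   # overhead-only estimates
ground   = [0.8, 0.0, 0.6, 0.9]   # ground-only estimates
fused    = [(o + g) / 2 for o, g in zip(overhead, ground)]  # naive fusion

for name, est in [("overhead", overhead), ("ground", ground), ("fused", fused)]:
    print(f"{name:8s} MAE = {mean_abs_error(est, truth):.3f}")
```

With these placeholder numbers the fused estimates happen to beat both single-source baselines; a real prototype would run the same comparison over held-out collections of actual overhead and ground observables.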

PHASE III: These algorithms would be applicable to generating rapid estimates of the extent of terrestrial battlespace degradation due to air and ground operations, and also to automated 3D urban environment assessment for disaster management, mitigation, and reconstruction. Assessment of damage from wide-area disasters such as hurricanes, tornados, earthquakes, floods, and volcanoes is facilitated by combining overhead and ground data to yield an accurate estimate of how the environment has been degraded. These estimates would inform prioritization of recovery efforts and safe ingress/egress. Commercial efforts have largely relied on overhead data alone or on manual ground observation; fusing observables from a broader range of overhead and ground sources could save lives by speeding recovery efforts.


REFERENCES:

1. Voigt, S. et al., "Rapid Damage Assessment and Situation Mapping: Learning from the 2010 Haiti Earthquake," Photogrammetric Engineering & Remote Sensing, Sept. 2011, pp. 923-931.

2. Dong, L. and Shan, J., "A Comprehensive Review of Earthquake-induced Building Damage Detection," ISPRS Journal of Photogrammetry and Remote Sensing, 2013, 84, pp. 85-99, Elsevier.

3. He, M. et al., "A 3D Shape Descriptor Based on Contour Clouds for Damaged Roof Detection Using Airborne LiDAR Point Clouds," Remote Sensing, 2016, 8, 189.

KEYWORDS: Automated Analytics, 3D Assessments, Fused Observables, Urban Environment Degradation, Automated Damage Assessment


TPOC: Jennifer Stoll

Phone: (571) 557-6719
