Radar Data Fusion for Single Integrated Air Picture (SIAP)-Data Fusion and Registration (DataFusR) System

Award Information
Department of Defense
Solicitation Year:
Solicitation Number:
Missile Defense Agency
Award Year:
Phase II
Agency Tracking Number:
Solicitation Topic Code:
Small Business Information
Beeches Technical Campus, Rome, NY, 13440
Hubzone Owned:
Woman Owned:
Socially and Economically Disadvantaged:
Principal Investigator
 Andrew Drozd
 Chief Scientist
 (315) 334-1163
Business Contact
 Andrew Drozd
Title: President/Business Owner
Phone: (315) 334-1163
Email: adrozd@androcs.com
Research Institution
Real-time fusion of data collected from a variety of radars that acquire information from multiple perspectives and/or at different frequencies has been shown to provide a more accurate picture of the adversary threat cloud than any single radar or group of radars operating independently. In the ground midcourse problem, which involves the acquisition, tracking, and discrimination of multiple ballistic targets, it is important to distinguish between legitimate targets of opportunity and the threat cloud (clutter due to chaff, decoys, or other objects), and to track the true target(s) consistently and reliably throughout their midcourse trajectories. Hence, a “batched” detection approach using “cooperative” radars, together with distributed, decision-level algorithms, is key to resolving closely spaced objects within an expanding target object map area so that legitimate targets can be tracked continuously. The spatio-temporal-spectral characteristics of the problem can be exploited here to provide important strategic advantages. Significant advantages could also be gained by exploiting sensors of different modalities, such as overhead infrared (IR) and electro-optic (EO) sensors, in addition to ground-based radars operating at different frequencies. Such data fusion would require novel schemes for finely aligning (registering) and fusing multi-modality inputs to enhance the net accuracy of automated target detection, acquisition, tracking, and discrimination/recognition systems.

This effort is to update and further develop the Phase I DataFusR track fusion algorithms, software, and hardware concept designs needed to collect, process, and fuse information from multiple radars (at the same or different frequencies) to form a SIAP, and to demonstrate the prototype technology in a realistic environment, first using data from multiple radars and then showing real-time operation in a high-clutter environment.
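The abstract does not detail the registration schemes involved. As a minimal sketch of what sensor registration means here, the example below estimates and removes a constant translational bias between two radars' reports of the same objects via least squares; the function name, coordinates, and bias values are illustrative assumptions, not the DataFusR method.

```python
import numpy as np

def estimate_bias(reports_a, reports_b):
    """Least-squares estimate of a constant translational bias between
    two sensors' position reports of the same common objects.
    For a pure translation, the least-squares solution is the mean offset."""
    return np.mean(reports_b - reports_a, axis=0)

# Hypothetical paired reports (x, y in km) of three common objects
a = np.array([[10.0, 5.0], [12.0, 7.0], [15.0, 9.0]])  # radar A's frame
b = a + np.array([0.3, -0.2])                          # radar B, fixed offset
bias = estimate_bias(a, b)
aligned_b = b - bias  # register radar B's reports onto radar A's frame
```

Real registration would also need to handle rotation, timing offsets, and noisy, unassociated reports; this sketch only shows the simplest translational case.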
The approach will be extended in this effort to include other sources, such as next-generation radars and IR and EO sensors, to provide a versatile multi-modality sensor fusion capability that thoroughly addresses the present ground midcourse problem. The effort will concentrate on developing and applying multisensor track fusion algorithms to accurately characterize the threat cloud and to provide a more definitive means of distinguishing true target tracks from clutter object tracks. Technologies that enable this synergistic fusion and interpretation of data at several levels, from disparate GMD radars and other types of sensors, will enhance system acquisition, tracking, and discrimination of threat objects in a cluttered environment and provide enhanced battle space awareness. To accomplish this, the generalized solution developed in Phase I will be extended; it adapts an intelligent multi-source data fusion (MSDF) simulation scheme that automatically selects the most appropriate multi-target tracking, registration/fusion, and clutter rejection algorithms in forming a SIAP.
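The Phase I track fusion algorithms are not specified in this abstract. As a hedged illustration of decision-level track-to-track fusion, the sketch below uses covariance intersection, a standard way to fuse two radars' track estimates when their error cross-correlation is unknown; the state values and the fixed weight `omega` are assumptions for the example, not the DataFusR design.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega=0.5):
    """Fuse two track estimates (mean x, covariance P) with unknown
    cross-correlation. omega in [0, 1] weights the first track's
    information; the result is consistent for any valid omega."""
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P_fused = np.linalg.inv(omega * P1_inv + (1.0 - omega) * P2_inv)
    x_fused = P_fused @ (omega * P1_inv @ x1 + (1.0 - omega) * P2_inv @ x2)
    return x_fused, P_fused

# Two radars tracking the same object: 2-D position estimates in km.
# Radar A is accurate in y; radar B is accurate in x.
x_a = np.array([100.0, 50.0]); P_a = np.diag([4.0, 1.0])
x_b = np.array([101.0, 49.5]); P_b = np.diag([1.0, 4.0])
x_f, P_f = covariance_intersection(x_a, P_a, x_b, P_b, omega=0.5)
```

In practice `omega` would be chosen to minimize, e.g., the trace of the fused covariance, and the fusion step would sit behind track association and clutter rejection logic; this sketch shows only the fusion arithmetic.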

* information listed above is at the time of submission.
