
Multiphysics-based Sensor Fusion

Description:

OBJECTIVE: Develop new multiphysics-based sensor fusion algorithms that map disparate sensor fields into a single common multiphysics-based representation. This model can then be used to produce higher-quality sensor fusion data products.

DESCRIPTION: The proliferation of sensor systems has created a large volume of multi-sensor data across a number of physical fields (e.g., optical, EO/IR, hyperspectral, polarimetric, acoustic/seismic, RF, electromagnetic, mechanical, thermal, electrical, radiation). Classical methods of combining disparate data normally involve a highly nonlinear mapping between the actual measurements and an underlying parameterized target model [1]. Furthermore, each sensor source must be statistically characterized for good joint estimation (fusion) to be performed. Unfortunately, the fields (and consequently the resulting measurements) can be so dissimilar that traditional combining methods (e.g., extended Kalman filtering) [2] may perform suboptimally due to the approximations and assumptions required to map into conventional algorithms. Recent advances in physics modeling and efficient software simulation have given rise to the emerging field of multiphysics modeling [3], in which a single unified physics representation of the objects of interest is developed. In multiphysics-based sensor fusion, all sensor measurements are first mapped into a unified multi-field physics model, which is then used to generate estimates of the parameters of interest. For example, an IR/RF/polarimetric sensor suite (as may be employed in an urban warfare setting or by law enforcement) could be used to develop a multiphysics model for a set of objects (e.g., targets of concern and clutter (non-targets)), potentially producing more accurate parameter estimates. Additionally, advances in high performance embedded computing (HPEC) [4] make it possible to execute a number of multiphysics models in real time, an enabler for tactical multiphysics-based sensor fusion. A minimal illustrative sketch of this joint-estimation idea appears after the references.

PHASE I: Identify relevant technological applications (e.g., biomedical, automotive, aerospace, acoustical, geo-mechanical, RF, robotics, machinery monitoring). Develop baseline multiphysics simulation models encompassing the selected measurement observables. Derive multiphysics-based sensor fusion algorithms and demonstrate their effectiveness using synthetic data sets.

PHASE II: Further refine and develop the multiphysics models and companion multiphysics-based sensor fusion algorithms. Conduct a high-fidelity demonstration/validation of algorithm performance based on finer-grained simulations. Develop a baseline embedded computing approach for meeting tactical timeline requirements for the chosen applications. Quantify performance gains relative to conventional fusion algorithms.

PHASE III: This research is applicable across the entire DOD ISR enterprise.

REFERENCES:

[1] D. L. Hall and J. Llinas, "An introduction to multisensor data fusion," Proceedings of the IEEE, vol. 85, no. 1, pp. 6-23, 1997.

[2] S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory. Prentice-Hall, 1993.

[3] D. H. Rogers and C. J. Garasi, "Prism: a multi-view visualization tool for multi-physics simulation," Proceedings of the Third International Conference on Coordinated & Multiple Views in Exploratory Visualization, pp. 85-95, 2005.

[4] J. R. Guerci and E. J. Baranoski, "Knowledge-aided adaptive radar at DARPA: an overview," IEEE Signal Processing Magazine, vol. 23, no. 1, pp. 41-50, 2006.
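The topic does not prescribe an algorithm; the following is a minimal sketch in Python (NumPy/SciPy) of the joint-estimation idea outlined in the DESCRIPTION. Two dissimilar forward models, an IR radiometer and a toy RF channel, both invented here purely for illustration, share one physical state (temperature and emissivity), and a noise-weighted nonlinear least-squares fit over the stacked residuals yields a fused estimate. All model forms, constants, and noise levels are assumptions, not content from the solicitation.

```python
"""Illustrative sketch of multiphysics-based fusion: two dissimilar
sensor fields jointly constrain one shared physics state. All forward
models and parameter values here are hypothetical."""
import numpy as np
from scipy.optimize import least_squares

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4


def h_ir(theta):
    """Toy IR forward model: radiated power ~ emissivity * sigma * T^4."""
    T, eps = theta
    return eps * SIGMA * T**4


def h_rf(theta):
    """Toy RF forward model: return power scales with reflectivity (1 - eps)."""
    T, eps = theta
    return 100.0 * (1.0 - eps)


def residuals(theta, z_ir, z_rf, s_ir, s_rf):
    """Stack noise-weighted residuals from both fields against the shared state."""
    return np.array([(z_ir - h_ir(theta)) / s_ir,
                     (z_rf - h_rf(theta)) / s_rf])


# Simulate one ground-truth state and a noisy measurement from each field.
rng = np.random.default_rng(0)
theta_true = np.array([320.0, 0.85])   # T = 320 K, emissivity = 0.85 (assumed)
s_ir, s_rf = 5.0, 0.5                  # per-sensor noise standard deviations
z_ir = h_ir(theta_true) + rng.normal(0.0, s_ir)
z_rf = h_rf(theta_true) + rng.normal(0.0, s_rf)

# Fused estimate: both measurements jointly constrain the single physics state.
fit = least_squares(residuals, x0=[300.0, 0.5],
                    bounds=([200.0, 0.0], [500.0, 1.0]),
                    args=(z_ir, z_rf, s_ir, s_rf))
print("true  T, eps:", theta_true)
print("fused T, eps:", fit.x)
```

Note that in this toy setup neither field alone identifies both parameters (the IR reading confounds temperature with emissivity), while the joint fit over the shared state does; this underdetermined-alone, identifiable-together structure is the essence of mapping disparate sensor fields into one common physics representation.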