Radar Image-Based Navigation

Description:

TECHNOLOGY AREA(S): Electronics 

OBJECTIVE: To mitigate the impact of Global Navigation Satellite System (GNSS) denial on navigation and to improve airborne platform inertial (attitude) estimates, develop navigation techniques based on the correlation of high-resolution radar imagery to previously collected radar imagery, optical imagery, and/or digital terrain elevation databases.

DESCRIPTION: Terrain reference has been employed for precision navigation for centuries, using visual aids such as prominent topographical features and landmarks to establish one's current location. More recent implementations, e.g., Terrain Contour Matching (TERCOM [1]), a technique pre-dating GNSS, compare optical features or a sequence of terrain height measurements against reference databases to determine a platform's current position. For the latter approach, the system relies on terrain height databases such as Digital Terrain Elevation Data (DTED) as fiducial references. Depending on the specifics of the implementation, TERCOM systems are known to perform poorly in areas with little or no terrain relief and/or few salient optical features. Synthetic aperture radar (SAR) and real-beam imaging can deliver near-optical-quality, all-weather, day/night imagery in two dimensions [2]. Combined with elevation degrees of freedom, these systems can produce interferograms, which can in turn yield terrain relief estimates [3]. By correlating radar imagery to existing optical databases or, in the three-dimensional case, to DTED databases, imaging radar can support all-weather, day/night navigation.
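
As an illustration of the correlation step described above (not a requirement of this topic), the following Python sketch registers a radar image chip against a georeferenced reference image by normalized cross-correlation and converts the correlation peak into a horizontal position offset. It assumes both images have already been resampled to a common ground grid with known pixel spacing; the function names and parameters are illustrative assumptions, not part of the solicitation.

    import numpy as np

    def normalized_cross_correlation(chip, reference):
        # Slide a zero-mean, unit-norm radar chip across the larger
        # reference image and return the correlation surface.
        ch = chip - chip.mean()
        ch = ch / (np.linalg.norm(ch) + 1e-12)
        rows = reference.shape[0] - chip.shape[0] + 1
        cols = reference.shape[1] - chip.shape[1] + 1
        surface = np.zeros((rows, cols))
        for r in range(rows):
            for c in range(cols):
                win = reference[r:r + chip.shape[0], c:c + chip.shape[1]]
                win = win - win.mean()
                surface[r, c] = np.dot(ch.ravel(), win.ravel()) / (np.linalg.norm(win) + 1e-12)
        return surface

    def position_fix(chip, reference, pixel_spacing_m, predicted_rc):
        # Convert the correlation peak location into a north/east offset
        # (metres) relative to the inertially predicted chip location.
        surface = normalized_cross_correlation(chip, reference)
        peak_r, peak_c = np.unravel_index(np.argmax(surface), surface.shape)
        d_north_m = (peak_r - predicted_rc[0]) * pixel_spacing_m
        d_east_m = (peak_c - predicted_rc[1]) * pixel_spacing_m
        return d_north_m, d_east_m, surface.max()

A weak correlation peak relative to the surrounding surface is one practical indicator of the low-relief, feature-poor conditions under which such a technique is expected to degrade.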

PHASE I: Identify both radar and reference data to be used to support image-based navigation studies. Develop navigation error models with the appropriate degrees of freedom. Establish quantitative relationships between the quality of reference imagery, the resulting registration (e.g., misalignment), and the associated navigation errors. Through analysis and empirical studies using existing radar imagery, establish under what conditions image-based navigation works effectively and when it fails. Summarize the performance of the technique under the conditions in which the models were tested. Develop plans for a Phase II demonstration on operationally relevant imagery.
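
As a minimal sketch of the kind of quantitative relationship sought in Phase I, the example below assumes a simple error budget in which the horizontal navigation error is the root-sum-square of the registration misalignment (converted from pixels to metres) and the reference database's own georegistration uncertainty. The function name, parameters, and numerical values are illustrative assumptions, not program requirements.

    import numpy as np

    def horizontal_nav_error_sigma(pixel_spacing_m, registration_sigma_px,
                                   reference_georef_sigma_m):
        # 1-sigma horizontal position error: registration misalignment
        # (pixels -> metres) combined in root-sum-square with the
        # reference database georegistration error.
        registration_sigma_m = registration_sigma_px * pixel_spacing_m
        return np.hypot(registration_sigma_m, reference_georef_sigma_m)

    # Example: 0.3 m pixels, 1.5-pixel registration scatter, 5 m reference error
    print(horizontal_nav_error_sigma(0.3, 1.5, 5.0))  # ~5.02 m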

PHASE II: To demonstrate the efficacy of the capability in previously untested environments, develop, in C/C++, MATLAB, or similar prototyping software, a near-real-time image-based navigation implementation. Identify a radar system capable of producing the necessary radar imagery. Informed by the results and lessons learned in Phase I, develop and execute test plans utilizing the radar to collect data. Demonstrate the algorithms' efficacy on data collected by the system in near real time.
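
One way a near-real-time prototype might apply each radar-derived position fix to the inertial navigation solution is a standard Kalman measurement update, sketched below. This is illustrative only and assumes the fix directly observes horizontal position with a known measurement covariance; the variable names are assumptions introduced for this example.

    import numpy as np

    def fuse_position_fix(x_pred, P_pred, z_fix, R_fix):
        # Standard Kalman measurement update. x_pred/P_pred: inertial
        # position estimate and covariance; z_fix/R_fix: radar-derived
        # position fix and its covariance. H = I because the fix
        # observes position directly.
        n = len(x_pred)
        H = np.eye(n)
        S = H @ P_pred @ H.T + R_fix
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_upd = x_pred + K @ (z_fix - H @ x_pred)
        P_upd = (np.eye(n) - K @ H) @ P_pred
        return x_upd, P_upd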

PHASE III: Implement and integrate an RF image-based navigation algorithm for real-time use on an operationally relevant real-beam or SAR-based imaging system. Develop and execute test plans demonstrating the efficacy of the algorithm in operationally relevant environments. Develop and implement plans to effect the transition of the real-time capability to the operational system. The transition path is through the Degraded Visual Environment-Mitigation (DVE-M) Science and Technology Objective (STO).

REFERENCES: 

1: https://en.wikipedia.org/wiki/TERCOM

2:  G. Titi, D. Goshi, and G. Subramanian, "The Multi-Function RF (MFRF) Dataset: A W-Band Database for Degraded Visual Environment Studies", SPIE D&C, April 2016.

3:  Rosen, P. A., Hensley, S., Joughin, I. R., Li, F. K., Madsen, S. N., Rodriguez, E., & Goldstein, R. M., "Synthetic Aperture Radar Interferometry", Proc. IEEE, vol. 88, no. 3, pp. 333-382, 2000.

KEYWORDS: GPS-Denied Navigation, Terrain Contour Matching, Radar Image-Based Navigation 

CONTACT(S): 

Ariel Ibrahim 

(443) 861-0053 

ariel.r.ibrahim.civ@mail.mil 

Mr. Kenneth Gilliard 

(443) 861-0529 
