Integrated Visual and Inertial Sensor Information for Onboard Navigation (I-VISION)

Award Information
Agency:
Department of Defense
Amount:
$499,912.00
Program:
STTR
Contract:
FA9550-08-C-0032
Solicitation Year:
2006
Solicitation Number:
N/A
Branch:
Air Force
Award Year:
2008
Phase:
Phase II
Agency Tracking Number:
F064-032-0116
Solicitation Topic Code:
AF06-T032
Small Business Information
AURORA FLIGHT SCIENCES CORP.
9950 Wakeman Drive, Manassas, VA, 20110
Hubzone Owned:
N
Woman Owned:
N
Socially and Economically Disadvantaged:
N
Duns:
604717165
Principal Investigator
 James Paduano
 Principal Investigator
 (617) 225-4378
 jpaduano@aurora.aero
Business Contact
 Diana Eichfeld
Title: Contracts Manager
Phone: (703) 396-6329
Email: deichfeld@aurora.aero
Research Institution
 MASSACHUSETTS INST. OF TECHNOLOGY
 Michael P Corcoran
 77 Massachusetts Avenue
Room E19-750
Cambridge, MA, 02139
 (617) 253-3906
 Nonprofit college or university
Abstract
Given the vulnerability of the GPS system to intentional or hostile denial of service, it is imperative that the U.S. military develop technologies that provide robust navigation solutions that do not rely on GPS data. To achieve this capability, Aurora and MIT have teamed to develop the Integrated Visual and Inertial Sensor Information for Onboard Navigation (I-VISION) system, which couples optic flow-based ego-motion estimation with inertial navigation systems to achieve precision navigation without GPS. Current-generation inertial measurement units (IMUs) provide very accurate angular rate and linear acceleration information, which can be integrated to estimate a platform's position, velocity, and orientation over time. Unfortunately, due to sensor biases and scale factor errors, the accuracy of the estimate degrades with time. Coupling visual flow with IMU-based ego-motion estimation lets the IMU help resolve vision-based errors and ambiguities, while visual flow estimates are used to reduce IMU sensor biases and scale factor errors. A multiple-camera system is employed to better resolve the ambiguity inherent in optical flow-based ego-motion estimation. Both tightly coupled and ultra-tightly coupled implementations are being considered.
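The core idea in the abstract — using drift-free visual velocity measurements to correct IMU integration and estimate sensor bias — is commonly realized with a Kalman filter. The sketch below is an illustrative one-dimensional example, not the I-VISION implementation: it estimates a velocity and an accelerometer bias by propagating with biased IMU accelerations and correcting with visual-flow velocity measurements. All function names, noise parameters, and the 1-D simplification are assumptions for illustration.

```python
import numpy as np

def fuse_imu_visual(accel_meas, vis_vel_meas, dt=0.01,
                    q_vel=1e-4, q_bias=1e-6, r_vis=1e-2):
    """1-D Kalman filter fusing biased IMU accelerations with
    visual-flow velocity measurements. State: [velocity, accel bias]."""
    x = np.zeros(2)                  # state estimate [velocity, bias]
    P = np.eye(2)                    # state covariance
    F = np.array([[1.0, -dt],        # v_{k+1} = v_k + (a_meas - bias)*dt
                  [0.0,  1.0]])      # bias modeled as a slow random walk
    Q = np.diag([q_vel, q_bias])     # process noise (illustrative values)
    H = np.array([[1.0, 0.0]])       # visual flow observes velocity only
    R = np.array([[r_vis]])          # visual measurement noise
    for a, z in zip(accel_meas, vis_vel_meas):
        # predict: integrate the IMU acceleration, removing estimated bias
        x = F @ x + np.array([a * dt, 0.0])
        P = F @ P @ F.T + Q
        # update: correct drift with the visual-flow velocity measurement
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x  # [velocity estimate, bias estimate]
```

Because the bias is observable only through its integrated effect on velocity, the filter needs many update cycles to separate it from true motion; this is the same mechanism by which visual-flow estimates "reduce IMU sensor biases" in the coupled system described above.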

* Information listed above is at the time of submission.
