Integrated Visual and Inertial Sensor Information for Onboard Navigation (I-VISION)

Award Information
Agency:
Department of Defense
Branch:
Air Force
Amount:
$499,912.00
Award Year:
2008
Program:
STTR
Phase:
Phase II
Contract:
FA9550-08-C-0032
Award Id:
78113
Agency Tracking Number:
F064-032-0116
Solicitation Year:
n/a
Solicitation Topic Code:
n/a
Solicitation Number:
n/a
Small Business Information
9950 Wakeman Drive, Manassas, VA, 20110
Hubzone Owned:
N
Minority Owned:
N
Woman Owned:
N
Duns:
604717165
Principal Investigator:
James Paduano
Principal Investigator
(617) 225-4378
jpaduano@aurora.aero
Business Contact:
Diana Eichfeld
Contracts Manager
(703) 396-6329
deichfeld@aurora.aero
Research Institute:
MASSACHUSETTS INST. OF TECHNOLOGY
Michael P Corcoran
77 Massachusetts Avenue
Room E19-750
Cambridge, MA, 02139
(617) 253-3906
Nonprofit college or university
Abstract
Given the vulnerability of GPS to intentional or hostile denial of service, it is imperative that the U.S. military develop technologies that provide robust navigation solutions that do not rely on GPS data. To achieve this capability, Aurora and MIT have teamed to develop the Integrated Visual and Inertial Sensor Information for Onboard Navigation (I-VISION) system, which couples optical flow-based ego-motion estimation with inertial navigation systems to achieve precision navigation without GPS. Current-generation inertial measurement units (IMUs) provide very accurate angular rate and linear acceleration information, which can be integrated to estimate a platform's position, velocity, and orientation over time. Unfortunately, due to sensor biases and scale factor errors, the accuracy of the estimate degrades with time. Coupling visual flow with IMU-based ego-motion estimation works in both directions: the IMU helps resolve vision-based errors and ambiguities, and visual flow estimates can be used to reduce IMU sensor biases and scale factor errors. A multiple-camera system is being employed to better resolve the ambiguity inherent in optical flow-based ego-motion estimation. Both tightly coupled and ultra-tightly coupled implementations are being considered.
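The drift problem the abstract describes can be illustrated with a minimal sketch. The example below is not the I-VISION implementation; it is a hypothetical 1-D dead-reckoning loop (function name `dead_reckon` and the 0.05 m/s² bias value are assumptions for illustration) showing how a small constant accelerometer bias, integrated twice, produces position error that grows quadratically with time:

```python
def dead_reckon(accels, dt, bias=0.0):
    """Integrate 1-D accelerometer samples twice to estimate position.

    accels : list of true accelerations, m/s^2
    dt     : sample period, s
    bias   : constant accelerometer bias added to each sample, m/s^2
    Returns the list of position estimates after each sample.
    """
    vel, pos = 0.0, 0.0
    positions = []
    for a in accels:
        vel += (a + bias) * dt   # bias corrupts every velocity update
        pos += vel * dt          # corrupted velocity corrupts position
        positions.append(pos)
    return positions

# Stationary platform: true acceleration is zero for 10 s at 100 Hz.
n, dt = 1000, 0.01
ideal = dead_reckon([0.0] * n, dt, bias=0.0)
biased = dead_reckon([0.0] * n, dt, bias=0.05)

# Position error grows like 0.5 * bias * t^2: about 2.5 m after 10 s.
print(ideal[-1], biased[-1])
```

This quadratic error growth is why an unaided IMU cannot sustain precision navigation, and why an external aiding source such as optical flow is needed to observe and remove the bias.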

* information listed above is at the time of submission.
