Video-Based Autonomous Navigation

Award Information
Agency: Department of Defense
Branch: Navy
Contract: N00014-03-C-0463
Agency Tracking Number: N021-0321
Amount: $0.00
Phase: Phase I
Program: SBIR
Awards Year: 2003
Solicitation Year: N/A
Solicitation Topic Code: N/A
Solicitation Number: N/A
Small Business Information
4030 Spencer St, Suite 108, Torrance, CA, 90503
Duns: 625511050
Hubzone Owned: N
Woman Owned: N
Socially and Economically Disadvantaged: N
Principal Investigator
 Nahum Gat
 (310) 371-4445
Business Contact
 Linda Papermaster
Title: Vice President & CFO
Phone: (310) 371-4445
Research Institution
Several video-based autonomous navigation algorithms were developed under Phase I with increasing degrees of accuracy. A flight sensor package was assembled, containing a video camera, a thermal infrared camera, GPS, an Attitude and Heading Reference System, and other sensors, and flown over urban and agricultural terrain in California. Ikonos satellite imagery for the same areas was obtained. An automated video-to-satellite image registration technique was demonstrated, capable of extracting aircraft position (longitude, latitude, and altitude) and orientation (roll, pitch, and yaw). During Phase I the data analysis was performed post-flight.

The Phase II objectives are: (i) to design and assemble a small embedded microprocessor system, using modular, low-cost commercial components, capable of executing the navigation algorithm in real time; (ii) to further improve the accuracy of the navigation algorithms; (iii) to test and demonstrate the system's real-time navigation ability on a manned aircraft; and (iv) to design a custom embedded microprocessor system that meets the weight, size, and power constraints of a SWARM-class UAV, capable of managing all platform functions, including navigation, communications, and housekeeping. This system will transition the technology to the Navy's SWARM project and will serve as the primary intelligent processor for managing UAV activities and functions.

The basis of video-based autonomous navigation is the ability to extract aircraft position and attitude parameters. These critical parameters are needed for orthorectification and georegistration of aerial imagery. OKSI is involved in aerial multi- and hyperspectral remote sensing operations for precision farming. Precision farming is based on site-specific application of chemicals, when and where needed and in the correct amount. The data products produced by OKSI are used by farmers for variable-rate application of chemicals to their crops.
Data production requires that the remote sensing imagery be accurate to within a single pixel. The algorithms developed under the present ONR SBIR are already helping OKSI improve its data accuracy and eliminate the labor-intensive manual effort of georegistering the remote sensing imagery. Full automation of remote sensing data production will reduce the cost of the products and allow precision farming operations to expand to a wider customer base.
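To illustrate why the extracted position and attitude parameters matter for georegistration, the sketch below maps a pixel offset to geographic coordinates under a deliberately simplified model (level, nadir-pointing camera over flat terrain, spherical earth). The camera parameters and function names are illustrative assumptions, not values or methods from the award; the actual OKSI algorithms register video against satellite imagery and handle full roll/pitch/yaw.

```python
import math

# Hypothetical camera parameters, chosen only for illustration:
FOCAL_LENGTH_MM = 25.0   # lens focal length
PIXEL_PITCH_UM = 10.0    # detector pixel pitch

def ground_sample_distance(altitude_m):
    """Ground footprint of one pixel (metres) for a nadir-pointing camera."""
    return altitude_m * (PIXEL_PITCH_UM * 1e-6) / (FOCAL_LENGTH_MM * 1e-3)

def pixel_to_ground(lat_deg, lon_deg, altitude_m, heading_deg, dx_px, dy_px):
    """Map a pixel offset (dx right, dy forward of the image centre) to
    geographic coordinates, assuming a level, nadir-pointing camera over
    flat terrain. Real orthorectification must also apply roll and pitch."""
    gsd = ground_sample_distance(altitude_m)
    # Rotate the pixel offset from camera axes into north/east axes.
    h = math.radians(heading_deg)
    north = gsd * (dy_px * math.cos(h) - dx_px * math.sin(h))
    east = gsd * (dy_px * math.sin(h) + dx_px * math.cos(h))
    # Convert metric offsets to degrees on a spherical earth.
    earth_r = 6_371_000.0
    dlat = math.degrees(north / earth_r)
    dlon = math.degrees(east / (earth_r * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

At 1000 m altitude these assumed parameters give a 0.4 m ground sample distance, which makes concrete the single-pixel accuracy requirement: a one-pixel registration error at this scale already displaces a chemical-application map by roughly half a metre.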

* information listed above is at the time of submission.
