Incremental Learning for Robot Sensing and Control

Award Information
Agency: Department of Defense
Branch: Army
Contract: W56HZV-10-C-0027
Agency Tracking Number: A09A-030-0090
Amount: $100,000.00
Phase: Phase I
Program: STTR
Solicitation Topic Code: A09A-T030
Solicitation Number: 2009.A
Solicitation Year: 2009
Award Year: 2010
Award Start Date (Proposal Award Date): 2010-01-04
Award End Date (Contract End Date): 2010-07-04
Small Business Information
281 State Highway 79
Morganville, NJ 07751
United States
DUNS: 126637755
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 Urs Muller
 (732) 970-1441
Business Contact
 Paula Muller
Title: Vice President
Phone: (732) 970-1441
Research Institution
 New York University
 Yann LeCun
715 Broadway
New York, NY 10004
United States

 (212) 998-3283
 Nonprofit College or University

This proposal addresses key open challenges, identified during the LAGR program, to the practical use of adaptive, vision-based robot navigation in commercial settings. First, the adaptive vision system learns quickly but also forgets quickly. This will be addressed with an ensemble of "expert" classifiers, each of which specializes in a particular environment and can be quickly activated when the environment matches its domain of validity. Second, a new type of cost map will be used that accumulates high-level feature vectors rather than traversability values; a global cost map will also be integrated. Third, we will pre-train the convolutional-net feature extractor using the latest unsupervised algorithms for learning hierarchies of invariant features. Fourth, the performance limitations of general-purpose CPUs will be overcome by running the computationally intensive parts of the system on a highly compact, dedicated FPGA-based hardware platform; implementations on commercially available GPUs will also be explored. Finally, to achieve portability and modularity, we will make our implementation independent of any particular robot platform and support a wide range of sensor types, including stereo cameras and LIDAR. The result will be a highly compact, low-power, self-contained, low-cost, vision-based navigation system for autonomous mobile robots.
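The expert-ensemble idea above can be sketched in a few lines. The following is an illustrative outline only, not the proposal's actual design: each expert is a classifier paired with a prototype descriptor of the environment it was trained in, and a simple distance-based gate activates the expert whose domain best matches the current scene, so learning a new environment adds a new expert rather than overwriting old ones. The class name, the prototype representation, and the threshold gate are all assumptions made for this sketch.

```python
import math


class ExpertEnsemble:
    """Sketch of an ensemble of environment-specialized "expert" classifiers.

    Each expert stores a prototype descriptor of its environment ("domain of
    validity"); a distance gate selects the expert to activate. This is an
    illustrative assumption, not the system described in the proposal.
    """

    def __init__(self, match_threshold=1.0):
        self.experts = []  # list of (prototype_descriptor, classifier_fn)
        self.match_threshold = match_threshold

    @staticmethod
    def _dist(a, b):
        # Euclidean distance between two environment descriptors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def add_expert(self, prototype, classifier):
        """Register a classifier specialized for the given environment."""
        self.experts.append((prototype, classifier))
        return len(self.experts) - 1

    def select(self, descriptor):
        """Return the index of the best-matching expert, or None if no
        expert's domain of validity covers the current environment."""
        if not self.experts:
            return None
        i, (proto, _) = min(
            enumerate(self.experts),
            key=lambda e: self._dist(e[1][0], descriptor),
        )
        return i if self._dist(proto, descriptor) <= self.match_threshold else None

    def classify(self, descriptor, features):
        """Activate the matching expert on the input features; fall back to
        an "unknown" label when no expert matches the environment."""
        i = self.select(descriptor)
        if i is None:
            return "unknown"
        return self.experts[i][1](features)
```

Because each environment gets its own expert, adapting to a new environment does not disturb the weights of previously trained experts, which is one way to avoid the fast forgetting noted in the abstract.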

* Information listed above is at the time of submission. *