Building a coherent world view from sensory data
Department of Defense
Defense Advanced Research Projects Agency
Agency Tracking Number:
Solicitation Topic Code:
Small Business Information
100 Northeast Loop 410, Suite 520, San Antonio, TX, 77216
Socially and Economically Disadvantaged:
Abstract
Sensors are proliferating in military applications. From unmanned air vehicles (UAVs) to unmanned ground vehicles (UGVs) to satellites, the volume of video, infrared, range, and other data is growing exponentially. These sensors are often stovepiped into their own processing streams, with much of the final processing done manually. This proposal aims to create a general-purpose sensor-to-symbol architecture that merges all sensor information into a coherent world view that is maintained over time. A sensor-to-symbol architecture regularizes the connections between the sensed physical world and the symbols that represent it. Our approach consists of representing sensors in a domain-independent language, representing the world in an object ontology, and developing a reasoning substrate that connects the two. The reasoning substrate is based on a computational cognitive architecture called Polyscheme. Human stakeholders are tied into the sensor-to-symbol architecture via personal agents that manage the stream of information. The approach will be evaluated using mobile robots searching for specific objects in the environment.
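The core idea of the abstract, merging readings from multiple stovepiped sensors into a single persistent set of object symbols, can be sketched in miniature. The following Python is purely illustrative and is not the proposal's actual design: the `Percept`, `WorldObject`, and `WorldModel` names, the fixed match radius, and the nearest-neighbor association rule are all assumptions invented for this sketch, not part of Polyscheme or the proposed architecture.

```python
from dataclasses import dataclass, field


@dataclass
class Percept:
    """One sensor reading in a domain-independent form (hypothetical schema)."""
    sensor_id: str    # which sensor produced the reading, e.g. "uav_video"
    obj_class: str    # hypothesized object category, e.g. "vehicle"
    position: tuple   # estimated (x, y) location
    confidence: float # sensor's confidence in the detection


@dataclass
class WorldObject:
    """A symbol in the object ontology, fused from one or more percepts."""
    obj_class: str
    position: tuple
    supporting_sensors: set = field(default_factory=set)


class WorldModel:
    """Toy reasoning substrate: merges percepts into a persistent world view."""

    def __init__(self, match_radius: float = 1.0):
        self.objects = []             # symbols maintained over time
        self.match_radius = match_radius

    def integrate(self, percept: Percept) -> WorldObject:
        # Associate the percept with an existing symbol of the same class
        # within the match radius; otherwise mint a new symbol.
        for obj in self.objects:
            dx = obj.position[0] - percept.position[0]
            dy = obj.position[1] - percept.position[1]
            close = (dx * dx + dy * dy) ** 0.5 <= self.match_radius
            if obj.obj_class == percept.obj_class and close:
                obj.supporting_sensors.add(percept.sensor_id)
                return obj
        new_obj = WorldObject(percept.obj_class, percept.position,
                              {percept.sensor_id})
        self.objects.append(new_obj)
        return new_obj


# Two different sensors observe the same vehicle; the world model
# keeps a single symbol supported by both, rather than two stovepiped tracks.
world = WorldModel()
world.integrate(Percept("uav_video", "vehicle", (10.0, 5.0), 0.9))
world.integrate(Percept("ugv_lidar", "vehicle", (10.3, 5.2), 0.8))
print(len(world.objects))  # → 1
```

The essential property, shared with the proposed architecture at a much higher level, is that the symbolic layer persists across readings, so each new percept either refines an existing symbol or introduces a new one.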
* Information listed above is accurate as of the time of submission.