Object Cueing Using Biomimetic Approaches to Visual Information Processing

Award Information
Agency: Department of Defense
Branch: Navy
Contract: N68335-14-C-0332
Agency Tracking Number: N14A-008-0162
Amount: $79,922.00
Phase: Phase I
Program: STTR
Solicitation Topic Code: N14A-T008
Solicitation Number: 2014.A
Timeline
Solicitation Year: 2014
Award Year: 2014
Award Start Date (Proposal Award Date): 2014-09-09
Award End Date (Contract End Date): 2015-04-09
Small Business Information
3600 Green Court Suite 600
Ann Arbor, MI
United States
DUNS: 009485124
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 Jeremiah Folsom-Kovarik
 Research Scientist
 (407) 542-7830
 jeremiah.folsom-kovarik@soartech.com
Business Contact
 Andrew Dallas
Title: Vice President
Phone: (734) 887-7603
Email: contracts@soartech.com
Research Institution
 UCF
 Matt Cronan
 4000 Central Florida Blvd
 Orlando, FL 32816
 United States
 (407) 823-3031
 Nonprofit College or University
Abstract

Understanding imagery from unmanned automated systems in a timely fashion requires support systems that help end users by filtering and preprocessing massive data through complex vision and understanding. A new computer system that mimics both the bottom-up biological and top-down cognitive processes of the human visual system will provide breakthrough decision support for immediate imagery analysis. We propose FOCVS (pronounced "focus"), the Full-context Onboard Cognitive Visual System. In FOCVS, top-down cognitive perception based on context such as user goals and mission parameters will direct low-level visual components to search efficiently by defining areas of attention, target features, and the desired level of detail. Bottom-up processing will iteratively search the live data in real time, processing scenes in the full context of multiple visual and nonvisual inputs through unified processing of space-centered vision. Finally, the cognitive component will interpret reports from the neural component and apply intelligent filtering to create an actionable understanding of the environment for the human user. The result of our research will be a decision aid that adapts its execution to the goals of a user who is in direct contact with the UAS and the mission, enabling timely and responsive first-pass imagery understanding.

* Information listed above is current as of the time of submission. *