Processing for Flexible Sensors
Title: President
Phone: (202) 238-9545
Email: geof@centeye.com
The overall goal of this project is to develop a flexible acuity sensor (FAS), including both hardware and software, and to demonstrate its ability to acquire and track moving targets in the environment, including when the sensor itself is moving. These sensors will utilize a proprietary Centeye vision chip capable of acquiring images at multiple resolutions and frame rates. The sensors developed will be production ready for commercialization. We will refine and harden the sensor firmware through extensive testing in real-world environments. We will integrate the sensor onto an air vehicle and perform two demonstrations: in Year 1, we will fly the FAS on a rotary-wing MAV and demonstrate the sensor's ability to acquire and track moving targets; in Year 2, we will insert the FAS into the control loop of the MAV to demonstrate the additional ability to pursue, and possibly collide with, the target. Primary deliverables include FAS sensors, a development kit with upgraded libraries enabling users to create their own applications, and a wide-field-of-view multi-aperture imaging system based on these sensors.

BENEFITS: The technology developed will be useful for a wide range of applications in micro air vehicles (MAVs), unmanned air vehicles (UAVs), and robotics in general. The hardware will employ a combined hardware and software approach that facilitates the detection and tracking of moving objects, obstacle detection, and general optical flow processing. When properly integrated onto MAVs, the resulting technology could improve obstacle avoidance sufficiently to enable autonomous flight down an urban canyon while avoiding both larger obstacles (buildings, full trees) and smaller obstacles (poles, cables, etc.). The resulting technology could also facilitate the detection of moving targets in the area, such as persons, ground vehicles, or other air vehicles.
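As a toy illustration of the moving-target detection described above, the sketch below finds the centroid of changed pixels between two consecutive frames by simple frame differencing. This is a minimal, generic example in Python/NumPy under assumed parameters (an 8-bit grayscale imager, a fixed brightness threshold); it is not Centeye's actual algorithm, which must additionally compensate for the sensor's own motion.

```python
import numpy as np

def detect_moving_target(prev_frame, curr_frame, threshold=30):
    """Detect a moving target by absolute frame differencing.

    Returns the (row, col) centroid of pixels whose brightness changed
    by more than `threshold`, or None if no motion is detected.
    Assumes a stationary sensor; an ego-motion compensation step would
    be needed when the sensor itself is moving.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = np.argwhere(diff > threshold)
    if moving.size == 0:
        return None
    return tuple(moving.mean(axis=0))  # centroid over changed pixels

# Usage: a bright target moves one pixel to the right between frames.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = np.zeros((8, 8), dtype=np.uint8)
prev[3, 2] = 200   # target at column 2 in the first frame
curr[3, 3] = 200   # target at column 3 in the second frame
print(detect_moving_target(prev, curr))  # → (3.0, 2.5)
```

The centroid (3.0, 2.5) spans the target's old and new positions, since both pixels register a change; running the detector at frame rate and feeding successive centroids to a tracker yields a target trajectory.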
The combined hardware and software approach will reduce demands on the image processor, ultimately allowing these capabilities to be performed within a weight budget of two grams or less per sensor. Note that our definition of "sensor" is a single board that includes optics, imager, processor, memory, and all other active or passive components, enabling the board to act as a self-contained machine vision system. The same technology could be adapted for use in both ground and underwater robotics, for both civilian and military applications. Civilian applications would range from toys and consumer robotics (e.g., vacuum cleaners) to more sophisticated commercial robots. The development kit will allow end users to develop their own sensor firmware and algorithms and will facilitate the integration of the technology into their own products or systems.
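To illustrate the kind of optical flow processing mentioned above, the sketch below estimates a single global image shift from two 1-D intensity scans using the classical gradient-constraint (Lucas-Kanade-style) least-squares solution. This is a generic textbook method shown for illustration only, not the firmware shipped on the sensor; the function name and the least-squares-over-the-whole-scan formulation are assumptions for this example.

```python
import numpy as np

def optical_flow_1d(f0, f1):
    """Estimate a global shift v (pixels/frame) between two 1-D scans.

    Uses the brightness-constancy constraint It + v * Ix = 0, solved in
    least squares over the whole scan: v = -sum(Ix*It) / sum(Ix*Ix).
    """
    f0 = np.asarray(f0, dtype=float)
    f1 = np.asarray(f1, dtype=float)
    Ix = np.gradient((f0 + f1) / 2.0)   # spatial intensity gradient
    It = f1 - f0                        # temporal derivative
    denom = np.dot(Ix, Ix)
    if denom == 0.0:
        return 0.0                      # no texture, flow undefined
    return -np.dot(Ix, It) / denom

# Usage: a smooth pattern shifted right by one pixel between frames.
x = np.arange(32)
f0 = np.sin(2 * np.pi * x / 32)
f1 = np.sin(2 * np.pi * (x - 1) / 32)
print(optical_flow_1d(f0, f1))  # close to 1.0 pixel/frame
```

On a real flow sensor this computation runs per image region rather than globally, so nearby obstacles, which generate faster flow, stand out against the background.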