Context-driven Active-sensing for Repair Tasks II (CART II)
Phone: (978) 372-7635
Email: paulr@dollabs.com
Contact: Laureen Horton
Address:
Phone: (617) 253-3922
Type: Nonprofit College or University
Existing machine perception systems are too inflexible and not sufficiently robust to environmental uncertainty. In existing systems, perception components are statically (and manually) configured to process sensor data, and the parameters of those components are statically tuned to operate optimally only under very specific conditions. Information flow in such systems is bottom-up and generally not guided by knowledge of higher-level context and goals. We have the opportunity to provide architectural support for context-driven computer vision applications that draw robustness from context and from a closed-loop approach to perception. Rather than treating image interpretation as a collection of carefully tuned image-processing capabilities, we can consider the context in which our sensors provide meaningful and useful feedback, and we can leverage the context of a known task to drive the perceptual process. This approach will provide a foundation for building a multitude of image-processing applications that are difficult or impossible to build using conventional bottom-up techniques.
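To make the contrast concrete, the sketch below is a hypothetical Python illustration (not taken from the proposal) of a closed-loop, context-driven perception step: higher-level task context chooses an initial detector setting, and a feedback loop re-tunes it until the interpretation is consistent with what the task expects. All names (TaskContext, detect_edges, interpretation_score, context_driven_detect) and the specific numbers are invented for this sketch.

```python
"""Toy closed-loop, context-driven perception step (illustrative only)."""
from dataclasses import dataclass


@dataclass
class TaskContext:
    """Higher-level knowledge about the current repair task and scene."""
    expected_part: str   # e.g. "hex bolt"
    lighting: str        # e.g. "dim" or "bright"


def detect_edges(image: list[list[float]], threshold: float) -> int:
    """Stand-in for an edge detector: count pixels above a gradient threshold."""
    return sum(1 for row in image for px in row if px > threshold)


def interpretation_score(edge_count: int, context: TaskContext) -> float:
    """Score how well the detection matches what the task context expects.
    Here the expected part is assumed to yield roughly 20 edge pixels."""
    expected = 20
    return 1.0 - min(abs(edge_count - expected) / expected, 1.0)


def context_driven_detect(image, context: TaskContext) -> tuple[int, float]:
    """Closed loop: start from a context-informed threshold, then adjust it
    until the interpretation is judged consistent with the task context."""
    threshold = 0.3 if context.lighting == "dim" else 0.6   # context-driven init
    edges = detect_edges(image, threshold)
    for _ in range(10):                                      # feedback loop
        if interpretation_score(edges, context) > 0.9:
            break
        # Too many edges -> raise the threshold; too few -> lower it.
        threshold += 0.05 if edges > 20 else -0.05
        edges = detect_edges(image, threshold)
    return edges, threshold


if __name__ == "__main__":
    # A synthetic 10x10 "image" of gradient magnitudes in [0, 1].
    img = [[(r * c) / 81.0 for c in range(10)] for r in range(10)]
    ctx = TaskContext(expected_part="hex bolt", lighting="dim")
    print(context_driven_detect(img, ctx))
```

In a static bottom-up pipeline the threshold would be fixed ahead of time; here the same detector is made more robust because higher-level task knowledge both initializes and corrects it through feedback.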
* Information listed above is at the time of submission. *