SBIR Phase II: A Novel Human Machine Interface for Assistive Robots
Phone: (617) 877-2587
The broader impact/commercial potential of this Small Business Innovation Research (SBIR) Phase II project is to benefit the more than 200 million people around the globe who are currently living with limb loss or impairment. With the rapid growth of an aging population and longer life expectancies, assistive technologies that improve people's independence and self-sufficiency, enabling them to stay in their homes longer, are urgently needed. The proposed wearable sensor will be a step towards making robots designed to assist in activities of daily living more effective, affordable, and easy to use. In addition to empowering people to achieve higher levels of functionality and quality of life, this sensor may also further the fundamental understanding of physiological changes as manifested in hemodynamic patterns, which could be used to better monitor patient status and allow clinicians, as well as assistive device manufacturers, to develop more personalized and mindful solutions.
This Small Business Innovation Research (SBIR) Phase II project aims to develop a compact and low-cost optical sensor for detecting gesture commands from disabled users and translating those gestures into commands for assistive robots. The human-machine interfaces currently adopted by most assistive robots are expensive and inherently noisy, requiring extensive processing and user training. A more practical, intuitive, and reliable solution is needed to better accommodate the diverse and often evolving conditions of end users. This research will focus on enhancing the reliability, usability, and compatibility of the sensor as an embeddable component for wearable assistive robots. Sensor modules that can be daisy-chained together in various arrangements will be designed to optimally monitor different muscle activities on the arm. Advanced signal processing and machine learning techniques will be used to expand the existing gesture detection algorithm and achieve more robust performance during daily device usage, addressing practical issues such as detecting multiple commands simultaneously and enabling long-term algorithm learning.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
* Information listed above is at the time of submission. *