Non-intrusive Multimodal Emotional State Monitoring
Agency / Branch:
DOD / DARPA
This SBIR Phase I proposal will conduct an exploratory study to establish the feasibility of a multimodal framework for monitoring a person's emotional state using a non-intrusive sensor suite that includes video, audio, and thermal sensing. This will lead to the development of a portable gauge for robustly monitoring the emotional state of people in operational environments. The following six dynamic features will be considered in the multimodal framework: (a) facial expression, (b) facial temperature distribution, (c) facial perspiration pattern, (d) head movement, (e) hand gesture, and (f) speech prosody. It is expected that the multimodal features will give greater visibility and a more accurate estimate of the emotional state than the observation of any one feature, especially in situations involving suppression of expression. The proposed research will be carried out by a multidisciplinary team from Advanced Interfaces, Inc., Artis, LLC, and the University of Pittsburgh.
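The claim that combined modalities outperform any single feature can be illustrated with a minimal late-fusion sketch. The emotion labels, per-modality scores, and reliability weights below are illustrative assumptions, not part of the proposal; the sketch only shows how a weighted combination can down-weight a consciously suppressible channel (e.g., facial expression) in favor of involuntary ones (e.g., perspiration, prosody).

```python
# Minimal sketch of late fusion across the six proposed sensing modalities.
# All labels, scores, and weights are hypothetical assumptions for illustration.

EMOTIONS = ["neutral", "stress", "fear"]

# Hypothetical per-modality posteriors P(emotion | modality observation).
modality_scores = {
    "facial_expression":  [0.5, 0.3, 0.2],
    "facial_temperature": [0.4, 0.4, 0.2],
    "perspiration":       [0.3, 0.5, 0.2],
    "head_movement":      [0.6, 0.2, 0.2],
    "hand_gesture":       [0.5, 0.4, 0.1],
    "speech_prosody":     [0.2, 0.6, 0.2],
}

# Hypothetical reliability weights: lower weight on modalities that are
# easier to suppress consciously, higher on involuntary physiological cues.
weights = {
    "facial_expression":  0.5,
    "facial_temperature": 1.0,
    "perspiration":       1.0,
    "head_movement":      0.8,
    "hand_gesture":       0.8,
    "speech_prosody":     1.0,
}

def fuse(scores, weights):
    """Weighted average of per-modality distributions, renormalized."""
    total_w = sum(weights[m] for m in scores)
    fused = [
        sum(weights[m] * scores[m][i] for m in scores) / total_w
        for i in range(len(EMOTIONS))
    ]
    return dict(zip(EMOTIONS, fused))

fused = fuse(modality_scores, weights)
print(fused)  # "stress" dominates despite a neutral facial expression
```

Here the suppressed facial expression alone would suggest a neutral state, but the fused estimate still flags elevated stress because the involuntary channels carry more weight.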
Small Business Information at Submission:
ADVANCED INTERFACES, INC.
403 S. Allen Street, Suite 104, State College, PA 16801
Number of Employees: