Perception Engine for Activity Recognition and Logging
Tens of thousands of hours of video footage already exist, and countless more will be logged as spacecraft continue to orbit the Earth and explore the solar system. These video logs contain vast amounts of useful data on crew social interactions, crew task performance, and crew-vehicle interaction. Currently, these videos must be searched and indexed by hand, a lengthy process that consumes many hours of labor.

Automated video processing techniques can be integrated into a comprehensive toolbox that drastically reduces the time needed to search and analyze videos. Such a toolbox would allow specific regions in a video stream to be isolated for monitoring, providing quick indexing for human review of all motion-based activity in an area of a vehicle. It would also allow the user to query for specific activities or events that occurred in that region; these could be detected automatically by software and presented directly to the user.

In support of NASA's needs, we propose to design a system that detects and tracks humans, human activity, human-station interaction, and team interactions using existing cameras and videos. Our overall objectives can be achieved by developing a suite of algorithms that handles several key sub-challenges: 1) robustly handling unconstrained video content and capture conditions; 2) extracting functional descriptions of complex human events; 3) handling ad hoc event queries effectively; 4) operating efficiently, so the system can keep up with the flood of videos being added to current databases and provide effective interactive search over them.
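As a minimal illustration of the region-monitoring idea (not the proposed system itself), motion inside an isolated region of a video stream can be flagged with simple frame differencing. The sketch below assumes grayscale frames as NumPy arrays; the function name `motion_in_roi` and all thresholds are illustrative choices, and the frames are synthetic.

```python
import numpy as np

def motion_in_roi(prev_frame, frame, roi, threshold=25, min_fraction=0.01):
    """Flag motion inside a region of interest via frame differencing.

    roi is (top, left, height, width); frames are 2-D grayscale uint8 arrays.
    Returns True when the fraction of pixels whose intensity changed by more
    than `threshold` exceeds `min_fraction` of the ROI area.
    """
    top, left, h, w = roi
    a = prev_frame[top:top + h, left:left + w].astype(np.int16)
    b = frame[top:top + h, left:left + w].astype(np.int16)
    changed = np.abs(b - a) > threshold
    return bool(changed.mean() > min_fraction)

# Synthetic example: a bright 8x8 "object" appears inside the ROI.
prev_frame = np.zeros((120, 160), dtype=np.uint8)
frame = prev_frame.copy()
frame[40:48, 60:68] = 255

print(motion_in_roi(prev_frame, frame, roi=(30, 50, 40, 40)))       # True
print(motion_in_roi(prev_frame, prev_frame, roi=(30, 50, 40, 40)))  # False
```

A per-frame boolean like this is enough to build a motion index: recording the timestamps where the flag is True yields the "quick indexing for human viewing" described above, while a production system would use more robust background modeling.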
Small Business Information at Submission:
100 North East Loop 410 Suite 520 San Antonio, TX 78216-6363
Number of Employees: