Scene Understanding for Semi-Autonomous Navigation (SUSAN)
The Army has a clear need for a small mobile robot capable of accompanying a single soldier. Such a robot would help solve both the logistical problem of individual soldiers needing to transport more equipment and supplies than they can carry in a backpack and the tactical problem of scouting unsafe areas. Multiple designs for such robots exist; however, a common limitation is that they require remote control or teleoperation. This makes it very difficult for the soldier to do anything else at the same time, such as watch for threats or even walk without tripping. A less cognitively demanding form of control is clearly needed, one that allows the soldier to control the robot (or robots) while still performing his or her primary combat mission.

We propose a Scene Understanding for Semi-Autonomous Navigation (SUSAN) system for the semi-supervised control of unmanned ground vehicles. The system allows a user to direct a vehicle from behind, using a touch-screen device, or from in front, by leading the way and issuing commands through arm and hand gestures. SUSAN achieves this using novel monocular scene understanding algorithms and robust tracking techniques, and requires neither specialized robot-mounted sensors nor soldier-worn markers.
Small Business Information at Submission:
Charles River Analytics Inc.
625 Mount Auburn Street, Cambridge, MA -