
Low-Cost Cockpit Head Tracking and Gestural Recognition


OBJECTIVE: Develop a system to track pilot head, hand, and arm positions and movements in a rotorcraft cockpit using advanced human-machine interface technology, such as that used in gaming systems, to identify gestures, movements, and head orientation (approximate eye aim-point).

DESCRIPTION: Current head position sensing systems, such as those on helmet-mounted displays (HMDs) like the Apache IHADSS, pose a significant technical challenge and so tend to be high-cost components to install and maintain. HMD designs must sense the elevation, azimuth, and tilt of the pilot's head relative to the airframe with high precision, even under high-g maneuvers and during rapid head movement. Two basic methods are used in current HMD technology: optical and electromagnetic. Optical systems employ infrared emitters on the helmet (or cockpit) and infrared detectors in the cockpit (or helmet) to measure the pilot's head position. Their main limitations are restricted fields of regard and sensitivity to sunlight or other heat sources. Electromagnetic sensing designs use coils in the helmet, placed in an alternating field generated in the cockpit, to produce alternating electrical voltages based on the movement of the helmet in multiple axes. This technique requires precise magnetic mapping of the cockpit to account for ferrous and conductive materials in the seat, cockpit sills, and canopy, reducing angular errors in the measurement. Current aviation HMD designs use the pilot's eye aim-point (actually head angle) as a pointing device, giving aircrew the ability to target nearly any point in the environment seen by the pilot.
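Whatever the sensing method, the head tracker's output reduces to airframe-referenced head angles that downstream avionics convert into a line-of-sight vector for targeting. A minimal geometric sketch, assuming a conventional body-axes frame (x forward, y right, z down) and ignoring head tilt (roll), which does not move the aim-point:

```python
import math

def head_aimpoint(azimuth_deg: float, elevation_deg: float) -> tuple:
    """Convert airframe-referenced head azimuth/elevation (degrees) into a
    unit line-of-sight vector in body axes: x forward, y right, z down.
    Frame convention is an assumption for illustration, not from the topic."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = math.cos(el) * math.cos(az)   # forward component
    y = math.cos(el) * math.sin(az)   # rightward component
    z = -math.sin(el)                 # +z is down, so positive elevation gives -z
    return (x, y, z)

# Boresight (head straight ahead) points along +x
print(head_aimpoint(0.0, 0.0))
```

Head roll is omitted because rotating about the line of sight leaves the aim-point unchanged; it matters only for rendering symbology on the visor.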
These systems allow targets to be designated with minimal aircraft maneuvering, minimizing the time spent in the threat environment and allowing greater lethality, survivability, and pilot situational awareness.(1) New technology from the gaming world has the potential to substantially reduce the cost of adding head tracking to conventional helicopters, while also providing body tracking and gesture recognition to support future intelligent cockpits. In November 2010 Microsoft released the Kinect for Xbox, and it became a sensation, holding the Guinness World Record as the "fastest selling consumer electronics device". The Kinect provides three basic capabilities: advanced gesture recognition, facial recognition, and voice recognition. Soon after its release, open-source drivers for the Kinect appeared, spurring an avalanche of application development by third-party developers. Applications include 3D mapping, browser control, motion controllers, 3D teleconferencing, and basic visual SLAM (simultaneous localization and mapping). The Kinect has a range limit of 1.2–3.5 m (3.9–11 ft) and an angular field of view of 57 degrees horizontally and 43 degrees vertically. It can simultaneously track up to six people, including two active players for motion analysis, with feature extraction of 20 joints per player. The depth sensor consists of an infrared laser projector combined with a monochrome CMOS sensor, which captures video data in 3D under any ambient light conditions. Similar technologies are being developed by other game system developers (e.g., the PS2 EyeToy) and by other companies for similar applications (head tracking and gesture recognition), and are equally applicable to this effort.
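The range and field-of-view limits quoted above directly constrain where such a sensor could be mounted in a cockpit. A placement study could start with a simple frustum check; a sketch using the published Kinect limits, with the sensor frame (z out along the boresight, x right, y up) as an assumed convention:

```python
import math

# Published Kinect-for-Xbox limits quoted in the topic text
RANGE_MIN_M, RANGE_MAX_M = 1.2, 3.5
FOV_H_DEG, FOV_V_DEG = 57.0, 43.0

def in_sensor_frustum(x: float, y: float, z: float) -> bool:
    """True if a point (metres, sensor frame: z along boresight, x right,
    y up) lies inside the depth sensor's usable volume. A geometric sketch
    only; a real placement study would also need occlusion and vibration
    analysis."""
    if not (RANGE_MIN_M <= z <= RANGE_MAX_M):
        return False
    h_ang = math.degrees(math.atan2(abs(x), z))
    v_ang = math.degrees(math.atan2(abs(y), z))
    return h_ang <= FOV_H_DEG / 2 and v_ang <= FOV_V_DEG / 2

print(in_sensor_frustum(0.0, 0.0, 2.0))   # head on boresight at 2 m -> True
print(in_sensor_frustum(0.0, 0.0, 0.8))   # closer than minimum range -> False
```

The 1.2 m minimum range is the binding constraint in a confined cockpit: the sensor must sit far enough from the pilot's head and hands, which drives the mounting options considered in Phase I.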
Head tracking, along with gesture recognition, has the potential to be an integral part of future advanced/intelligent cockpit technologies, enabling capabilities such as helmet/head/face tracking, virtual controls and displays, pilot physical status assessment (consciousness/fatigue/tunneling/injury), cockpit damage assessment, and identification of objects/areas of interest internal and external to the cockpit. On an aircraft its potential applications include both the cockpit and the passenger cabin, with the main area of interest being the cockpit. The ability to track a pilot's head to determine what he is looking at is one of the main tasks. This can be used to estimate the current object of interest, help identify when a pilot is becoming overly focused on a single display or control (cognitive tunneling), and identify areas/locations of interest to the pilot, for example when reacting to threats. The primary focus of this effort will be to determine the feasibility of using gaming or other low-cost technologies like the Kinect in a modern cockpit and adapting them into an application suitable for Army helicopter cockpits and other aviation systems. Key questions to be answered by this effort include: 1) Can the gaming technology be adapted to work within the physical environment of an aviation cockpit and/or cargo bay? 2) What is the impact on the overall cockpit (electromagnetic interference (EMI); space, weight, and power (SWaP); avionics system integration; mounting; reliability; impact on other systems such as night vision goggles; system airworthiness; etc.)? 3) What modifications to the system are needed to make it applicable to Army aviation? 4) What is the overall performance and accuracy of the system in head tracking, motion/body tracking, face recognition, etc.?
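The cognitive tunneling check described above can be framed as a dwell-time test: flag the condition when the estimated head angles stay inside one display's angular window for too long. A minimal sketch under assumed sample and window formats; the 30-second threshold is illustrative, not from the topic:

```python
def tunneling_alert(samples, az_window, el_window, dwell_s=30.0):
    """samples: time-ordered (t_seconds, azimuth_deg, elevation_deg) head
    estimates; az_window / el_window: (lo, hi) angular bounds of one display.
    Returns True once the head stays inside the window for dwell_s
    continuously -- a crude proxy for cognitive tunneling, for illustration."""
    entered = None
    for t, az, el in samples:
        inside = (az_window[0] <= az <= az_window[1]
                  and el_window[0] <= el <= el_window[1])
        if inside:
            if entered is None:
                entered = t          # start of a continuous dwell
            if t - entered >= dwell_s:
                return True
        else:
            entered = None           # gaze left the display; reset the timer
    return False

# Head parked on one display for 35 s -> alert
fixated = [(t, 2.0, -1.0) for t in range(0, 40, 5)]
print(tunneling_alert(fixated, (-5.0, 5.0), (-5.0, 5.0)))  # -> True
```

A fielded version would need per-pilot baselines and task context (a long instrument fixation is normal in some flight phases), but the dwell-timer structure is the core of the idea.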
Other issues to resolve are how many sensors are needed, how best to arrange them to support different cockpit configurations, and the ability of the system to self-calibrate and compensate for variation in the cockpit environment. Ultimately the feasibility of such a system needs to be verified in a true cockpit environment.

PHASE I: Assess the feasibility of using head- and gesture-tracking gaming technology in a cockpit environment, to include assessing body mapping accuracy, fidelity of gesture recognition, and overall system ruggedness. Develop a concept for integrating a system into a variety of cockpit configurations (side-by-side and tandem). Conduct proof-of-concept testing of key subsystems to validate that a viable system can be integrated into a cockpit.

PHASE II: Develop software to perform body tracking and gesture recognition. Build a prototype system to track body movement (especially the head), recognize various mission-relevant gestures, and support human-machine interactions in a cockpit mock-up. Conduct testing to assess the software's ability to determine, in real time, what the pilot is doing, what controls and systems he is interacting with, and what his primary interest is (a specific screen or control inside the aircraft, or where outside the vehicle he is looking). The offeror shall integrate a breadboard system into a surrogate commercial cockpit and demonstrate its functionality in flight.

PHASE III: In follow-on research the offeror needs to work with Army rotorcraft manufacturers to integrate the system into an Army cockpit. Additional effort will also be required to integrate this technology with cockpit software and interfaces that can use the information from the system to interact better with the pilot. This system would have key applications in both commercial and military cockpits, with the greatest impact on those platforms that do not have head tracking capabilities, such as the Blackhawk, Kiowa Warrior, and Chinook.
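The mission-relevant gesture recognition called for in Phase II can begin with simple rules over the per-frame joint positions the sensor already reports. A trivial sketch, using hypothetical joint names modelled on the 20-joint skeleton described earlier (the names and the 10 cm margin are assumptions for illustration, not a real SDK's identifiers):

```python
def detect_hand_raise(joints: dict) -> bool:
    """Rule-based gesture check over one skeleton frame. joints maps
    hypothetical joint names to (x, y, z) positions in metres, y up.
    Real recognition of a gesture vocabulary would use trained models
    over joint trajectories, not a single-frame threshold."""
    head_y = joints["head"][1]
    hand_y = joints["hand_right"][1]
    return hand_y > head_y + 0.10  # hand clearly above the head

frame = {"head": (0.0, 1.20, 2.0), "hand_right": (0.25, 1.45, 2.0)}
print(detect_hand_raise(frame))  # -> True
```

Even this crude per-frame rule illustrates the data the prototype software will consume: a stream of labelled joint positions from which both gestures and the head-tracking estimates are derived.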
Besides aircraft cockpits, this system would also have application to almost all ground vehicles, C2 vehicles, and even fixed workstations: any workstation where an operator is interacting with displays and controls. Commercial applications for such a ruggedized system are nearly endless and would include monitoring vehicles and facilities for homeland security and industry at large; monitoring operator status in a variety of automotive, trucking, commercial airline, and similar settings; and supporting aiding systems and monitoring safety in shops, hangars, construction sites, etc.