Workflow Guidance using XR and Gesture-Driven User Interaction
Access to up-to-date, contextually relevant information is critical to advancing mission objectives for human exploration in space. Crew efficiency and safety depend on the associated workflows.
The proposed project aims to develop a new user experience (UX) for information visualization, with a custom user interface (UI) navigated via micromovement sensors. This innovation allows crew members to access visual data even while their hands are occupied during a maintenance or support task.
This proprietary control functionality (the Pison device) will be combined with commercially available extended reality (XR) technology, encompassing augmented, mixed, and virtual reality. The Pison device utilizes neuromuscular and Inertial Measurement Unit (IMU) inputs, which are translated into a UI control signal. The UI will be displayed on smart glasses, allowing navigation of procedures while visual cues and unobtrusive non-visual notifications are presented to the user.
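The gesture-to-UI translation layer described above can be sketched as follows. This is a minimal illustration only: the gesture labels, confidence gating, and command names are assumptions for exposition, not the Pison device's actual API or signal pipeline.

```python
from enum import Enum, auto

class UICommand(Enum):
    """UI control signals for procedure navigation (illustrative set)."""
    NEXT_STEP = auto()
    PREV_STEP = auto()
    ACKNOWLEDGE = auto()  # e.g., confirm an auditory/haptic notification
    NONE = auto()

# Hypothetical gesture labels a neuromuscular/IMU classifier might emit.
GESTURE_TO_COMMAND = {
    "index_pinch": UICommand.NEXT_STEP,
    "wrist_flick": UICommand.PREV_STEP,
    "double_pinch": UICommand.ACKNOWLEDGE,
}

def translate(gesture: str, confidence: float, threshold: float = 0.8) -> UICommand:
    """Map a classified gesture to a UI control signal.

    Gating on classifier confidence reduces spurious activations while
    the user's hands are engaged in the primary maintenance task.
    """
    if confidence < threshold:
        return UICommand.NONE
    return GESTURE_TO_COMMAND.get(gesture, UICommand.NONE)
```

A design note: gating low-confidence classifications to a no-op, rather than guessing, matters in a hands-busy context where an unintended "next step" could desynchronize the user from the physical procedure.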
This system will be tested in a simulated environment on the ground to ensure acceptability for end users. The proposed proof-of-concept trial will incorporate the display and navigation of step-by-step instructions, response to notifications, and assembly of a physical object. The user will maintain visual access to both their surroundings and the procedural instructions. In addition, the system can query the user's attention through auditory or haptic notifications, which the user will acknowledge using the Pison device. Subjective and objective measures of task performance and efficiency will be collected to refine the hardware design. This will allow for future integration of the technology into standard mission workflows and on-board systems (Phase II).
* Information listed above is at the time of submission. *