Extended Reality (Augmented Reality, Virtual Reality, Mixed Reality, and Hybrid Reality)

Description:

Lead Center: JSC        

Participating Center(s): GSFC, KSC        

 

Scope Title: Extended Reality (XR) Extravehicular Activity (EVA) Surface Operations and Training Technologies

 

Scope Description:

Future NASA lunar missions will last much longer, be more complex, and face more challenges and hazards than were faced during the Apollo missions. These new missions will require that astronauts have the very best training and real-time operations support tools possible because a single error during task execution can have dire consequences in the hazardous lunar environment.

Training for lunar surface EVA during the Apollo era required physical models in labs, large hangars, or outdoor facilities. These settings had inherent distractions: observers, trainers, cameras, and other objects in the background environment reduced the immersiveness and overall efficacy of the training. Studies show that the more realistic a training environment is, the more effective the training, because realism improves “muscle memory,” which is critically important in hazardous environments. XR systems can mitigate the distractions posed by observers, trainers, background visuals, etc., in ways that were not possible in Apollo-era environments. The virtual environments that can be created are so lifelike that it can be extremely difficult to tell whether one is looking at a photograph of a real environment or a screen capture of a digitally created scene. XR systems also allow training that is typically too dangerous (e.g., evacuation scenarios that include fire, smoke, or other dangerous chemicals), too costly (e.g., buildup of an entire habitat environment with all of its subsystems), or not physically possible (e.g., incorporation of large-scale environments in a simulated lunar/Mars environment). In addition, XR systems are far easier and less expensive to reconfigure for different mission scenarios (i.e., it is easier, quicker, and less expensive to modify digital content than to create or modify physical mockups or other physical components).

The objective of this subtopic is to develop and mature XR technologies supporting EVA activities for lunar and subsequent Mars surface operations. NASA currently plans to have boots on the surface of the Moon in late 2024. The initial lunar missions will be short in duration, with limited science and exploration objectives, focusing instead on the checkout of core vehicle systems. Current XR capabilities can support these missions, so the scope of this subtopic focuses on technologies that can support subsequent missions, where mission durations are longer and where science, exploration, and lunar infrastructure development are higher priorities.

The three key technology areas of interest for this subtopic include:

  • A comprehensive, hyperrealistic XR real-time visualization system with multiresolution terrain: any location where astronauts carry out activities should have highly detailed terrain (centimeter resolution or finer), while areas where astronauts will not carry out activities should have resolution adequate for contextual situational awareness. The system should also incorporate photorealistic, interactive representative geological features (e.g., rocks, soil, cliff faces, lava tubes, etc.); photorealistic avatars of astronauts wearing representative space suits, properly rigged for motion capture/animation; and the assets needed to carry out the missions in the environment (e.g., habitats, landers, rovers, instruments, tools, etc.). Furthermore, the system should allow observers to join the digital environment virtually from a remote location and "tie" their viewpoint to an astronaut's viewpoint or to any location in the scene. All content in the environment should adhere to the appropriate physics.
  • High-precision, reliable tracking—This includes multiroom tracking that can provide the geolocation (and object registration) of the physical objects in use. The system must be able to track physical objects that may be part of a larger assembly (e.g., instruments on a rack) and thus must overcome limited line-of-sight to the external space. Accurate, reliable tracking of the hands and fingers is also important.
  • Real-time two-person redirected walking—The system should allow two individuals to walk around a very large, digitally created virtual reality (VR) terrain environment while physically present in a space the size of a small conference room, without colliding with each other.
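As a rough illustration of the multiresolution terrain requirement in the first item above, the sketch below selects a terrain tile's ground sample distance from its distance to the nearest planned activity site. The distance bands, resolution values, and data structures are illustrative assumptions, not requirements stated in this subtopic.

```python
import math

# Illustrative level-of-detail bands (assumed values, not NASA requirements):
# (upper bound on distance to nearest activity site in meters,
#  ground sample distance in meters per sample)
LOD_BANDS = [
    (50.0, 0.01),         # within 50 m of an activity site: ~1 cm detail
    (500.0, 0.10),        # mid-range: decimeter detail
    (float("inf"), 1.0),  # far field: meter-scale context terrain
]

def tile_resolution(tile_center, activity_sites):
    """Pick a terrain tile's ground sample distance based on its distance
    to the nearest planned EVA activity site (coordinates in meters)."""
    d = min(math.dist(tile_center, site) for site in activity_sites)
    for max_dist, gsd in LOD_BANDS:
        if d <= max_dist:
            return gsd
    return LOD_BANDS[-1][1]
```

A production system would stream tiles and blend between bands to avoid visible seams; this sketch only captures the core idea of tying resolution to planned activity locations.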

 

Although the technologies listed are framed in the context of lunar and subsequent Mars surface EVA activities, they are crosscutting in nature and have applications in many other areas across NASA.

Expected TRL or TRL Range at completion of the Project: 3 to 6

 

Primary Technology Taxonomy:
Level 1: TX 11 Software, Modeling, Simulation, and Information Processing
Level 2: TX 11.3 Simulation

 

Desired Deliverables of Phase I and Phase II:

  • Research
  • Analysis
  • Prototype
  • Hardware
  • Software
     

Desired Deliverables Description:

Phase I awards will be expected to develop theoretical frameworks and algorithms and to demonstrate feasibility (TRL 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities through development of a prototype system that includes all necessary hardware and software elements (TRL 6).

As appropriate for the phase of the award, Phase I and Phase II deliverables should include all algorithms and research results, clearly depicting metrics and performance of the developed technology in comparison to the state of the art (SOA). A software implementation of the developed solution, along with the simulation platform, must be included as a deliverable.

State of the Art and Critical Gaps:

Video game programmers and computer modeling artists currently lead the industry in SOA hyperrealistic, real-time VR environment development. Applying these concepts, along with human-computer interface methods, to areas outside the video game industry is known as “gamification.” New gamification concepts are increasing the realism, immersion, and ways that users can interact with XR systems. Companies like NVIDIA, Microsoft, Apple, Facebook, and others are developing XR capabilities that push the boundaries of what is possible across the spectrum, but small companies are also making significant contributions in many areas and finding innovative solutions for XR needs and gaps. Although considerable work has been done in industry to address several XR challenges, much remains to develop consistent, reliable, and robust solutions that address specific gaps related to the XR high-interest areas for this subtopic, which include:

  • Redirected walking (RW)—RW has been implemented successfully for large physical environments with one individual in the scene. Research papers and concepts have been published that show how one could approach the development of a redirected walking system for smaller spaces. Furthermore, there is published research related to the development of a system that can host two individuals in the scene, each wearing a VR head-mounted display (HMD), adjusting the visuals so that the individuals do not run into each other. Successful implementation of this research and these concepts is required.
  • Real-time hyperrealistic rendering of large virtual environments that includes high-level-of-detail terrain (appropriate detail for surface operations) and object models (instruments, tools, facilities, etc.).
  • Highly accurate torso, finger, hand, and object tracking across multiple rooms, including tracking of objects that may have limited visibility to the exterior environment.
  • Novel human-computer interface methods.
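To make the redirected-walking gap above more concrete, the sketch below injects an imperceptible curvature rotation each walking step, steering the user's physical path back toward the room center while the virtual path stays straight. The threshold values are illustrative, drawn from the redirected-walking literature cited in the references (users typically do not notice curvature radii of roughly 22 m or more); the function name and interface are assumptions for this sketch.

```python
import math

# Assumed perceptual threshold from the RW literature: curvature with a
# radius of ~22 m or more generally goes unnoticed by walking users.
MIN_CURVATURE_RADIUS_M = 22.0

def redirect_step(user_heading_rad, step_len_m, to_center_rad):
    """Return the extra virtual-world rotation (radians) to inject for one
    walking step, steering the user's physical path toward the room center
    while keeping curvature below the perceptual detection threshold."""
    # Signed heading error between the walking direction and the direction
    # to the room center, wrapped into [-pi, pi]
    err = (to_center_rad - user_heading_rad + math.pi) % (2 * math.pi) - math.pi
    # Maximum imperceptible curvature rotation for a step of this length
    max_curve_rot = step_len_m / MIN_CURVATURE_RADIUS_M
    # Rotate the virtual world toward the error, clamped to the threshold
    return max(-max_curve_rot, min(max_curve_rot, err))
```

A two-person system adds the hard part this subtopic calls for: the steering targets for each user must also keep the two physical paths apart, not just inside the room, which is why successful implementation of the published concepts remains an open gap.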

 

Relevance / Science Traceability:

XR technologies can facilitate many missions, including those related to human space exploration, and can be used during the planning, training, and operations-support phases. The Human Exploration and Operations Mission Directorate (HEOMD), the Space Technology Mission Directorate (STMD), the Science Mission Directorate (SMD), and the Artemis and Gateway programs could all benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA's directorates.

 

References:

  1. Before Going to the Moon, Apollo 11 Astronauts Trained at These Five Sites: From Arizona to Hawaii, these landscapes—similar in ways to the surface of the moon—were critical training grounds for the crew:  https://www.smithsonianmag.com/travel/going-moon-apollo-11-astronauts-trained-these-five-sites-180972452/
  2. NASA Tests Mixed Reality, Scientific Know-How, and Mission Operations for Exploration:  https://www.nasa.gov/feature/ames/analog-missions-mixed-reality
  3. The Past, Present and Future of XR for Space Exploration:  http://www.modsimworld.org/papers/2019/MODSIM_2019_paper_43.pdf
  4. See Photos of How Astronauts Trained for the Apollo Moon Missions:  https://www.history.com/news/moon-landing-apollo-11-training-photos
  5. How To Effectively Use XR Training In High-Risk Industries: 4 Examples:  https://roundtablelearning.com/how-to-effectively-use-xr-training-in-high-risk-industries/
  6. Training for space: Astronaut training and mission preparation:  https://www.nasa.gov/centers/johnson/pdf/160410main_space_training_fact_sheet.pdf
  7. Towards Virtual Reality Infinite Walking: Dynamic Saccadic Redirection:  https://research.nvidia.com/publication/2018-08_Towards-Virtual-Reality
  8. Virtual and Augmented Reality: 15 Years of Research on Redirected Walking in Immersive Virtual Environments:  https://www.cs.purdue.edu/cgvlab/courses/490590VR/notes/VRLocomotion/15YearsOfRedirectedWalking.pdf
  9. An Immersive Multi-User Virtual Reality for Emergency Simulation Training: Usability Study: https://www.immersivelearning.news/tag/multi-user/