Description:
Scope Title:
Extractable High-Resolution Terrain Database System
Scope Description:
The system would provide an extractable, high-resolution terrain database (<1 meter resolution), with all the correct metadata, created from digital elevation terrain data, 3D rock models, 3D human-made structure models, photos, lidar scans, etc. The database must be usable with the game/scene rendering engines most commonly used at NASA (Unreal, Omniverse, Unity, or Edge) to support the creation of highly immersive and highly performant simulation environments. The system should support large areas of interest (>90 km), be able to ingest and store all the data needed to create the desired high-resolution, performant simulation environment, and output terrain data files at desired levels of detail that can be used within the rendering engines mentioned above. The initial regions of interest are possible future NASA lunar landing sites, but the concept/system should also be usable for Mars or Earth locations of interest. The system should provide a high level of automation that reduces the manual effort currently required to build these types of databases, and graphical user interfaces (GUIs) should be part of the system to facilitate its use. This capability can be used to create immersive environments that support training, collaboration, analysis, planning, and real-time operations for future Artemis missions.
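The level-of-detail output described above can be illustrated with a minimal sketch. The function below builds a simple resolution pyramid from a DEM height grid by repeated 2x2 averaging; this is only a stand-in for the mesh decimation and tiling a real terrain pipeline would perform, and all names here are hypothetical.

```python
import numpy as np

def build_lod_pyramid(dem, levels):
    """Build a simple level-of-detail pyramid from a DEM height grid.

    Level 0 is full resolution; each subsequent level halves the grid
    by 2x2 averaging (a stand-in for the mesh/texture decimation a
    production terrain pipeline would perform).
    """
    pyramid = [np.asarray(dem, dtype=float)]
    for _ in range(1, levels):
        d = pyramid[-1]
        # Trim to even dimensions, then average each 2x2 block.
        h, w = d.shape[0] // 2 * 2, d.shape[1] // 2 * 2
        coarse = d[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(coarse)
    return pyramid

dem = np.arange(16.0).reshape(4, 4)   # toy 4x4 height grid
pyr = build_lod_pyramid(dem, levels=3)
```

Each level of such a pyramid would be exported as a separate terrain data file that the rendering engine selects from at run time.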
Expected TRL or TRL Range at completion of the Project: 3 to 6
Primary Technology Taxonomy:
- Level 1: 11 Software, Modeling, Simulation, and Information Processing
- Level 2: 11.X Other Software, Modeling, Simulation, and Information Processing
Desired Deliverables of Phase I and Phase II:
- Research
- Analysis
- Prototype
- Hardware
- Software
Desired Deliverables Description:
Phase I awards will be expected to develop theoretical frameworks, algorithms, and demonstrate feasibility (Technology Readiness Level (TRL) 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities with the development of a prototype system that includes all the necessary hardware and software elements (TRL 6).
As appropriate for the phase of the award, Phases I and II should include all the algorithms and research results clearly depicting metrics and performance of the developed technology in comparison to state of the art (SOA). Software implementation of the developed solution along with the simulation platform must be included as a deliverable.
State of the Art and Critical Gaps:
Currently, the development of the products requested from this system requires extensive manual, time-consuming steps that can be difficult to execute. The typical process requires users to search for all the data needed to create the models, which can include digital elevation models (DEMs), rock models, human-made structures, and other features of interest. Next, the developer manually adds metadata to the different models; the metadata can include geo-reference information, the size of the object, and any other features deemed important. Handcrafting is then performed to ensure that digital elevation data models from all sources are sized appropriately, color corrected, and inserted into the initial terrain models created from the DEM. Additional handcrafting is performed on certain models to ensure that they have the required resolution/fidelity. Once all the data sources are integrated, further handcrafting is required to give the system the multiresolution model features needed to render at the necessary frame rates. This is typically done by creating models at multiple resolutions: high-resolution models are used for areas near the user, and lower resolution models are used for regions farther away. As the user moves around, new high-resolution versions of the models are brought into the scene for the user's new location, and the high-resolution models for the area the user just left are swapped out for lower resolution versions. This swapping of models is often required to allow the system to render the scene at the required frame rates.
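The distance-based model swapping described above can be sketched in a few lines. The thresholds below are hypothetical; real engines tune them per frame budget and per tile size.

```python
import math

# Hypothetical distance thresholds (meters) at which the renderer swaps
# to a coarser terrain tile; beyond the last threshold, the coarsest
# level is used. Real engines tune these against the frame-rate budget.
LOD_DISTANCES = [50.0, 200.0, 800.0]

def select_lod(viewer_xy, tile_center_xy):
    """Pick a level of detail for a tile: 0 is finest,
    len(LOD_DISTANCES) is coarsest."""
    dist = math.dist(viewer_xy, tile_center_xy)
    for level, threshold in enumerate(LOD_DISTANCES):
        if dist < threshold:
            return level
    return len(LOD_DISTANCES)
```

As the user moves, re-evaluating this selection per tile is what triggers the swap from high- to low-resolution models and back.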
The proposed system would include a central storage location from which data can be retrieved for the creation of the models; this central location would facilitate the integration of the data. The system would also automate many of the manual, time-consuming steps currently required. New methods that create higher fidelity models using photogrammetry or other model-creation techniques could also be integrated into the system.
The approaches NASA currently uses to develop the necessary 3D high-resolution models are time consuming and difficult to follow. As NASA continues to develop simulations for use on future missions, these capabilities, and access to a system that can overcome these challenges, will become increasingly important.
Relevance / Science Traceability:
XR technologies can facilitate many missions, including those related to human space exploration. The technology can be used during the planning, training, and operations support phases. The Exploration Systems Development Mission Directorate (ESDMD), Space Operations Mission Directorate (SOMD), Space Technology Mission Directorate (STMD), and Science Mission Directorate (SMD), as well as the Artemis and Gateway programs, could benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA's directorates.
https://www.nasa.gov/directorates/heo/index.html
https://www.nasa.gov/directorates/spacetech/home/index.html
https://www.nasa.gov/specials/artemis/
This type of capability would enable the development of immersive systems that could support planning, analysis, training, and collaborative activities related to surface navigation for Artemis missions. Earth Science could also benefit from this type of capability by allowing systems to be developed that can support vegetation dispersion, human interaction with the environment, etc.
References:
https://link.springer.com/referenceworkentry/10.1007/978-1-4614-8265-9_226
https://ieeexplore.ieee.org/document/609187
Scope Title:
Augmented Reality Navigation
Scope Description:
The system should provide Google Maps-style navigation both outdoors and inside buildings. It should allow AR applications to be developed that do not require QR-code-style visual markers while still providing highly accurate six-degrees-of-freedom (6DOF) position (<1 cm), as well as highly accurate altitude and attitude information. The system should be usable with smart devices (tablets, smartphones) running both the iOS and Android operating systems, and should also support head-worn AR devices. This type of system will allow AR applications to be developed that accurately overlay points of interest and meta-information about those points of interest. It will also enable applications that help users carry out activities more autonomously by guiding them through unfamiliar facilities and through the steps required to carry out procedures.
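Overlaying a point of interest from a tracked 6DOF pose amounts to transforming the point into the camera frame and projecting it. The sketch below uses a pinhole camera model with made-up intrinsics (fx, fy, cx, cy); it is an illustration of the geometry, not any particular AR framework's API.

```python
import numpy as np

def project_poi(poi_world, cam_position, cam_rotation,
                fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project a world-frame point of interest to pixel coordinates
    using a tracked 6DOF camera pose (rotation matrix + position) and
    a pinhole camera model (hypothetical intrinsics). Returns None if
    the point is behind the camera and cannot be overlaid.
    """
    # World -> camera frame: undo the camera rotation and translation.
    p_cam = cam_rotation.T @ (np.asarray(poi_world, dtype=float)
                              - np.asarray(cam_position, dtype=float))
    if p_cam[2] <= 0:
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)
```

The <1 cm pose accuracy requirement matters here because any error in cam_position or cam_rotation shifts every overlaid label by a proportional amount on screen.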
Expected TRL or TRL Range at completion of the Project: 4 to 6
Primary Technology Taxonomy:
- Level 1: 11 Software, Modeling, Simulation, and Information Processing
- Level 2: 11.X Other Software, Modeling, Simulation, and Information Processing
Desired Deliverables of Phase I and Phase II:
- Research
- Analysis
- Prototype
- Software
- Hardware
Desired Deliverables Description:
Phase I awards will be expected to develop theoretical frameworks, algorithms, and demonstrate feasibility (TRL 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities with the development of a prototype system that includes all the necessary hardware and software elements (TRL 6).
As appropriate for the phase of the award, Phases I and II should include all the algorithms and research results clearly depicting metrics and performance of the developed technology in comparison to state of the art. Software implementation of the developed solution along with the simulation platform must be included as a deliverable.
State of the Art and Critical Gaps:
Industry has made significant progress developing markerless navigation technologies. These technologies are typically used on smartphones/tablets and require calibration steps for their use. A key player in the outdoor AR navigation field is the automobile industry, where navigation information can be displayed directly on the windshield or on a screen in the driver's direct line of sight. Significant users of indoor navigation technologies include warehouses, where people can be guided to specific locations to find items. Improvements to both indoor and outdoor AR navigation systems are important, since NASA has use cases for both.
Current gaps that should be addressed for future systems include the overall use of the technology on head-worn devices as well as smartphones/tablets. Additionally, the accuracy of the system should be improved to allow NASA to use the capability to support indoor electronic-procedure use cases that require high-precision 6DOF data. How users should interact with AR navigation systems (i.e., the GUIs and other human interface methods they will use) should also be investigated further.
Relevance / Science Traceability:
XR technologies can facilitate many missions, including those related to human space exploration. The technology can be used during the planning, training, and operations support phases. The Exploration Systems Development Mission Directorate (ESDMD), Space Operations Mission Directorate (SOMD), Space Technology Mission Directorate (STMD), and Science Mission Directorate (SMD), as well as the Artemis and Gateway programs, could benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA's directorates.
https://www.nasa.gov/directorates/heo/index.html
https://www.nasa.gov/directorates/spacetech/home/index.html
https://www.nasa.gov/specials/artemis/
Being able to have head-up displays (HUDs) in a helmet bubble, head-mounted displays (HMDs), or windshields that provide navigation cues to locations of interest, or that augment those locations with additional information, will be important in the design of next-generation vehicles and suits. Furthermore, navigation aids will augment an astronaut's ability to carry out medical procedures more autonomously. They will also allow certain procedures to be carried out that would not otherwise be possible by providing instructions on the exact placement and movement of medical instruments. Any system that reduces risk, improves operations, and allows for more autonomous operations is important to many different NASA directorates, including ESDMD, SOMD, STMD, and SMD. The Artemis and Gateway programs will also be able to infuse these technologies into future missions.
References:
https://mobidev.biz/blog/augmented-reality-indoor-navigation-app-developement-arkit
https://www.researchgate.net/publication/342383348_Augmented_Reality_Navigation
Scope Title:
Metaverse/Digital Twin
Scope Description:
The popularity of the metaverse has continued to grow, with companies hailing it as the immersive visualization system of the future and investing billions of dollars toward its development. Although people define the metaverse differently, the fundamental idea is that it provides a shared, multiuser, persistent, and highly immersive environment. This environment can be used to collaborate, carry out training and design activities, host entertainment events, etc.; these are activities that are important to NASA. An important component of the metaverse is the digital twin: a digital representation of a physical system that mimics the actual system throughout its lifecycle. The twin receives real-time telemetry to stay current, provides situational awareness, and uses simulation, machine learning, and model-based reasoning to predict future outcomes and support decision making. Digital twins are sometimes referred to as the "building blocks" of the metaverse.
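The telemetry-in, prediction-out pattern of a digital twin can be reduced to a minimal sketch. The class below mirrors the latest telemetry and predicts a future value by linear extrapolation of the last two samples; a real twin would use physics simulation or machine-learned models, and all names here are illustrative.

```python
class DigitalTwin:
    """Minimal illustration of the digital-twin pattern: ingest
    real-time telemetry to stay current with the physical asset, then
    use a model (here, simple linear extrapolation) to predict a
    future state for decision support."""

    def __init__(self):
        self.history = []            # (time, value) telemetry samples

    def ingest(self, t, value):
        """Record one telemetry sample from the physical system."""
        self.history.append((t, value))

    def predict(self, t_future):
        """Extrapolate the last two samples to a future time."""
        (t0, v0), (t1, v1) = self.history[-2:]
        rate = (v1 - v0) / (t1 - t0)
        return v1 + rate * (t_future - t1)

twin = DigitalTwin()
twin.ingest(0.0, 20.0)    # e.g., a temperature reading at t = 0 s
twin.ingest(10.0, 22.0)   # rising 0.2 units/s
```

The same loop generalizes: the richer the model behind predict(), the more the twin can support situational awareness and predictive analytics.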
The scope of this focus area is to develop an XR architecture and applications that enable easy access to, and collaboration on, digital content within a metaverse/digital twin environment. The system developed should investigate the value added by a metaverse/digital twin environment to improve training, real-time operations support, collaboration, data visualization and analysis, and predictive analytics. The system should also optimize the human interfaces (GUIs and input devices) for interaction among people and between people and facilities, instruments, etc.
Expected TRL or TRL Range at completion of the Project: 3 to 6
Primary Technology Taxonomy:
- Level 1: 11 Software, Modeling, Simulation, and Information Processing
- Level 2: 11.X Other Software, Modeling, Simulation, and Information Processing
Desired Deliverables of Phase I and Phase II:
- Research
- Analysis
- Prototype
- Hardware
- Software
Desired Deliverables Description:
Phase I awards will be expected to develop theoretical frameworks, algorithms, and demonstrate feasibility (TRL 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities with the development of a prototype system that includes all the necessary hardware and software elements (TRL 6).
As appropriate for the phase of the award, Phases I and II should include all the algorithms and research results clearly depicting metrics and performance of the developed technology in comparison to state of the art. Software implementation of the developed solution along with the simulation platform must be included as a deliverable.
State of the Art and Critical Gaps:
Many organizations have jumped on the metaverse/digital twin bandwagon. Although there are many definitions of the metaverse, most will agree that it is an immersive environment (AR or VR) that is persistent, online, and multiuser. Currently, metaverse technologies are being driven by companies that specialize in gaming, entertainment, or social networking; however, there are also many engineering and science applications of these technologies. Some examples of the current state of the art include:
- The Earth 2.0 project, which pairs a data-rich Earth digital twin simulation with a highly immersive visualization application to carry out climate/weather forecasting.
- Industry's use of digital twins to design and operate next-generation buildings and warehouses. Warehouses use digital twins of the facility and all the operators in the building (robots, people, other systems) to improve operations. These digital twins are also used to carry out predictive analytics and to train autonomous robots to operate in a physical environment.
- Epic Games' Fortnite system, which has been used to host concerts and other events attended by tens of millions of people. The system has allowed limited interaction to take place among large numbers of people in a purely digital environment.
- The Roblox system, which allows users to create custom worlds that can then be linked, further demonstrating how content can be created and linked into a metaverse visited by large numbers of people.
- NVIDIA's Omniverse platform, which allows applications to communicate seamlessly with each other to create highly immersive experiences.
The following are the challenges/gaps that should be addressed during this effort:
- The need to improve/optimize the human interfaces for interaction between people and the digital/physical environments. This includes both the input devices and the display devices.
- The computation needed for the metaverse/digital twin environment and where this computation would take place (cloud vs. edge).
- The IT security requirements to run the distributed, multilocation, and multiuser environments.
- The need for platform interoperability. The system should be device agnostic and able to run on an assortment of devices.
Relevance / Science Traceability:
XR technologies can facilitate many missions, including those related to human space exploration. The technology can be used during the planning, training, and operations support phases. The Exploration Systems Development Mission Directorate (ESDMD), Space Operations Mission Directorate (SOMD), Space Technology Mission Directorate (STMD), and Science Mission Directorate (SMD), as well as the Artemis and Gateway programs, could benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA's directorates.
https://www.nasa.gov/directorates/heo/index.html
https://www.nasa.gov/directorates/spacetech/home/index.html
https://www.nasa.gov/specials/artemis/
Metaverse/digital twin technologies are being used in industry to reduce risks/costs, improve operations/training/collaboration, support education/outreach, etc. Improvements in these areas would also add value to many NASA programs/directorates, including Aeronautics, Human Exploration, Science, Space Technology, Artemis, Gateway, etc.
References:
https://en.wikipedia.org/wiki/Metaverse