Augmented Reality Training for Dismounted Soldiers

DIRECT TO PHASE II

TECHNOLOGY AREA(S): Electronics, Human Systems

OBJECTIVE: Design and fabricate an integrated Augmented Reality (AR) system for use by Dismounted Soldiers that demonstrates high levels of immersion in live indoor and outdoor environments and demonstrates future interoperability, in both single and multiplayer (collective) configurations, with the evolving Synthetic Training Environment (STE).

DESCRIPTION: Long perceived as an emerging technology, AR is making its way into smartphones and tablets as next-generation image-capture and Heads-Up Display (HUD) technologies mature. The US Army of 2025 and beyond requires a robust, realistic, and adaptable training capability, and AR technologies will enable the integration of synthetic simulations with live training environments. This topic seeks to integrate state-of-the-art electronics, packaging, and augmentation technologies with the latest low-power data, computing, and rendering components in a single man-wearable package.

Currently, the COTS industry offers several emerging capabilities that show great promise for home and/or industrial use, and these capabilities appear to have some dismounted Soldier training value when combined into a wholly integrated solution. Integrating these capabilities as-is may not be sufficient, however, because of concerns about ruggedness, interference (e.g., wireless, magnetic, optical occlusion), weather resistance, and so on. The resulting system may therefore require modification of these COTS components and/or the creation of new components to address any capability gaps. Soldiers utilizing the system should experience minimal encumbrance to their existing tactical/training equipment and gear. The system should support a squad-sized unit and should have a clear design and architecture path to scale up to the platoon level.

The DoD has a critical need for breakthrough man-wearable technologies, and this topic seeks to develop and demonstrate an advanced AR prototype system that demonstrates lightweight and affordable approaches for enhancing the ability of live Soldiers to train with virtual and live entities in live environments. The prototype system must include real-time live/virtual bridging; correlated content; low-latency augmented reality with static/dynamic occlusion and depth sensing; indoor and outdoor operation; support for all lighting conditions (dark night to bright sunlight); real-time localized haptic feedback; full integration with weapons and existing Soldier equipment; multimodal man-machine interfaces; and sensing of full-body articulation for interaction with virtual content (equipment, avatars, etc.) and presentation to other virtual/gaming/constructive training systems within the Army's Synthetic Training Environment (STE) initiatives. The approach must also provide methods to rapidly map live 3D spaces for new deployments and future training exercises, along with natural blending of virtual content into the live display (static/dynamic lighting, shadows, etc.). Finally, the system must provide reliable real-time telemetry to support high-fidelity distributed after action review (AAR), remote monitoring and configuration, and cloud development and content-delivery strategies.
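
To make the occlusion requirement concrete: static/dynamic occlusion reduces to a per-pixel depth comparison between the sensed live scene and the rendered virtual content. The Python/NumPy sketch below illustrates that test under simplifying assumptions (the live depth map and virtual depth buffer are already registered to the same view); all names are illustrative and do not refer to any specific product or program interface.

    import numpy as np

    def composite_with_occlusion(live_rgb, live_depth, virt_rgb, virt_depth, virt_alpha):
        """Show a virtual fragment only where it is closer to the viewer
        than the live surface at the same pixel.

        live_rgb   : HxWx3 camera image
        live_depth : HxW   metric depth from the depth sensor
        virt_rgb   : HxWx3 rendered virtual content
        virt_depth : HxW   virtual depth buffer (np.inf where empty)
        virt_alpha : HxW   virtual coverage in [0, 1]
        """
        # Virtual content survives only where it sits in front of the live scene.
        visible = (virt_depth < live_depth) & (virt_alpha > 0.0)
        a = np.where(visible, virt_alpha, 0.0)[..., None]
        return (a * virt_rgb + (1.0 - a) * live_rgb).astype(live_rgb.dtype)

In a fielded HMD this comparison would run in the display pipeline at frame rate, and dynamic occlusion additionally requires the live depth map to be re-sensed every frame, which is what couples the occlusion requirement to the low-latency and depth-sensing requirements above.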

Proposals should target the design and implementation of a COTS-based man-wearable augmented reality system and its supporting components. Essential elements include a wide field-of-view wireless head-mounted display (WHMD); human articulation tracking technologies; flexible direct electronic interfaces to haptic sensors; and low-power pre-processing circuitry that converts 6-DOF pose and 3D depth sensor signals into formats that can be transmitted wirelessly to after action review and monitoring systems. Packaging must leverage state-of-the-art miniaturized sensor, processing, and rendering packaging that incorporates on-board wireless power reception and conditioning circuitry.
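
As an illustration of the pre-processing described above, the sketch below packs one timestamped 6-DOF pose sample (position plus orientation quaternion) into a fixed 32-byte wire format of the kind that could be transmitted wirelessly to after action review and monitoring systems; the field layout is an assumption for illustration, not a mandated format.

    import struct
    import time

    # Hypothetical layout: 4-byte millisecond timestamp, 3 floats of
    # position (m), 4 floats of orientation quaternion -> 32 bytes.
    POSE_FMT = "<I3f4f"

    def pack_pose(t_ms, pos, quat):
        """Serialize one 6-DOF pose sample for wireless telemetry."""
        return struct.pack(POSE_FMT, t_ms & 0xFFFFFFFF, *pos, *quat)

    def unpack_pose(payload):
        t_ms, px, py, pz, qx, qy, qz, qw = struct.unpack(POSE_FMT, payload)
        return t_ms, (px, py, pz), (qx, qy, qz, qw)

    sample = pack_pose(int(time.monotonic() * 1000),
                       (1.0, 2.0, 0.5), (0.0, 0.0, 0.0, 1.0))
    assert len(sample) == struct.calcsize(POSE_FMT)  # 32 bytes

At an assumed 60 Hz update rate this is roughly 1.9 kB/s of pose telemetry per Soldier before any depth or event data, which is the kind of figure that drives the power-bandwidth trade-offs discussed below.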

Technical challenges may include:
• Developing a wide field-of-view, high-contrast, wireless HMD capable of providing clear mixed/augmented-reality imagery indoors and outdoors, across a wide variety of lighting conditions and operational spaces, that a Soldier can wear for long periods without significant eye/head fatigue.
• Maximizing the scalability and bandwidth-power product of both the on-board devices and the external wireless data and power interfaces, while staying within safe heat-dissipation limits for extended human use.
• Establishing optimal trade-offs among the physical, electronic, and data-transmission specifications required to minimize the componentry bill of materials (BoM), and hence the size and weight of the devices mounted on the human.
• Determining optimal power-bandwidth trade-offs and scalability to support extended training exercises using the man-wearable technologies (a back-of-envelope budget is sketched after this list).
• Developing enhanced virtual content capable of blending naturally into the live lighting environment.
• Demonstrating the ability of multiple dismounted Soldiers to train together in a common location without interference or degradation of AR sensor/wireless telemetry performance.
• Providing for distributed training concepts in which the immersed human seamlessly trains and interacts with live Soldiers and other training-system interfaces (virtual, game, constructive).
• Developing enhanced augmented-reality dismounted Soldier training scenarios that exploit the additional capabilities of mixed/augmented reality.
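
As promised in the power-bandwidth item above, the following back-of-envelope budget shows how quickly per-Soldier telemetry multiplies at squad scale; every figure is an assumption chosen for illustration, not a requirement of this topic.

    # Hypothetical squad-level telemetry budget; all constants assumed.
    SOLDIERS        = 9       # full Army squad
    POSE_HZ         = 60      # pose samples per second per Soldier
    POSE_BYTES      = 32      # packed sample size (see earlier sketch)
    DEPTH_KEYFRAMES = 1       # compressed depth keyframes per second
    DEPTH_KB        = 50      # kilobytes per compressed keyframe

    per_soldier_bps = (POSE_HZ * POSE_BYTES * 8
                       + DEPTH_KEYFRAMES * DEPTH_KB * 1024 * 8)
    squad_bps = SOLDIERS * per_soldier_bps
    print(f"per Soldier: {per_soldier_bps / 1e6:.2f} Mbit/s, "
          f"squad: {squad_bps / 1e6:.2f} Mbit/s")
    # per Soldier: 0.42 Mbit/s, squad: 3.82 Mbit/s

    # At an assumed radio cost of 50 nJ/bit, sustained squad telemetry
    # draws about squad_bps * 50e-9 ~= 0.19 W of radio energy alone,
    # before processing, rendering, and display power are counted.

Even with generous assumptions, nine co-located transmitters contending for one channel is what makes the interference and heat-dissipation items above design drivers rather than afterthoughts.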

PHASE I: Determine the feasibility of, and an approach for, developing integrated augmented reality technologies that meet training requirements in support of US Army dismounted Soldier training initiatives within live training domain environments. Tasks include a cognitive task analysis to understand the competencies and knowledge requirements associated with dismounted training; a technology analysis to guide the application and trade-off of key components, approaches, and subsystems; and research to evaluate the impact of augmented reality technologies on trainee understanding.

PHASE II: Develop, demonstrate, and deliver a working prototype of an augmented-reality-based dismounted Soldier training capability for a full nine-Soldier Army squad that can be utilized within live domain training environments. The prototype system must track Soldier training timelines, objectives, and actions taken or received from others, and must provide visual/haptic cues in response to those actions. Demonstrations will be at TRL 6. Phase II deliverables include the full system design and specifications, including executable and source code.
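
One lightweight way to satisfy the tracking requirement above is an append-only event log keyed by exercise time, Soldier, and action, which can be replayed for distributed AAR. The schema below is a hypothetical sketch, not a specified Phase II interface.

    from dataclasses import dataclass, field, asdict
    import json

    @dataclass
    class TrainingEvent:
        """One reviewable AAR record: who did what to whom, and when."""
        t_ms: int            # exercise clock, milliseconds
        soldier_id: str      # acting Soldier
        action: str          # e.g. "weapon_fired", "hit_received" (illustrative)
        target_id: str = ""  # other live/virtual entity involved, if any
        detail: dict = field(default_factory=dict)

    def log_event(stream, event: TrainingEvent) -> None:
        # JSON-lines keeps the log appendable during the exercise and
        # trivially replayable for after action review.
        stream.write(json.dumps(asdict(event)) + "\n")

Because each record is self-describing, the same stream can feed live remote monitoring and post-exercise AAR without a separate schema.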

DIRECT TO PHASE II (DP2): Offerors interested in submitting a DP2 proposal in response to this topic must provide documentation substantiating that the scientific and technical merit and feasibility described in the Phase I section of this topic have been met, and must describe the potential commercial applications. The offeror's related DP2 proposal will not be evaluated without adequate Phase I feasibility documentation. Documentation should include all relevant information, including but not limited to: technical reports, test data, prototype designs/models, and performance goals/results. Please read the OSD SBIR 16.2 Direct to Phase II instructions.

PHASE III DUAL USE APPLICATIONS: Refine the design, continue technology investigation and integration into a prototype baseline, and implement basic modeling methods, algorithms, and interfaces. Pursue full integration within the Live Training Transformation (LT2) and Tactical Engagement Simulation Systems (TESS) product lines to define an implementation solution. Continue to develop models, procedures, and actions and reactions with virtual content, ensuring complete traceability to dismounted Soldier training requirements. Ensure product-line development between live-domain and virtual/gaming solutions, targeting integration into the Army's Synthetic Training Environment (STE) and planned training technology matrices with cloud-based content and development strategies.

REFERENCES:
  • Naval Research Laboratory, Washington, D.C. 20375-5320, "Advancing Human Centered Augmented Reality Research," 2004.
  • Naval Research Laboratory, Washington, D.C. 20375-5320, "The Development of Mobile Augmented Reality," 2012.
  • Livingston, M., Gabbard, J., Swan II, J., Sibley, C., & Barrow, J., "Basic Perception in Head-Worn Augmented Reality Displays," in Human Factors in Augmented Reality Environments (pp. 33-66), Springer, New York, 2012.
  • Kim, G., Perey, C., & Preda, M., eds., "Mixed and Augmented Reality Reference Model," ISO/IEC CD 24-29-1, July 2014.
  • Crutchfield, R., et al., "Live Synthetic Training, Test & Evaluation Infrastructure Architecture, A Service Oriented Architecture Approach," MITRE Technical Report MTR 150046, 20 February 2015.
  • Kumar, R., et al., "Implementation of an Augmented Reality System for Training Dismounted Warfighters," paper no. 12149, Interservice/Industry Training, Simulation, and Education Conf. (I/ITSEC), 2012.
  • You, S., Neumann, U., & Azuma, R., "Orientation Tracking for Outdoor Augmented Reality Registration," IEEE Computer Graphics and Applications, November/December 1999.
  • PEO-STRI, "Synthetic Training Environment (STE) Technology / Industry Day," 1-2 September 2015.

KEYWORDS: Head Mounted Display, Haptics, Augmented Reality, Human Computer Interaction, Training, Embedded Training

 
