
Blended Reality Solution for Live, Virtual, and Constructive Field Training


TECHNOLOGY AREA(S): Human Systems 

OBJECTIVE: Develop a rugged, lightweight system that provides high-fidelity blended reality for outdoor, ground-based Battlefield Airmen LVC training, and evaluate its learning utility. 

DESCRIPTION: Live, virtual, and constructive (LVC) training methods have been successfully applied to the tactical fast-jet and Joint Terminal Attack Controller (JTAC) domains [1, 2]. The use of simulation reduces the need for support staff, live air assets, fuel, ammunition, and volunteers. By creating a virtual simulated environment, training instructors gain a flexible framework that can support a variety of training scenarios more cost-effectively than live range training. 

Simulator-based training works well for pilots and JTACs because their environments can be replicated in an indoor simulator with relatively little space. However, given the broad set of techniques and procedures associated with the Battlefield Airmen specialties (e.g., pararescue [3]), full mission profile training is very challenging within the space limitations of a confined simulator. Further, while pilots and JTACs interact with the environment through their respective equipment, many Battlefield Airmen mission sets require operators to interact largely with the physical world around them. Thus, there is a need for a solution that enables LVC concepts for outdoor, ground-based full mission profile training; no method to implement such training currently exists. Such a solution would leverage simulation methods to offset the costs associated with live training while providing the best learning and training experiences to the United States' warfighters. 

This STTR will evaluate approaches for the development of a blended reality solution that can be used in outdoor training. We define a blended reality solution as a system that allows trainees to interact with both the live and virtual environments simultaneously. 
The desired approach would track the location and head orientation of training participants within the virtual space; employ a head-mounted, see-through display that overlays virtual-world elements, such as entities or buildings, on the live world around the trainee; and include personal audio of the blended reality environment. This approach has the advantage of injecting virtual and constructive entities while allowing trainees to fully interact with their live environment. Further, the system must be lightweight and rugged, given its military end users. The addition of auditory stimulation is desired to provide an immersive, realistic environment for trainees. The STTR will also assess and validate the training utility of the system using data-driven learning metrics; such metrics should be automated and unobtrusive so that training effectiveness and learner engagement can be tracked. The desired system will provide a framework for simulation training for personnel recovery and other ground-based warfighters. The training enabled by this technology will encompass many of the tasks of a Battlefield Airman, such as medical care under fire, that would be difficult or costly to perform without simulation. The system should utilize LVC protocols and standards during development to allow for future system interoperability [4]. Government-furnished equipment will not be provided. 
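At its core, the see-through overlay described above is a geo-registration problem: given the trainee's tracked position and head orientation, the system must decide where (and whether) a virtual entity appears in the display. The sketch below illustrates one way to frame that computation under simplifying assumptions (flat-earth local offsets, which are adequate only over a small training area, and a fixed rectangular field of view); all function names and the field-of-view values are illustrative, not drawn from this topic.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius, metres

def enu_offset(lat0, lon0, alt0, lat1, lon1, alt1):
    """Flat-earth East/North/Up offset (metres) from observer to target.
    A reasonable approximation only at the short ranges of an outdoor
    training area; a fielded system would use a proper geodetic transform."""
    d_lat = math.radians(lat1 - lat0)
    d_lon = math.radians(lon1 - lon0)
    east = d_lon * EARTH_R * math.cos(math.radians(lat0))
    north = d_lat * EARTH_R
    up = alt1 - alt0
    return east, north, up

def bearing_elevation(east, north, up):
    """True bearing (degrees, clockwise from north) and elevation (degrees)."""
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    ground = math.hypot(east, north)
    elevation = math.degrees(math.atan2(up, ground))
    return bearing, elevation

def in_fov(bearing, elevation, heading, pitch, h_fov=40.0, v_fov=30.0):
    """Would the entity fall inside a display with the given field of view?
    The 40x30-degree FOV is purely illustrative."""
    d_az = (bearing - heading + 180.0) % 360.0 - 180.0  # signed azimuth error
    d_el = elevation - pitch
    return abs(d_az) <= h_fov / 2 and abs(d_el) <= v_fov / 2
```

A virtual entity 100 metres due north of a trainee who is facing north would render near the centre of such a display; if the trainee turns to face east, the same entity falls outside the horizontal field of view and is culled.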
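Reference [4] points to the Distributed Interactive Simulation (DIS) standard as the interoperability layer. As a minimal illustration of what that entails, every DIS PDU begins with a common 12-byte header packed big-endian; the sketch below builds that header with Python's standard struct module. The caller-supplied field values are illustrative, and a real implementation would normally rely on an existing DIS library rather than hand-packing PDUs.

```python
import struct

# DIS common PDU header fields, per IEEE 1278.1-2012.
DIS_VERSION_2012 = 7      # protocolVersion for the 2012 revision
PDU_ENTITY_STATE = 1      # pduType: Entity State
FAMILY_ENTITY_INFO = 1    # protocolFamily: Entity Information/Interaction

def pack_pdu_header(exercise_id, timestamp, pdu_length):
    """Pack the 12-byte DIS PDU header: version, exercise ID, PDU type,
    protocol family (1 byte each), timestamp (uint32), total PDU length
    in bytes (uint16), and 16 bits of padding, all big-endian."""
    return struct.pack(">BBBBIHH",
                       DIS_VERSION_2012, exercise_id,
                       PDU_ENTITY_STATE, FAMILY_ENTITY_INFO,
                       timestamp, pdu_length, 0)
```

Building on a shared wire format like this (rather than a proprietary one) is what allows a Phase II prototype to later join federated LVC exercises alongside existing simulators.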

PHASE I: Conduct a detailed analysis of existing technologies that may be utilized to create a blended reality LVC solution. Conceptualize and design an innovative blended reality solution for outdoor ground-based Battlefield Airmen LVC training. Develop an initial concept design and model key elements that will be fully developed in Phase II. 

PHASE II: Develop a prototype and demonstrate the blended reality training solution selected in Phase I. Determine the fidelity, robustness, and learning-utility metrics and levels required for an effective training system. Systematically collect operator feedback and evaluate the system against those metrics. Summarize technical achievements, metrics analysis, collected feedback, and performance tradeoff decisions in a technical report. 

PHASE III: Refine the design based on the outcomes of the demonstrations, tests, and customer feedback in Phase II. Transition the capability to militarily useful platforms. Produce production-representative prototypes. Provide user and maintainer manuals. Develop cost and schedule estimates for full-rate production. 


1. Schreiber, B. T., Schroeder, M., & Bennett, W., Jr. (2011). Distributed Mission Operations Within-Simulator Training Effectiveness. The International Journal of Aviation Psychology, 21(3), 254-268.

2. Reitz, E. A., & Seavey, K. (2014). Distributed Live/Virtual Environments to Improve Joint Fires Performance. Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), 2014.

3. AFI 16-1202, Pararescue Operations, Techniques, and Procedures, 3 May 2001.

4. IEEE Standard for Distributed Interactive Simulation (DIS) – Application Protocols, IEEE Standard 1278.1-2012.


KEYWORDS: Live Virtual And Constructive, Blended Reality, Battlefield Airmen, Pararescue, Training 
