Accurate Situational Awareness using Augmented Reality Technology

DIRECT TO PHASE II

TECHNOLOGY AREA(S): Electronics, Human Systems

OBJECTIVE: Provide an enhanced, real-world experimentation and prototyping capability to Soldiers who are learning to use sensors, sensor imagery, geolocation information, Situational Awareness (SA), and command and control information in novel ways through virtual reality (VR), augmented reality (AR), and augmented virtuality (AV).

DESCRIPTION: Urban combat requires full situational understanding and accurate, timely information for rapid and decisive action. Current solutions require Warfighters to look away from the battlefield at a display and manually mark items, losing SA, accuracy, and understanding. Fusing information onto displays is inefficient and ineffective, hindering rapid and decisive action by small units in their Area of Responsibility (AOR). Further, there is a lack of connectivity and information sharing between mounted and dismounted Warfighters.

We seek the ability to provide imagery to Soldiers in the back of a vehicle, but the issues associated with that capability are unknown. Open questions include:
  • What level of detail is sufficient to provide accurate SA to the Soldier?
  • What update rate is required to avoid motion sickness?
  • Does the position of the Soldier in the vehicle relative to the location of the display affect understanding and efficacy?
  • What are the problems with using geo-registration? A short-range camera with a wide field of view (FOV) provides accurate location; how can a long-range camera provide accurate geo-registration? (A back-of-the-envelope illustration follows this list.)
  • How can we automate the matching of Digital Terrain Elevation Data (DTED) to the visible horizon? If current solutions rely on landmarks, what can be used when landmarks are not readily available?
  • Overall, what is the accuracy of VR/AV solutions, and how can we ensure that an icon is accurately matched to its target?
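As a back-of-the-envelope illustration of the short-range versus long-range geo-registration question, the sketch below (numbers are illustrative, not requirements) shows that for a fixed angular heading/attitude error, the ground error of a projected icon grows linearly with range, which is why a pose error that is tolerable for a short-range, wide-FOV camera becomes unacceptable for a long-range sensor:

    import math

    def icon_placement_error(range_m: float, angular_error_deg: float) -> float:
        """Cross-range ground error (meters) of a geo-registered icon
        caused by a fixed angular pose error: error = range * tan(err)."""
        return range_m * math.tan(math.radians(angular_error_deg))

    # Illustrative numbers only: the same 0.5 degree pose error at three ranges.
    for range_m in (50, 500, 2000):
        print(f"range {range_m:>4} m -> icon error "
              f"{icon_placement_error(range_m, 0.5):5.1f} m")

At 50 m the icon is off by about 0.4 m; at 2,000 m the same angular error displaces it by roughly 17 m, far more than a target's extent.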

We believe these issues can be addressed with a capability that provides VR/AV prototypes in the context of target acquisition experimentation, with the goal of increasing Soldier performance and familiarization with the increased SA. Experimentation could include, but is not limited to: lightweight, flexible displays or optics that can be integrated into protective eyewear or helmet-mounted displays; mobile electronics; game-based systems; intelligent tutoring; enhanced character behaviors; and the efficient use of terrain databases and models for target acquisition experimentation.
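A concrete element of any such prototype is placing an icon over a geo-located target in the display. The sketch below is a minimal version of that step, assuming an idealized, distortion-free pinhole display model, zero pitch and roll, and local east-north-up (ENU) coordinates; the function name and all parameter values are illustrative:

    import numpy as np

    def project_to_display(target_enu: np.ndarray, camera_enu: np.ndarray,
                           yaw_deg: float, fov_h_deg: float,
                           width_px: int, height_px: int):
        """Project a geo-located target (local ENU, meters) into display
        pixel coordinates for a level pinhole camera heading `yaw_deg`
        (clockwise from north). Returns None if the target is behind the
        camera or outside the field of view."""
        d = target_enu - camera_enu
        yaw = np.radians(yaw_deg)
        fwd = np.array([np.sin(yaw), np.cos(yaw), 0.0])     # camera forward
        right = np.array([np.cos(yaw), -np.sin(yaw), 0.0])  # camera right
        up = np.array([0.0, 0.0, 1.0])                      # camera up
        z = d @ fwd
        if z <= 0:
            return None
        f = (width_px / 2) / np.tan(np.radians(fov_h_deg) / 2)  # focal length, px
        u = width_px / 2 + f * (d @ right) / z
        v = height_px / 2 - f * (d @ up) / z
        if 0 <= u < width_px and 0 <= v < height_px:
            return (u, v)
        return None

    # Target 300 m north and 20 m east of a sensor 1.5 m above ground,
    # shown on a 1280x720 display with a 40 degree horizontal FOV.
    print(project_to_display(np.array([20.0, 300.0, 0.0]),
                             np.array([0.0, 0.0, 1.5]),
                             yaw_deg=0.0, fov_h_deg=40.0,
                             width_px=1280, height_px=720))

Any error in the assumed camera pose shifts the computed pixel; that shift is exactly the icon-to-target mismatch the questions above ask about.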

PHASE I: The offeror will survey existing capabilities and propose solutions to the issues identified above in providing SA imagery to mounted and dismounted Soldiers. The offeror will select a limited number of challenge areas to research in order to create an experimental design and methodology for augmenting target acquisition performance measurement and experimentation. The phase will result in a study and report on the challenges associated with a VR/AV capability, an experiment design for use in a perception testing laboratory, and a detailed research plan for executing a Phase II prototype.
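By way of illustration, such a measurement methodology might reduce each trial to a small record and aggregate per-condition metrics. In the sketch below, the Trial fields, condition names, and sample values are hypothetical placeholders:

    from dataclasses import dataclass
    from statistics import median

    @dataclass
    class Trial:
        """One target acquisition trial (fields are illustrative)."""
        condition: str        # e.g. "baseline" or "enhanced_sa"
        detected: bool        # did the Soldier acquire the target?
        time_s: float         # time to acquisition, seconds
        icon_error_m: float   # icon-to-target registration error, meters

    def summarize(trials: list[Trial]) -> dict[str, dict[str, float]]:
        """Per-condition detection rate, median time to acquire among
        detections, and median icon registration error."""
        out: dict[str, dict[str, float]] = {}
        for cond in {t.condition for t in trials}:
            ts = [t for t in trials if t.condition == cond]
            hits = [t for t in ts if t.detected]
            out[cond] = {
                "p_detect": len(hits) / len(ts),
                "median_time_s": median(t.time_s for t in hits) if hits else float("nan"),
                "median_icon_error_m": median(t.icon_error_m for t in ts),
            }
        return out

    # Toy data: two conditions, a few trials each.
    data = [
        Trial("baseline", True, 9.2, 6.1), Trial("baseline", False, 30.0, 7.4),
        Trial("enhanced_sa", True, 4.1, 1.8), Trial("enhanced_sa", True, 5.3, 2.2),
    ]
    print(summarize(data))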

PHASE II: The offeror will implement one or two tactically correct prototype capabilities demonstrating a virtual vehicle simulation (e.g., an Abrams tank with Tank Commander and Gunner crew positions) using advances in augmented reality, virtual reality, augmented virtuality, through-sight tactical visualization, touch screens, motion tracking, software algorithms and models, and gaming technologies. The offeror will consider long-term requirements as defined by efforts such as the Synthetic Training Environment (STE). Using the experimental design and methodology developed in Phase I, the offeror will conduct a statistically rigorous set of experiments to evaluate situational awareness, accuracy, and target acquisition performance. Experiment difficulty will vary from a novice to an expert level of target acquisition, with appropriate noise and blur applied to the imagery. Metrics will be developed and collected to evaluate Soldier target acquisition performance under varying conditions, with and without enhanced SA.
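The noise-and-blur difficulty ramp could be realized as a simple image degradation stage. The sketch below uses NumPy and SciPy; the sigma values are uncalibrated placeholders, not validated difficulty settings:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Illustrative difficulty levels: larger sigmas make acquisition harder.
    DIFFICULTY = {
        "novice": {"blur_sigma": 0.5, "noise_sigma": 2.0},
        "skilled": {"blur_sigma": 1.5, "noise_sigma": 8.0},
        "expert": {"blur_sigma": 3.0, "noise_sigma": 16.0},
    }

    def degrade(image: np.ndarray, level: str, seed: int = 0) -> np.ndarray:
        """Gaussian-blur then add Gaussian noise to an 8-bit grayscale image."""
        params = DIFFICULTY[level]
        rng = np.random.default_rng(seed)
        out = gaussian_filter(image.astype(np.float64), sigma=params["blur_sigma"])
        out += rng.normal(0.0, params["noise_sigma"], size=out.shape)
        return np.clip(out, 0, 255).astype(np.uint8)

    # Example: degrade a synthetic 256x256 gradient image at each level.
    img = np.tile(np.linspace(0, 255, 256), (256, 1)).astype(np.uint8)
    for level in DIFFICULTY:
        print(level, float(degrade(img, level).std()))

Holding the degradation pipeline fixed and varying only these parameters keeps the difficulty manipulation separable from the with/without-SA manipulation when analyzing the collected metrics.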

DIRECT TO PHASE II (DP2): Offerors interested in submitting a DP2 proposal in response to this topic must provide documentation substantiating that the scientific and technical merit and feasibility described in the Phase I section of this topic have been met, and must describe the potential commercial applications. The offeror's DP2 proposal will not be evaluated without adequate Phase I feasibility documentation. Documentation should include all relevant information, including, but not limited to: technical reports, test data, prototype designs/models, and performance goals/results. Please read the OSD SBIR 16.2 Direct to Phase II Instructions.

PHASE III DUAL USE APPLICATIONS: The offeror will work with available funding sources to transition the capability into practical use within Army/DoD simulation systems, while considering options for dual-use applications in broader domains, including state and local governments and commercial markets.

REFERENCES:

  • U.S. Army, Training and Education Modernization Strategy, 15 December 2014.
  • Live, Virtual, Constructive Integrating Architecture Initial Capabilities Document, 28 July 2004.
  • Aviation Combined Arms Tactical Trainer Increment II Capability Production Document, 02 December 2011.
  • Close Combat Tactical Trainer Reconfigurable Vehicle Tactical Trainer Capabilities Production Document, December 2006.
  • Close Combat Tactical Trainer Capability Production Document, 24 June 2009.
  • A Taxonomy of Mixed Reality Visual Displays, P. Milgram, F. Kishino, IEICE Transactions on Information Systems, Vol. E77-D, No. 12, December 1994.
  • Windows on the World: An Example of Augmented Virtuality, K. Simsarian, K-P. Akesson, 1997.
  • Usability Issues of an Augmented Virtuality Environment for Design, X. Wang, I. Chen, 2010.
  • Supporting Cooperative Work in Virtual Environments, S. Benford, J. Bowers, L. E. Fahlen, J. Mariani, T. Rodden, 1994.
  • Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., & MacIntyre, B. (2001). Recent advances in augmented reality. IEEE Computer Graphics and Applications, 21(6), 34-47. doi:10.1109/38.963459
  • Brown, D., Coyne, J., & Stripling, R. (2006). Augmented Reality for Urban Skills Training. IEEE Virtual Reality Conference (VR 2006), 249-252. doi:10.1109/VR.2006.28
  • Goldiez, B., Livingston, M., Dawson, J., Brown, D., Hancock, P., Baillot, Y., & Julier, S. (2005). Proceedings from the Army Science Conference (24th): Advancing Human Centered Augmented Reality Research. Orlando, FL.
  • Hodges, G. (2014). Identifying the Limits of an Integrated Training Environment Using Human Abilities and Affordance Theory. Naval Postgraduate School, Monterey, CA.
  • Livingston, M., Barrow, J., & Sibley, C. (2009). Quantification of Contrast Sensitivity and Color Perception using Head-worn Augmented Reality Displays. 2009 IEEE Virtual Reality Conference, 115-122. doi:10.1109/VR.2009.4811009

KEYWORDS: virtual reality, augmented virtuality, modeling and simulation, synthetic training environment, interfaces, LVC, combat vehicles, aviation simulation
