Augmented Reality User Interfaces for Tactical Drones

DIRECT TO PHASE II

TECHNOLOGY AREA(S): Electronics, Human Systems

OBJECTIVE: Design and fabricate an Augmented Reality (AR) user interface for tactical air and ground vehicles that requires minimal formal Soldier training, provides embedded Soldier training, and imposes minimal Soldier cognitive burden during semi-autonomous ground and air tactical vehicle operations for acquiring image products, performing area reconnaissance, and performing remote sensing of airborne chemical, biological, radiological, or nuclear toxins.

DESCRIPTION: The DoD has a critical need for breakthrough user interface technologies to plan and monitor the acquisition of mission-critical image products and remote sensing while enabling the Soldier to maintain focus on primary tactical operations. This topic seeks to integrate state-of-the-art augmented reality user interface display content and human-computer interface technologies with existing ground Soldier communications interfaces for training, embedded training, mission command, and semi-autonomous vehicle route planning, operations monitoring, and control. This topic is open to a multiplicity of AR user interface architectures that, first and foremost, demonstrate significant improvements in minimizing Soldier training and operational cognitive burden for monitoring and controlling tactical semi-autonomous vehicles and, second, integrate with existing Nett Warrior interface standards, including the Android operating system, MIL-STD-2525B for mission command graphics, H.264 for video, the Joint Architecture for Unmanned Systems (JAUS) for tele-robotic communications, and Cursor on Target Extensible Markup Language (CoT XML) for robotic waypoint and route control.
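For concreteness, the sketch below shows what a minimal CoT XML waypoint event generated from an AR interface might look like, written in Kotlin for an Android context. The event/point element names and the time/start/stale attributes follow the public CoT schema; the uid scheme, the "b-m-p-w" waypoint type code, and the detail payload are illustrative assumptions, not a Nett Warrior specification.

```kotlin
import java.time.Instant
import java.time.temporal.ChronoUnit

// Sketch only: builds a minimal CoT event carrying a single waypoint.
// Element names and time attributes follow the public CoT schema; the
// uid, the "b-m-p-w" type code, and the <detail> payload are placeholder
// assumptions, not a Nett Warrior specification.
fun cotWaypoint(uid: String, lat: Double, lon: Double, haeMeters: Double): String {
    val now = Instant.now()                      // CoT times are ISO-8601 UTC
    val stale = now.plus(5, ChronoUnit.MINUTES)  // assumed 5-minute validity
    return """
        <event version="2.0" uid="$uid" type="b-m-p-w"
               time="$now" start="$now" stale="$stale" how="m-g">
          <point lat="$lat" lon="$lon" hae="$haeMeters" ce="10.0" le="10.0"/>
          <detail>
            <remarks>Route waypoint generated from the AR interface</remarks>
          </detail>
        </event>
    """.trimIndent()
}
```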

The gaming and computing industries have pushed advances in the fidelity and daylight visibility of AR display hardware, making the use of AR displays by ground Soldiers feasible. However, the lag between AR hardware advances and the tactically relevant AR user interface content and controls that ground Soldiers need remains lengthy. Developing and demonstrating an AR display concept and style guide for semi-autonomous ground and aerial vehicles that leverages current mission command graphics and commercial advances in direct-view AR graphics should yield a Soldier experience with minimal cognitive burden.
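As one small illustration of reusing current mission command graphics, the sketch below maps the affiliation character of a 15-character MIL-STD-2525B symbol identification code (SIDC) to a conventional overlay identity color; full symbol geometry rendering and any program-specific palette are assumptions left outside this sketch's scope.

```kotlin
// Minimal sketch assuming 15-character MIL-STD-2525B symbol ID codes (SIDC),
// in which the second character encodes affiliation. An AR overlay might map
// affiliation to the conventional identity colors shown here.
enum class OverlayColor { CYAN, RED, GREEN, YELLOW }

fun affiliationColor(sidc: String): OverlayColor =
    when (sidc.getOrNull(1)?.uppercaseChar()) {
        'F' -> OverlayColor.CYAN    // friendly
        'H' -> OverlayColor.RED     // hostile
        'N' -> OverlayColor.GREEN   // neutral
        else -> OverlayColor.YELLOW // unknown or unrecognized
    }
```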

Similarly, the gaming industry has pushed advances in the fidelity and user experience of game controls, but ground Soldier tactical equipment has not seen similar advances. Voice commands, head gestures, virtual joysticks, or other emerging user input devices are needed to enable ground Soldiers to operate in a near hands-free posture as much as possible, so that they can return to a tactical, hands-on-weapon posture when needed. Additionally, while tactical aerial vehicle operation has become more routine with the advent of control loops that automatically maintain a desired height above ground, current training time and on-demand training technologies are archaic. This topic therefore also seeks development of the same operational AR user interfaces and controls to provide formal and embedded aerial and ground vehicle operations and mission management training. Proposals should leverage existing mission command satellite imagery and digital terrain elevation data; physical models of vehicle mobility and payload operations; and AR user interfaces and computer input devices to provide a train-as-you-fight training prototype for tactical vehicles.
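To make the height-above-ground control loop concrete, here is a minimal sketch of the kind of altitude-hold controller a training physics model might reproduce; the PID gains, anti-windup limit, and assumed hover throttle are illustrative assumptions, not values from any fielded vehicle.

```kotlin
// Minimal sketch of an altitude-hold loop of the kind the topic alludes to:
// a PID controller trimming throttle to hold a commanded height above ground.
// All gains and limits here are illustrative assumptions.
class AltitudeHold(
    private val kp: Double = 0.8,
    private val ki: Double = 0.1,
    private val kd: Double = 0.4,
) {
    private var integral = 0.0
    private var lastError = 0.0

    /** One control step; returns a throttle command clamped to [0, 1]. */
    fun update(targetAgl: Double, measuredAgl: Double, dt: Double): Double {
        val error = targetAgl - measuredAgl
        integral = (integral + error * dt).coerceIn(-5.0, 5.0)  // anti-windup
        val derivative = (error - lastError) / dt
        lastError = error
        val hoverThrottle = 0.5  // assumed trim point for the training model
        return (hoverThrottle + kp * error + ki * integral + kd * derivative)
            .coerceIn(0.0, 1.0)
    }
}
```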

Proposals should target the design, development, and demonstration of AR user interface components and Soldier input device components. Essential elements of the AR user interface components include low cognitive burden across the three phases of operation: training, planning, and operating the tactical ground and aerial vehicles. The essential elements of the control input are near hands-free operation, low cognitive burden, and high Soldier acceptance for managing tele-robotic operations as well as mission operations.

Critical to the design of the system is minimizing Soldier cognitive burden while maximizing mission performance. In addition, proposals should detail the intended AR user interface components (e.g., symbology, overlay style, notifications, full motion video (FMV), training tools, and available functions), their interface design to robotic systems, computer input devices, mission messaging, and map data that will ultimately yield the lowest cognitive burden, lowest training time, and highest Soldier acceptance for vehicle control and mission image product generation. Offerors should first identify and understand the critical integration challenges that may limit the translation and commercial viability of current AR user controls and AR content, symbols, and overlays.

Technical challenges may include:
• Development of a standard AR style for a diverse user interface spectrum, including tele-robotics, image product collection, remote toxin sensing, and mission status.
• Development of a spectrum of input controls for tactical vehicle control operation using AR displays.
• Development of high-fidelity vehicle performance metrics to ensure the training environment adequately mimics live vehicle operation.
• Establishing optimal trade-offs between head tracking, FMV processing, AR content overlay, and control inputs to minimize the real-time delay between the external physical environment and AR-displayed content (see the latency budget sketch after this list).
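A back-of-the-envelope motion-to-photon budget like the one below illustrates the last trade-off. Every stage figure is an assumption chosen for discussion, not a measured requirement; the point is simply that the stages must sum to a latency low enough that overlays do not appear to swim against the real scene.

```kotlin
// Illustrative motion-to-photon budget; all figures are assumptions. The
// roughly 20 ms target is a commonly cited comfort threshold for AR overlays.
fun main() {
    val budgetMs = linkedMapOf(
        "head-pose tracking" to 2.0,
        "FMV decode (H.264)" to 12.0,
        "overlay composition" to 5.0,
        "display scan-out" to 8.0,
    )
    for ((stage, ms) in budgetMs) println("%-20s %5.1f ms".format(stage, ms))
    val total = budgetMs.values.sum()  // 27 ms: already over budget here
    println("%-20s %5.1f ms (vs. ~20 ms target)".format("total", total))
}
```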

PHASE I: Explore and determine the fundamental feature list, sub-system integration, and cognitive burden limitations in implementing a fully integrated AR user interface for Soldier-deployed ground and aerial tele-robotics and autonomous mobility and payload control, including an embedded AR training mode. Phase I deliverables are a final report and a proof-of-concept demonstration. The final report should identify: the AR user interface features for robotics control and embedded training; the feature list and ergonomic limitations of human input devices for controlling the wearable AR system; the technical challenges, including relevant modular and extensible physics-based control modeling of tactical ground and aerial semi-autonomous vehicle mobility and payload control; and the feature list and limitations of AR-based embedded training for Soldier-deployed ground and aerial tele-robotics and autonomous control. The demonstration deliverable should include a proof-of-concept system that shows the key AR display and user control components in a bench-top prototype for either a tactical ground or aerial vehicle, along with all design documents and complete specifications, and documentation of committed sources and service providers for fabrication of the ultimate integrated AR vehicle and payload control system, as well as the embedded AR training system, to be produced in Phase II. Full specifications and a complete Bill of Materials are required, itemizing each component and system that comprises the final prototype system. This demonstration should be performed at the contractor's facility.

PHASE II: Development, demonstration, and delivery of a working, fully integrated AR user interface for ground and aerial tele-robotics and autonomous mobility, including a training mode. The Phase II demonstration should operate within the existing set of ground Soldier interface standards: Universal Serial Bus (USB) 2.0 for peripheral electronic integration, H.264 for video, JAUS for tele-robotic communications, and CoT XML for autonomous waypoint commands. The Phase II final deliverables shall include:
• Full system design and specifications detailing the AR user interface concept software (executable and source code) to be integrated for achieving the three mission sets of reconnaissance, terrain mapping, and remote sensing.
• Full system design and specifications detailing the electronics and software (executable and source code) for the AR Soldier control device(s) to be integrated.
• Full system design and specifications detailing the embedded training software (executable and source code) and details of the aerial aerodynamic physics models and configuration parameters.
• Full system design and specifications detailing the embedded training software (executable and source code) and details of the ground mobility physics models, gripper physics models, arm physics models, and camera models, along with the associated configuration parameters for each.

DIRECT TO PHASE II (DP2): Offerors interested in submitting a DP2 proposal in response to this topic must provide documentation to substantiate that the scientific and technical merit and feasibility described in the Phase I section of this topic have been met, and must describe the potential commercial applications. The offeror's related DP2 proposal will not be evaluated without adequate Phase I feasibility documentation. Documentation should include all relevant information including, but not limited to: technical reports, test data, prototype designs/models, and performance goals/results. Please read the OSD SBIR 16.2 Direct to Phase II Instructions.

PHASE III DUAL USE APPLICATIONS: Refine and mature the AR user interface software applications for military reconnaissance and, commercially, for real estate, disaster relief, and other reconnaissance operations. Refine the prototype hardware and associated ergonomics of the AR user interface control hardware for use in military, Department of Homeland Security, and disaster relief environments. Refine and mature the AR embedded training software applications for military, Department of Homeland Security, and disaster relief tactical ground and aerial vehicles.

REFERENCES:
• Gagnon, S. A., Brunye, T. T., Gardony, A. L., Noordzij, M. L., Mahoney, C. R., & Taylor, H. A. (2014). Stepping into a map: Initial heading direction influences spatial memory flexibility. Cognitive Science. DOI: 10.1111/cogs.12055.
• McCaney, Kevin. "Army's move to Samsung reflects a flexible mobile strategy." Defense Systems, 24 Feb 2014. https://defensesystems.com/articles/2014/02/24/army-nett-warrior-samsung-galacy-note-ii.aspx

KEYWORDS: augmented reality, human factors engineering, ergonomics, training, prototype

 
