OBJECTIVE: This topic seeks innovative Situational Awareness (SA) displays that optimize the human interface for Navy personnel, ships, and craft. DESCRIPTION: This topic proposes an innovative display that conforms to the operator's space such that the sensors ultimately become an extension of the operator's visual experience. In an SA system, the software, controls, and sensors are currently available and progressing; the limiting factor of the system is the display. The sensors are moving toward replicating and exceeding human vision, while displays have gaps. Human vision, sensors, and displays are defined by resolution and field of view (FOV). The resolution of human vision is estimated at 512 megapixels (MP), which is similar to packing the resolution of approximately 246 high-definition televisions into a single display. The human vision FOV is 180-230 degrees in the horizontal direction and 120-135 degrees in the vertical direction. The EO/IR sensors provide a virtual reality of color and full-motion video that the operator must see as if in the natural world. The SA sensors have an omni-directional FOV in which the operator needs to be able to resolve the details to detect, track, classify, identify, and target threats. The display must accommodate the natural function of the eye such that eyestrain is minimized, just as in the natural world. Threats exist in both the natural and virtual worlds. Spatial awareness of the natural physical surroundings is necessary during mission conflicts while performing operator functions; toggling between virtual reality and reality is essential to both job function and survival (Ref 4). The display configuration must allow the user to persistently scan the entire threat area in a more natural way than existing display technologies permit, while accommodating space limitations and viewer equipment such as night vision goggles and eyeglasses (Ref 1, 2, 3).
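The arithmetic behind the figures quoted above can be sketched as follows; this is a back-of-the-envelope check only, and the 1920x1080 "HDTV" panel is an assumption:

```python
# Rough check of the resolution comparison above (assumes a 1080p "HDTV").
HDTV_PIXELS = 1920 * 1080      # one 1080p panel: 2,073,600 pixels
EYE_MEGAPIXELS = 512e6         # commonly cited estimate for human vision

tvs_needed = EYE_MEGAPIXELS / HDTV_PIXELS
print(f"{tvs_needed:.1f} HDTV panels")  # ~246.9, consistent with the ~246 figure above
```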
Another way that human vision, sensors, and displays are characterized is by Instantaneous Field of View (IFOV) across the entire FOV of the omni-directional scene, which is specified in radians of a sphere instead of degrees. The IFOV can refer to a wide FOV (WFOV), similar to taking in the entire scene, or a narrow FOV used to focus on a particular point or threat. Sensors are moving toward an IFOV of 600 microradians, roadmapped to 100 microradians for WFOV and less than 10 microradians for narrow FOV, all in high-resolution formats. Sensors with these wide FOVs require partner displays. While an EO/IR sensor system may collect information beyond the human vision FOV range, the display must allow an operator to see that information in a natural way, as if turning the head around like an owl or toggling between different scenes much as a person glances over their shoulder to see what is behind them. The orientation of the sensor's virtual world to the physical world must be maintained to avoid confusion about the direction of potential threats. Eliminating the display altogether is not an option, since automatic target recognition algorithms miss targets, necessitating an operator in the loop. Today's plethora of displays is not adequate for current and future sensor operations, especially in resolution and FOV, and thus becomes the limiting factor in a system (Ref 2). Eliminating color triples the resolution in displays such as the Coronis Fusion 10 MP (Ref 5); however, SA displays require both color and even higher resolutions. Current industry solutions of increasing display size and/or tiling displays to support WFOV, high-resolution SA imagery (Ref 3) do not fit into operator workspaces. Tiling micro-displays in head-mounted displays (HMDs) and waveguide visor technologies such as the BAE Systems LiteHud (Ref 6) creates weight issues.
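The relationship between IFOV and display pixel count can be illustrated with a short sketch. A display matching a sensor pixel-for-pixel needs roughly one pixel per IFOV across the FOV; the IFOV values are from the text above, and the 230-degree figure is the upper horizontal bound of human vision quoted earlier:

```python
import math

def pixels_across(fov_deg: float, ifov_urad: float) -> int:
    """Pixels needed so each display pixel subtends one sensor IFOV across the FOV."""
    fov_rad = math.radians(fov_deg)
    return math.ceil(fov_rad / (ifov_urad * 1e-6))

print(pixels_across(230, 600))  # current-generation WFOV: ~6,700 pixels horizontally
print(pixels_across(230, 100))  # roadmapped WFOV: ~40,000 pixels horizontally
```

The jump from roughly 6,700 to 40,000 horizontal pixels as sensors reach the 100-microradian roadmap illustrates why the display, not the sensor, becomes the limiting factor.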
Flat panel displays (FPDs) and flexible body-worn displays, such as Raytheon's Aviation Warrior (Ref 6), produce fatigue from keeping the body position stable and have line-of-sight gaps that limit persistent SA. See-through visor displays, FPDs, and HMDs are designed for augmented reality (symbology over a natural background); they do not accommodate SA virtual reality over a natural background, as the combined colors and shapes cancel the ability to distinguish threats. These limitations affect warfighter SA by presenting only a subset of sensor information (Ref 4). The envisioned system will fill the gaps in current practice while preserving capabilities that are already solved: minimal display-contributed latency in a sensor system, sunlight readability, full-motion video without blur, readability with night vision equipment, inherent ruggedness for military environments (or the ability to be ruggedized), compatibility with multiple commercial-standard and military controllers, color, and programmability to scan, glance, and focus on objects. The envisioned display system will support software configurations for zoom windows covering target areas of interest from the WFOV sensor and one or more separate Narrow Field of View (NFOV) EO/IR sensor systems. It will also support menus, standardized symbology, automatic track boxes, target cues from the combat system or other sensors, and orientation information providing the user with SA, such as sensor line-of-sight/FOV relative to the platform heading. This project will explore the optimal choice, or integrated combination, of conventional panel-mounted displays and head/helmet-mounted displays (HMDs) (Ref 1), or other proposed solutions. We will evaluate the appropriateness of HMD usage for unity magnification, head- or eye-tracked line-of-sight (LOS) imagery presentation, and as a heads-up pilotage and fire control targeting visual aid.
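A minimal sketch of the software-configurable zoom window described above: a region of interest in the WFOV scene, filled either by digital zoom or by a cued NFOV sensor. All field names and the source-selection convention are illustrative assumptions, not part of the topic's requirements:

```python
from dataclasses import dataclass

@dataclass
class ZoomWindow:
    """One configurable zoom inset over the WFOV scene (names are assumptions)."""
    center_az_deg: float   # target azimuth within the WFOV scene
    center_el_deg: float   # target elevation
    width_deg: float       # angular extent shown in the window
    zoom: float            # magnification relative to the WFOV view
    source: str            # "WFOV" digital zoom, or a cued "NFOV" sensor feed

# Example: an 8x inset on a cued NFOV sensor at azimuth 45 degrees.
window = ZoomWindow(center_az_deg=45.0, center_el_deg=2.0,
                    width_deg=5.0, zoom=8.0, source="NFOV")
print(window.source)  # NFOV
```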
We will also examine optimal methods of addressing spatial orientation mismatch between the operator and the imagery line-of-sight (e.g., rear-view imagery). In addition, we will assess the exploitation of innovative sensory presentations, cues, and alerts, such as pseudo-color overlays, sound, and haptic feedback, to bring out key features and minimize display clutter. The display's ability to fit into confined spaces while delivering the required resolution and field of view (Ref 3), in a tradespace with power and weight, will also be assessed. A size, weight, and power combination that meets consumer Energy Star standards will yield a solution in a lighter-weight form factor than currently fielded solutions. The display will be assessed in Phase I for the ability to build a prototype in Phase II; Phase III is a transition of the technology to naval use, and therefore manufacturing issues must be evaluated and minimized. PHASE I: The company will develop a concept for an EO/IR SA Display System that meets the requirements described above. The company will demonstrate the feasibility of the concept in meeting Navy needs and will establish that the concept can be successfully developed into a useful product. Feasibility will be established through analytical modeling, simulation of the Graphical User Interface (GUI) using simulated or real imagery, and/or prototype evaluation and demonstration using representative material samples. The small business will provide a Phase II development plan that addresses technical risk reduction and provides performance goals and key technical milestones. PHASE II: Based on the results of Phase I and the Phase II development plan, the small business will develop a functional prototype for evaluation, demonstrating the full performance level. The prototype will be evaluated to determine its capability in meeting the performance goals defined in the Phase II development plan and the Navy requirements for the EO/IR SA Display System.
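The orientation bookkeeping behind the spatial-mismatch problem above can be sketched simply: present the sensor line-of-sight relative to the platform heading so rear-view imagery is labeled unambiguously. The function name and the 0-360 bearing convention are assumptions for illustration:

```python
def relative_bearing(sensor_azimuth_deg: float, platform_heading_deg: float) -> float:
    """Sensor LOS relative to the bow, normalized to [0, 360)."""
    return (sensor_azimuth_deg - platform_heading_deg) % 360.0

# A sensor staring at true azimuth 10 deg while the ship heads 190 deg is
# looking 180 deg relative -- directly astern, the "over the shoulder" case.
print(relative_bearing(10, 190))  # 180.0
```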
System performance will be demonstrated through prototype evaluation and modeling or analytical methods over the required range of parameters. The EO/IR SA Display System will be demonstrated and evaluated using an interfacing panoramic WFOV EO/IR imaging system and an accompanying high-definition NFOV EO/IR imaging system that work in tandem to supply the WFOV and NFOV zoom imagery. Evaluation results will be used to refine the prototype into an initial design that will meet Navy requirements. The company will prepare a Phase III development plan to transition the technology to a Navy program. PHASE III: The company will be expected to support the Navy in transitioning the technology to Navy use. The company will refine the EO/IR SA Display System according to the lessons learned from Phase II development, for evaluation to determine its effectiveness in an operational environment. The company will support the Navy in test and validation to certify and qualify the system for Navy use. PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: Commercial products can benefit from this research by applying the resulting concepts to develop higher-performance, less expensive, higher-resolution, and wider-FOV displays. EO/IR systems are used extensively in the commercial sector; all of them require displays, and all are moving toward human-vision-class sensor inputs, making them dual-use applications. Multiple commercial, industry, and military applications benefit from the implementation of this SBIR topic. Video game players' needs are similar in terms of resolution, FOV, and the need to toggle quickly between physical and virtual reality (Ref 4). Medical applications would gain from the resolution, physical-to-virtual toggles, and confined-space fit (such as a surgeon in an operating room).
Other industry and commercial applications include facilities security, CAD (Computer Aided Drafting) development, training, simulators, virtual tours, law enforcement, border patrol, the auto industry, and the film industry.
REFERENCES:
1. Melzer, James E. "HMDs as enablers of situation awareness: the OODA loop and sense-making." SPIE 8383, 1 May 2012.
2. Fulton, Jack E. "Display technology gaps used with electro-optic sensors." SPIE 8042, 13 May 2011.
3. Petty, Gregory J.; Seals, Ean J.; Fulton, Jack E. "Display challenges resulting from the use of wide field of view imaging devices." SPIE 8383, 1 May 2012.
4. Nicholson, Gail M. "Alternatives to flat panel displays in vehicle turrets." SPIE 8042, 13 May 2011.
5. Warwick, Graham. "BAE Offers Slim Fit HUD for Advanced Cockpits." Military.com, Aviation Week's DTI, 23 July 2012. Accessed 16 November 2012.
6. Tam, Donna. "Pilot of the Future: U.S. Army gets wearable tech for the battlefield." CNET News, Military Tech, 23 July 2012. Accessed 16 November 2012.