
Sensory System to Transition Pilots From Aided to Unaided Vision During Flight to Mitigate Spatial Discordance



TECHNOLOGY AREA(S): Air Platform, Human Systems


The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop a system to seamlessly transition pilots from aided to unaided vision while performing night operations.

DESCRIPTION: When pilots transition from aided to unaided vision during flight, the number of visual cues that can be used as reference for aircraft attitude is greatly reduced. If this occurs during night operations with very low ambient light, spatial discordance can occur. Rapid transition from aided to unaided vision reduces the number of peripheral visual cues from many to few, which can lead to spatial disorientation and unsafe flight. Dark adaptation, the ability to perceive low-level light, can take as long as half an hour to achieve [1, 2]. Other cues that indicate the attitude of the aircraft must therefore be provided to mitigate the effects of night-vision aids on the visual system, in which a light-adapted eye must quickly transition to extremely dark conditions.

A lack of sufficient peripheral visual orientation cues may lead to a number of spatial discordance issues (e.g., black-hole effect) [4]. Peripheral visual cues are reduced during a dark night or white-out (atmospheric or blowing snow) conditions. In either case, it is the lack of peripheral visual cues that leads to disorientation. Another situation in which pilots require peripheral visual cues is when approaching and closing in on another aircraft (e.g., in-flight refueling). Pilots use peripheral cues to estimate their position relative to the Earth and to the aircraft they are approaching [4]. Without this peripheral information, as occurs in extremely dark conditions, closing in on another aircraft becomes significantly more challenging and potentially dangerous. Currently, pilots rely on the plane's attitude indicator, a visual representation of the plane's position relative to the horizon, when experiencing spatial discordance. This visual cue provides information to the foveal visual field and does not take advantage of the benefits of cuing peripheral sensory receptors. Although this information is quite salient in the foveal visual field, pilots report dismissing it because the vestibular cues they experience provide more compelling evidence of their (incorrect) spatial orientation.

As previously mentioned, peripheral visual cues are a major contributor to maintaining straight and level flight and avoiding spatial discordance. More recent research, however, has demonstrated that spatial information can be improved with multimodal (i.e., visual, auditory, tactile) stimulus presentation [1, 3]. With the appropriate combination of more than one stimulus modality, humans can orient themselves more quickly and accurately than with the activation of one sensory modality alone [1, 3]. Rupert (2000) demonstrated that vibrotactile arrays can provide enough situational awareness for helicopter pilots to perform some maneuvers while blindfolded [5].

Technology is needed that provides a pilot transitioning from aided to unaided vision with additional stimuli to maintain straight, level, and safe flight. This technology can use any stimulus modality or a multimodal approach. It should be activatable at the pilot's discretion and be suitable for different platforms with different requirements and constraints. At a minimum, however, this technology should be applicable to Navy 5th generation fighter aircraft. Since the only 5th generation fighter in the current inventory is the F-35 Lightning II, this technology should be compatible with the current cockpit design and successfully integrate with the baseline pilot-vehicle interface (PVI).

No additional weight may be added to the helmet; solutions that involve mounting devices on the helmet are therefore not permitted. If power is required, it must be limited to the accessory power generated by the aircraft. If possible, the technology should extend to previous generation fighters and other aircraft (e.g., helicopters); relevant aircraft cockpit specifics will be provided, as needed, during the development of this technology. Although the vibrotactile approach shows some promising research avenues, Fourier transform analyses suggest a wide range of resonant frequencies within the cockpit that can interfere with the frequencies at which vibrotactile arrays provide situational awareness. If a vibrotactile solution is proposed, it is necessary to convey (1) the distinctiveness of the approach; (2) the durability of the system (e.g., sturdiness after cleaning, lifetime strength); and (3) mitigation of resonant frequency issues in the cockpit.
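One way a proposer might reason about the resonance concern is to screen candidate tactor drive frequencies against a measured cockpit vibration spectrum. The Python sketch below is purely illustrative: the synthetic "cockpit" signal, the peak-ratio threshold, the 15 Hz safety margin, and the candidate frequencies are all invented for the example, not taken from any aircraft data.

```python
import numpy as np

def dominant_bands(signal, sample_rate, threshold_ratio=0.25):
    """Return frequencies whose FFT magnitude exceeds a fraction of the peak."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[spectrum >= threshold_ratio * spectrum.max()]

def clear_of_resonance(candidate_hz, resonant_hz, margin_hz=15.0):
    """True if a candidate tactor frequency sits outside every resonant band."""
    return all(abs(candidate_hz - r) > margin_hz for r in resonant_hz)

# Synthetic stand-in for a cockpit vibration recording:
# strong components at 60 Hz and 180 Hz.
rate = 2000  # samples per second
t = np.arange(0, 1.0, 1.0 / rate)
cockpit = np.sin(2 * np.pi * 60 * t) + 0.8 * np.sin(2 * np.pi * 180 * t)

resonant = dominant_bands(cockpit, rate)
print(clear_of_resonance(250.0, resonant))  # well clear of 60/180 Hz: True
print(clear_of_resonance(65.0, resonant))   # within 15 Hz of 60 Hz: False
```

In practice the spectrum would come from flight-measured accelerometer data for each platform, and the margin would be set from tactor psychophysics rather than a fixed constant.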

Collaboration with original equipment manufacturers (OEMs) in all phases is highly encouraged to assist in defining aircraft integration and commercialization requirements and in providing test platforms.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a Secret-level facility clearance and Personnel Security Clearances in order to perform on advanced phases of this project, as set forth by DSS and NAVAIR, and to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Develop and prove feasibility of an approach that demonstrates the ability of a pilot to orient themselves more quickly and accurately than current technology allows. Provide documentation that demonstrates the suitability of the design for representative platforms and mission environments; platform and mission environment data will be provided by the government upon award. A proof-of-concept demonstration should be performed, along with a Technology Readiness Level (TRL)/Manufacturing Readiness Level (MRL) assessment.

PHASE II: Develop the system into a prototype, perform further testing in a relevant environment, and demonstrate performance in a simulated or actual flight environment. Tests during this phase should demonstrate the superiority of the new system compared to the standard avionics used during spatial discordance. Feasibility of aircraft/fighter integration should also be demonstrated. TRL/MRL assessment should be updated.

PHASE III DUAL USE APPLICATIONS: Perform final testing of the system in an actual flight environment to prepare for integration into both naval and commercial platforms. Aid the Navy in transition and integration of the system into the Fleet and all appropriate testing-and-evaluation programs. Private Sector Commercial Potential: This system would be useful in private-sector civilian aviation, as spatial discordance has been found to be a large contributor to civilian mishaps as well.


REFERENCES:

1. Calvert, G. A., Spence, C., & Stein, B. E. (2004). The Handbook of Multisensory Processes. MIT Press.
2. Bear, M. F., Connors, B. W., & Paradiso, M. A. (2006). Neuroscience: Exploring the Brain (3rd ed.). Lippincott, Williams, & Wilkins.
3. Bertelson, P., & Radeau, M. (1981). Cross-modal bias and perceptual fusion with auditory-visual spatial discordance. Perception & Psychophysics, 29(6), 578-584.
4. Gillingham, K. K., & Previc, F. H. (1993). Spatial orientation in flight (Report No. AL-TR-1993-0022). Armstrong Laboratory, Brooks AFB, TX.
5. Rupert, A. H. (2000). Tactile situation awareness system: Proprioceptive prostheses for sensory deficiencies. Aviation, Space, and Environmental Medicine, 71(9 Suppl), A92-A99.

KEYWORDS: Spatial orientation; spatial discordance; peripheral cues; vision; multisensory; sensory system
