
Situation Awareness Information Fusion: ANDRO's InFuSA System - A 4D Modular…

Award Information

Agency:
Department of Defense
Branch:
Air Force
Award ID:
52346
Program Year/Program:
2001 / SBIR
Agency Tracking Number:
011IF-1334
Solicitation Year:
N/A
Solicitation Topic Code:
N/A
Solicitation Number:
N/A
Small Business Information
ANDRO Computational Solutions, LLC
Beeches Professional Campus 7980 Turin Road, Bldg. 1 Rome, NY 13440-1934
Woman-Owned: No
Minority-Owned: No
HUBZone-Owned: No
 
Phase 1
Fiscal Year: 2001
Title: Situation Awareness Information Fusion: ANDRO's InFuSA System - A 4D Modular Virtual Crew Station Applying Visual/Auditory Display Surround For Rapid
Agency / Branch: DOD / USAF
Contract: F30602-01-C-0109
Award Amount: $99,174.00
 

Abstract:

New fighter/bomber air defense crews and airborne C4I surveillance operators encounter ever-increasing amounts and types of real-time information from on- and off-board sensors and other information sources. The problem is complicated by the increasing number of off-board military platforms providing situation awareness (SA) information on hundreds of different targets and tracks, and by the assessment of merged/processed data from multiple sources. For instance, SA of the complex battlespace for ground-moving vehicle time-critical targets, including detecting tanks under trees, involves the assessment of large amounts of real-time, near-real-time, and non-real-time data. These growing demands place undue information processing requirements on aircrews and operators that can lead to spatial disorientation, loss of situational awareness, cognitive overload, and delayed reactions during emergency or transient conditions. This forces personnel to function as human data integrators rather than as decision makers. These potentially dangerous conditions underscore the need to supply operators or aircrews with manageable amounts of high-quality information. One method for accomplishing this goal is to address the requirement of providing a single SA picture to the crew. This visualization will be customizable to each individual's needs, providing different levels of detail, but with a single data set. Customization can be achieved through data mining, contact-layer drill-down, and the ability to aggregate or dis-aggregate information. The single data set ensures synchronization and a reliable measurement of uncertainty of time.
This requires the development of a new capability to effectively fuse information and present an integrated SA picture that exploits modern visualization/display and multi-modal user control technologies to facilitate ease of interaction with on-board information systems and support rapid decision making. Current human-computer interfaces (HCIs) are often the bottleneck in effectively and efficiently utilizing the available information flow for decision making. This is increasingly true with greater demands for processing speed, throughput, and data storage. Recent advances in sensor fusion and display technology can be synergistically exploited to develop novel HCIs and control systems to reduce confusion on the part of aircrews of the future. Information-rich environments require a user-centered approach that adapts to heterogeneous, changing connections of information sources and devices with multiple modalities such as gesture/speech recognition, eye gaze, lip reading, and even biofeedback mechanisms attached directly to crewmembers. These multi-modal environments must also be interoperable. The integration of multi-modal inputs for human-machine interaction is approached from the viewpoint of multiple-information-source fusion, where different information sources can be related to different interface modalities to establish an interoperable, multi-disciplinary SA fusion system. This system would provide aircrews and operators from multiple disciplines (tactical C2, strategic surveillance, EW, etc.), those interfacing with on-board bomber/fighter platform (B-2, Joint Strike Fighter, etc.) and off-board surveillance system (AWACS, Joint STARS) sensors, and those working with supporting DoD acquisition centers (B-2 and JSF SPOs) with a common set of augmented reality display formats and user controls.
The new displays and multi-modal control schemes will be readily usable by a sensor-to-shooter cockpit warfighter, operator, or battlespace commander across various computer display platforms with little or no training. Traditional analog instruments and gauges would be replaced with large, flat-panel multi-function displays that combine the functions of separate instruments into a multifunction workstation. A new digital cockpit would provide pilots and operators with equipment that is more intuitive, easier to use in real time, and less expensive than today's avionics equipment. Much of this new technology would take maximum advantage of existing COTS technologies to lower development costs and increase commercialization potential. The Phase I program objective is to develop and demonstrate interoperable SA using multiple HCI modalities capable of fusing information into a consistent operational picture. This will result in the development of a generalized architecture with appropriate models and techniques for SA information fusion, called Information Fusion for Situation Awareness (InFuSA). This initiative will develop and demonstrate a real-time data fusion architecture with appropriate models and techniques, merging data from multiple sources such as Link-16, intelligence data links, and on-board sensors to form a single common SA picture. It will involve combining knowledge of the application domain with the ability to express application-specific actions, objectives, and qualifiers to improve the ability to conceptualize and visualize information, enhance collaborative decision making, and continuously monitor and update present and future battlespace awareness states. The focus is on the development of a data fusion architecture within the context of an improved operator-machine interface environment. The proposed InFuSA system, which will adapt the Air Force's

Principal Investigator:

Andrew Drozd
Owner/Chief Scientist
(315) 334-1163
andro1@aol.com

Business Contact:

Andrew Drozd
Owner
(315) 334-1163
androcs@borg.com
Small Business Information at Submission:

ANDRO CONSULTING SERVICES
Beeches Technical Campus, Bldg. 3, Ste. 4, Rt. 26N Rome, NY 13440

EIN/Tax ID: 161504394
DUNS: N/A
Number of Employees:
Woman-Owned: No
Minority-Owned: No
HUBZone-Owned: No