Maintaining Human-Machine Shared Awareness in Distributed Operations with Degraded Communications

Description:

TECHNOLOGY AREA(S): HUMAN SYSTEMS

OBJECTIVE: Develop and evaluate controls, displays, and/or decision aids that help maintain human-machine shared situation awareness during distributed operations conducted with manned and unmanned vehicles under possibly contested and degraded conditions.

DESCRIPTION: The importance of autonomy for realizing Air Force employment of multiple manned and unmanned teamed sensor platforms in future warfighting is well recognized. These new mixed-initiative interactive systems must enable human-machine collaboration and combat teaming that pairs a human's pattern recognition and judgment capabilities with recent machine advances in artificial intelligence and autonomy to facilitate synchronized tactical operations using heterogeneous manned and unmanned systems. Agility in tactical decision-making, mission management, and control is also a requirement given anticipated complex, ambiguous, and time-critical warfare conditions. For example, shifts from a centralized to a decentralized control structure are plausible, especially when communication links between the human and machine team members degrade, given that unmanned vehicles will have onboard computational resources to serve flexibly as autonomous, capable teammates that can complete needed tasks. The envisioned distributed and networked operations will complicate human-machine coordination, especially when communications are intermittent, degraded, and/or delayed, in addition to the challenges of achieving multi-domain situational awareness/command and control.

Current control station interface designs do not support the information sharing and coordination needed to resynchronize human-machine awareness whenever communications are restored. Improvements are necessary to realize effective human-machine teaming performance in mission operations, especially when alternating between centralized and decentralized control modes. Controls, displays, and decision support services are needed for the human operator to efficiently retrieve integrated contextual data that enables rapid restoration and maintenance of a shared understanding of relevant information, supporting human-machine joint problem-solving, effective decision making, and, ultimately, task/workload balancing.
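The shift between centralized and decentralized control described above can be illustrated with a minimal sketch. All names and thresholds below (ControlMode, ModeSelector, the link-quality values) are illustrative assumptions, not specifications from this topic; the hysteresis band is one possible way to keep intermittent communications from causing rapid mode thrashing.

```python
from enum import Enum

# Illustrative sketch only: class names and thresholds are assumptions,
# not part of the topic description.

class ControlMode(Enum):
    CENTRALIZED = 1    # operator directs vehicles over the link
    DECENTRALIZED = 2  # vehicles rely on onboard autonomy

class ModeSelector:
    """Selects the control structure from link quality, with hysteresis
    so intermittent communications do not cause mode thrashing."""
    def __init__(self, drop_below=0.3, recover_above=0.6):
        self.drop_below = drop_below      # fall back when link quality drops below this
        self.recover_above = recover_above  # return to centralized only above this
        self.mode = ControlMode.CENTRALIZED

    def update(self, link_quality: float) -> ControlMode:
        if self.mode is ControlMode.CENTRALIZED and link_quality < self.drop_below:
            self.mode = ControlMode.DECENTRALIZED
        elif self.mode is ControlMode.DECENTRALIZED and link_quality > self.recover_above:
            self.mode = ControlMode.CENTRALIZED
        return self.mode

sel = ModeSelector()
print(sel.update(0.9).name)  # CENTRALIZED
print(sel.update(0.2).name)  # DECENTRALIZED
print(sel.update(0.5).name)  # DECENTRALIZED (still inside the hysteresis band)
print(sel.update(0.7).name)  # CENTRALIZED
```

The asymmetric thresholds mean a brief recovery of the link does not immediately hand control back to the operator, which matters precisely in the intermittent conditions the topic anticipates.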
This will require agent-assisted methods to identify critical mission events and associated information gaps, as well as intuitive interfaces by which the human and machine can rapidly regain shared situation awareness and dynamically coordinate any adjustments needed in the vehicles' operations with respect to temporal, spatial, and mission-relevant demands. Supplementing the control station with interfaces and services to restore and maintain situation awareness will result in more resilient operations. In sum, the controls and displays in the operator's control station need to support human-machine shared awareness during distributed, disaggregated operations under a variety of communication conditions.

Completion of this effort will involve identifying control and display requirements to support human-machine teamwork (i.e., cooperative tasking) for agile, efficient mission execution. This should include an analysis of requirements for a variety of communication conditions, as the design approach is likely situation dependent. Relevant questions include: What techniques can be employed to help keep the operator in the loop during communication loss? How best should the interfaces identify information gaps, present the machines' actions during lost communications, and cue evolving collaboration/cooperation opportunities? How should communications be prioritized for re-establishing common ground after communications resume? What interaction modes/strategies are useful for supporting subsequent human-machine joint decision-making and task planning/execution? What mechanisms are best to specify alternatives, perhaps proactively, for different communication states/mission events? This effort addresses the design and evaluation of interfaces that support human-machine shared situation awareness.
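One of the questions above, how communications should be prioritized when re-establishing common ground after an outage, can be sketched as a priority queue of events logged during the loss. Everything here (Event, ResyncQueue, the priority values, the per-cycle cap) is a hypothetical illustration, not a design prescribed by this topic.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical sketch: names and priority scheme are assumptions,
# not part of the topic description.

@dataclass(order=True)
class Event:
    priority: int                          # lower value = more mission-critical
    timestamp: float = field(compare=False)
    description: str = field(compare=False)

class ResyncQueue:
    """Buffers events that occurred during a communications outage and
    releases them most-critical-first once the link is restored."""
    def __init__(self):
        self._heap = []

    def log(self, event: Event):
        heapq.heappush(self._heap, event)

    def drain(self, limit=None):
        """Yield events in priority order, optionally capped per cycle
        to avoid flooding the operator immediately on reconnect."""
        count = 0
        while self._heap and (limit is None or count < limit):
            yield heapq.heappop(self._heap)
            count += 1

q = ResyncQueue()
q.log(Event(2, 100.0, "UAV-2 rerouted around weather"))
q.log(Event(0, 140.0, "New threat detected near waypoint 7"))
q.log(Event(1, 120.0, "Sensor task 3 completed"))
for e in q.drain(limit=2):  # surface only the two most critical first
    print(e.priority, e.description)
```

The per-cycle cap reflects the usability concern in the description: on reconnect, the operator needs the most decision-relevant gaps filled first, not a chronological replay of everything the vehicles did.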
Beyond the requirement that (simulated or real) human and autonomy team members complete multiple tasks under varying communication conditions, the proposer may choose the systems/tasks/mission(s) to utilize, as long as the effort considers at least two air vehicles (manned and unmanned). (Any simulated or representative system employed should maintain data at an unclassified level. Proposers should not require government equipment or facilities.)

PHASE I: Design/evaluate displays, controls, and/or decision aids to support operator-machine teaming that maintains shared awareness of manned and unmanned air vehicle operations with limited communications. Generate a final report describing the solution(s), evaluation results, and an experimental plan to establish usability improvements in Phase II. A feasibility demonstration is desirable, but not required.

PHASE II: Perform iterative test/refine cycles on the Phase I design, culminating in a proof-of-concept interface/decision support system. Using high-fidelity simulations, evaluate the prototype's effectiveness in maintaining human-machine shared awareness during distributed, disaggregated operations under a variety of communication conditions. Required Phase II deliverables include a final report and the software/hardware required to demonstrate the interface concept as a stand-alone capability and/or suitable for execution in a USAF simulation that is mutually agreeable to the contractor and AFRL.

PHASE III: Applications include planning and executing any military, commercial, or civil (e.g., law enforcement) operation using highly autonomous unmanned vehicles in decentralized operations with limited communications. Some interfaces and methodologies will also be applicable to other human-machine teaming applications.

REFERENCES: 

1. United States Air Force (2015). Air Force Future Operating Concept: A View of the Air Force in 2035. Available at: http://www.af.mil/Portals/1/images/airpower/AFFOC.pdf

2. United States Air Force (2015). Autonomous Horizons: System Autonomy in the Air Force – A Path to the Future, Volume 1: Human-Autonomy Teaming. USAF Office of the Chief Scientist, AF/ST-TR-15-01.

3. Patzek, M., Rothwell, C., Bearden, G., Ausdenmoore, B., & Rowe, A. (2013). Supervisory control state diagrams to depict autonomous activity. In Proceedings of the 2013 International Symposium on Aviation Psychology, Dayton, OH.

4. Draper, M., Calhoun, G., Hansen, M., Douglass, S., Spriggs, S., Patzek, M., Rowe, A., Evans, D., Ruff, H., Behymer, K., Howard, M., Bearden, G., & Frost, E. (2017). Intelligent multi-unmanned vehicle planner with adaptive collaborative control technologies (IMPACT). In Proceedings of the International Symposium on Aviation Psychology.

KEYWORDS: unmanned vehicle, human-machine interface, situation awareness, decision support, intelligent agent, communication, distributed operations, human-machine teaming

CONTACT(S): Lt Tyler Goodman, 711 HPW/RHCI, (937) 713-7150, tyler.goodman.3@us.af.mil
