SONIC: Sensor Operations via Naturalistic Interactive Control
Phone: (781) 496-2430 / (781) 496-2443
As analysts and operators move from data to insights, they need tools for supervisory control, command and control, and intelligence analysis. Intelligence, Surveillance, and Reconnaissance (ISR) demands the ability to navigate and interpret vast amounts of data to produce actionable decisions. Through the Urban Telepresence program, the Air Force Research Laboratory (AFRL) is redefining the concept of operations for ISR by enabling remote, virtual operators to interact with operational environments without being physically present. Redesigning this workflow, however, requires advances in human-machine interfaces.

To support this need, the Aptima team is developing the Sensor Operations via Naturalistic Interactive Control (SONIC) platform. SONIC is a multimodal user interaction framework optimized for highly immersive, data-rich environments; it provides an intuitive, naturalistic way for users to interact and collaborate with distributed sensors, unmanned systems, and teammates in the operational environment. SONIC integrates an immersive multimodal workstation with a context-driven interaction service and is built on scientifically grounded human-machine interface guidelines for hybrid reality environments.

Ultimately, the objective of SONIC is to enable analysts and operators to provide real-time mission support from remote locations more effectively, without increased workload or decreased performance.
* Information listed above is at the time of submission. *