Advanced Intuitive Interfaces

Description:

OUSD (R&E) MODERNIZATION PRIORITY: Autonomy

TECHNOLOGY AREA(S): Human Systems

OBJECTIVE: Develop 'plug and play' proprioceptive and vestibular interface technologies to improve the fidelity of immersive, rapidly reconfigurable simulation environments.

DESCRIPTION: Advances in increasing the fidelity of simulated environments and in generating novel control interfaces have mainly been in the visual and audio domains, to the exclusion of other sensory systems. This creates a number of challenges:

• It limits the potential for low-cost training systems to create a high-fidelity immersive experience; this may be one reason training with virtual reality (VR) and augmented reality (AR) has not shown consistent advantages over non-immersive training methods (Kaplan et al., 2021).
• A mismatch between the vestibular and visual systems creates motion sickness (Geyer & Biggs, 2018).
• The visual and audio systems remain bottlenecks for information transfer between the system and the human operator.

There are emerging methods for proprioceptive, tactile, and vestibular 'displays'; however, these mostly exist only in research prototypes or in low-fidelity gaming devices. The goal of this SBIR is to mature early, proof-of-principle prototypes of high-fidelity proprioceptive, tactile, or vestibular displays for low-cost, easily configurable simulation environments such as Unity, Robot Operating System (ROS), or Unreal. This SBIR seeks to develop and test new devices that enhance machine interfaces in the following categories:

1. Augmenting the sensory experience of low-cost simulation environments. Potential technologies of interest include, but are not limited to, galvanic vestibular stimulation to induce the sensation of movement, and other conductance-based displays;
2. Alternative methods for information transfer outside the visual or auditory domains. These may include tactile touchscreens (using static electricity or other means) that provide feedback on a system's status, such as whether a setting has been selected, or other methods that help enhance overall situational awareness without further burdening the visual or auditory systems;
3. Other technologies that remove information-transfer bottlenecks or improve the fidelity of low-cost simulation environments.

In all cases, the purpose of the SBIR is to turn a pre-existing nascent prototype technology into a high-fidelity commercial plug-and-play capability (e.g., USB, Bluetooth, etc.) so that it can be rapidly integrated into, and tested in, common VR, AR, and other immersive simulation environments. This means the resulting capability must include:

• Specifications for integration with at least one widely available simulation engine and environment;
• Schemas that translate simulated physical properties (e.g., yaw, pitch, and roll data) into a high-fidelity sensory display;
• Calibration methods for adapting the device's intensity to an individual's needs and vestibular, tactile, and/or proprioceptive sensitivity levels;
• Testing protocols that compare the display against either simulation without the display or existing low-fidelity prototype systems, to demonstrate improvement according to the program performance metrics detailed in Table 1. Proposals should include a power analysis for these studies to demonstrate that the testing plan is sufficiently powered to detect the expected magnitude of effect.

Table 1. AAI Expected Performance Metrics

1. USABILITY: The end-of-Phase II product should take no more than 30 minutes for an untrained user to first install and calibrate with a system, and no more than 5 minutes to re-calibrate for each subsequent use once installed. Proposal milestones should specify quantitative progress towards this threshold, including expected performance against this metric at the end of Phase I.

2. FIDELITY & CAPACITY FOR INFORMATION TRANSFER: The display's fidelity and reconfigurability (the rate and resolution at which the display changes) should approach the limits of human sensory discrimination ability (e.g., two-point discrimination for tactile displays). Proposers should specify and quantitatively justify this threshold, with references, depending on the display type, the sensory system engaged, and the proposed use.

3. CONTRIBUTION TO PERFORMANCE: Proposers should demonstrate measurable performance improvements within an environment or use case of the proposer's choice. For instance, for enhanced training, the proposer should demonstrate that use of the technology increases the speed of learning and/or the speed of transfer of learning from simulation to the real-world environment. Proposers should specify use case(s) and proposed performance comparisons for the use case(s) to demonstrate efficacy.

PHASE I: Phase I efforts should mature a prototype sensory display technology to include commercial-quality features and demonstrate performance towards final program metrics. By the end of Phase I, teams will demonstrate the capability within a VR, AR, or other immersive simulation environment and should, at a minimum, conduct pilot testing in human subjects. Teams should be prepared for a demonstration at the end of Phase I in a simulation test environment.

Schedule/Milestones/Deliverables: Phase I fixed payable milestones for this program should include:

• Month 1: Initial device design review, including a detailed description of the point of departure, system requirements, draft specifications, a software development plan (including expected progress against metrics at each 6-month interval), and system integration considerations. Initial IRB protocol submitted to the performer's local Institutional Review Board (IRB).
• Month 3: Status report on progress against objectives, including status against the prototype maturation plan. Proposed evaluation use case, test scenario, and performance metrics for the Phase II performance assessment.
• Month 5: Status report on progress against objectives, including status against the prototype maturation plan. Updated system design, system specifications, and software description/documentation as appropriate. Progress against Metrics 1 and 2.
• Month 8: Status report on progress against objectives, including status against the prototype maturation plan. Updated system design, system specifications, and software description/documentation as appropriate. Workflow description for integrating the display with simulation environments and scenarios.
• Month 10: Final Phase I Report summarizing prototype design and construction, evolved system characteristics, performance against evaluation metrics, and an updated integrated workflow description. Prototype demonstration. Delivery of prototype software to DARPA with necessary documentation. Results of the pilot HSR study and progress against Metrics 1, 2, and 3. Phase II IRB protocol submitted to and approved by the performer's local IRB, and Human Research Protections Officer (HRPO) review package completed and ready for submission.

Proposers interested in submitting a Direct to Phase II (DP2) proposal must provide documentation to substantiate that the scientific and technical merit and feasibility described above have been met, and to describe the potential military and/or commercial applications. Documentation should include all relevant information including, but not limited to: technical reports, test data, prototype designs/models, and performance goals/results.
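As an illustration of the 'schema' deliverable described above (translating simulated physical properties such as yaw, pitch, and roll into display commands), the following is a minimal sketch for a hypothetical four-pole galvanic vestibular stimulation device. The gain, safety limit, and per-axis mapping are illustrative assumptions only; real values would come from device calibration and human-subjects safety review, not from this sketch.

```python
from dataclasses import dataclass

# Illustrative safety ceiling for electrode current (mA); an actual device
# would derive this from its safety case, not from this sketch.
MAX_CURRENT_MA = 1.5

@dataclass
class AngularRates:
    """Simulated angular rates (deg/s) about the yaw, pitch, and roll axes,
    as streamed from a simulation engine such as Unity or ROS."""
    yaw: float
    pitch: float
    roll: float

def rates_to_gvs_currents(rates: AngularRates,
                          gain_ma_per_dps: float = 0.01) -> dict:
    """Map angular rates to bipolar electrode currents (mA), one per
    rotation axis, clamped to a conservative safety limit. The per-user
    gain is the quantity a calibration procedure would tune."""
    def clamp(x: float) -> float:
        return max(-MAX_CURRENT_MA, min(MAX_CURRENT_MA, x))
    return {
        "yaw": clamp(rates.yaw * gain_ma_per_dps),
        "pitch": clamp(rates.pitch * gain_ma_per_dps),
        "roll": clamp(rates.roll * gain_ma_per_dps),
    }
```

A fast roll rate saturates at the safety ceiling rather than scaling without bound, e.g. `rates_to_gvs_currents(AngularRates(yaw=30.0, pitch=0.0, roll=200.0))` clamps the roll channel to 1.5 mA.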
PHASE II: Phase II efforts should refine the prototype system developed in Phase I to: 1) enable rapid integration (e.g., 'plug and play') into a VR, AR, or other immersive simulation environment; 2) calibrate and optimize the system's fidelity and information-transfer schemas to ensure clarity and to maximize the display's capacity for information transfer; and 3) conduct controlled, well-powered human subjects research (HSR) to demonstrate efficacy for a performer-chosen use case or scenario. Performers are strongly encouraged to include multiple rounds of HSR over the course of Phase II rather than conducting one large study at the end of the Phase.

Schedule/Milestones/Deliverables: Phase II fixed milestones for this program should include:

• Month 1: For DP2 performers only: demonstration of the existing prototype.
• Month 2: Detailed workplan description outlining the prioritized refinements and improvements necessary to generate a Minimum Viable Product (MVP) for commercialization. The plan should specify measures of performance (MOPs) on the critical path for achieving system efficacy.
• Month 6: Report on progress and performance against metrics, including HSR results as applicable.
• Month 12: Report on progress and performance against metrics, including HSR results as applicable. Update on the commercialization plan and commercial engagement efforts. Demonstration of the updated capability, focusing on improvements since the end-of-Phase I demonstration (or the beginning-of-Phase II demonstration for DP2 performers).
• Month 18: Report on progress and performance against metrics, including HSR results as applicable.
• Month 24: Final Phase II report documenting the final display design, system specifications, software and documentation, and instructions for integration. Demonstration that someone who is not a member of the development team, and without help from the development team, can integrate the display into a simulation environment.
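The well-powered HSR requirement (and the power analysis requested in the Description) can be illustrated with a minimal sample-size calculation for a two-group comparison of means. This sketch uses the standard normal approximation to the two-sample t-test; a full proposal would use exact noncentral-t methods, and the effect sizes shown are placeholders, not program expectations.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size_d: float,
                alpha: float = 0.05,
                power: float = 0.80) -> int:
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means, via the normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
    where d is Cohen's standardized effect size."""
    z = NormalDist().inv_cdf  # standard-normal quantile function
    n = 2.0 * ((z(1.0 - alpha / 2.0) + z(power)) / effect_size_d) ** 2
    return ceil(n)

# A medium effect (d = 0.5) at alpha = .05 and 80% power needs ~63 per
# group; a large effect (d = 0.8) needs ~25 per group.
```

The calculation makes explicit why small pilot studies cannot substitute for the controlled Phase II studies: detecting a medium-sized improvement requires on the order of 60+ subjects per condition.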
PHASE III DUAL USE APPLICATIONS: The technologies developed under this SBIR program should have strong potential for direct commercialization or integration into more complex DoD and commercial systems, including ease of integration into Commercial Off-the-Shelf (COTS) VR, AR, and/or other simulation environments. As part of the commercialization plan, proposers should estimate the program goals for final unit cost and set-up/calibration time, including a justification that a market could and would support both figures.

REFERENCES:
1. Aoyama, K., Iizuka, H., Ando, H., & Maeda, T. (2015). Four-pole galvanic vestibular stimulation causes body sway about three axes. Scientific Reports, 5(1), 1-8.
2. Geyer, D. J., & Biggs, A. T. (2018). The persistent issue of simulator sickness in naval aviation training. Aerospace Medicine and Human Performance, 89(4), 396-405.
3. Groth, C., Tauscher, J. P., Heesen, N., Hattenbach, M., Castillo, S., & Magnor, M. (2022). Omnidirectional galvanic vestibular stimulation in virtual reality. IEEE Transactions on Visualization & Computer Graphics, (01), 1-1.
4. Kaplan, A. D., Cruit, J., Endsley, M., Beers, S. M., Sawyer, B. D., & Hancock, P. A. (2021). The effects of virtual reality, augmented reality, and mixed reality as training enhancement methods: A meta-analysis. Human Factors, 63(4), 706-726.
5. Li, X., Ma, Y., Choi, C., Ma, X., Chatterjee, S., Lan, S., & Hipwell, M. C. (2021). Electroadhesion-based haptics: Nanotexture shape and surface energy impact on electroadhesive human-machine interface performance. Advanced Materials, 33(31), 2170240.
6. Lu, J., Liu, Z., Brooks, J., & Lopes, P. (2021, October). Chemical haptics: Rendering haptic sensations via topical stimulants. In The 34th Annual ACM Symposium on User Interface Software and Technology (pp. 239-257).
7. Sra, M., Jain, A., & Maes, P. (2019, May). Adding proprioceptive feedback to virtual reality experiences using galvanic vestibular stimulation. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-14).
8. Teo, T., Nakamura, F., Sugimoto, M., Verhulst, A., Lee, G. A., Billinghurst, M., & Adcock, M. (2020). Feel it: Using proprioceptive and haptic feedback for interaction with virtual embodiment. In ACM SIGGRAPH 2020 Emerging Technologies (pp. 1-2).

KEYWORDS: human machine interface, display, augmented reality, virtual reality, simulation, training