Intelligent fly-by-feel systems for autonomous aircraft

Award Information
Agency: Department of Defense
Branch: Air Force
Contract: FA8649-21-P-1617
Agency Tracking Number: FX20D-TCSO1-0161
Amount: $749,999.00
Phase: Phase II
Program: STTR
Solicitation Topic Code: AFX20D-TCSO1
Solicitation Number: X20.D
Solicitation Year: 2020
Award Year: 2021
Award Start Date (Proposal Award Date): 2021-08-09
Award End Date (Contract End Date): 2022-11-09
Small Business Information
835 Stewart Drive
Sunnyvale, CA 94085-1111
United States
DUNS: 043688410
HUBZone Owned: No
Woman Owned: Yes
Socially and Economically Disadvantaged: Yes
Principal Investigator
 Amrita Kumar
 (408) 745-1188
Business Contact
 Irene Li
Phone: (408) 745-1188
Research Institution
 Stanford University
 Michael Hitchcock
382 Via Pueblo Mall
Stanford, CA 94305-4060
United States
 (650) 721-6910
 Nonprofit College or University

Future intelligent autonomous vehicles such as the Orb/eVTOL/UAM (Electric Vertical Takeoff and Landing / Urban Air Mobility) vehicles will be able to "feel," "think," and "react" in real time by incorporating high-resolution state-sensing, awareness, and self-diagnostic capabilities. They will sense and observe phenomena at unprecedented length and time scales, allowing superior performance in complex dynamic environments, safer operation, reduced maintenance costs, and complete life-cycle management. Despite the importance of vehicle state sensing and awareness, however, the current state of the art is primitive as well as prohibitively heavy, expensive, and complex. New Fly-by-Feel technologies are therefore required for the next generation of intelligent aerospace structures: technologies that use AI to sense environmental conditions and structural state, and to interpret the sensing data effectively so that real-time state awareness and appropriate self-diagnostics are achieved under varying operational environments.

Acellent is teaming with Stanford University, the USAF, and The Boeing Company in this STTR project to develop a Fly-by-Feel (FBF) autonomous system that significantly enhances the agility of drones. The approach integrates a nerve-like, stretchable, multimodal sensor network directly on the wings, together with AI-based state-sensing and health-diagnostic software, to mimic the sensory systems of biological flyers such as birds. Once the network is integrated with the wings, the distributed sensor data will be collected and processed in real time through AI-based diagnostics for flight-state estimation in terms of lift, drag, flutter, angle of attack, and component damage or failure, so that the system can interface with the flight controller to significantly enhance the maneuverability and survivability of the vehicle.

Phase I focused on manufacturing, integrating, and testing the network in a laboratory environment. The Phase II program will mature the technology to TRL 5-6 via integration with a UAV, culminating in wind-tunnel testing of the complete UAV across its flight regime.

* Information listed above is at the time of submission. *