OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Advanced Computing and Software
The technology within this topic is restricted under the International Traffic in Arms Regulations (ITAR), 22 CFR Parts 120-130, which control the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulations (EAR), 15 CFR Parts 730-774, which control dual-use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under U.S. export control laws.
OBJECTIVE: The primary objective is to develop a robust and reliable visual navigation system that ensures uninterrupted drone operations in environments where GPS signals are unavailable, degraded, or subject to jamming. The proposed technology should use advanced computer vision algorithms to analyze visual data from the drone's onboard camera, detect and recognize skylines and terrain features, and match these features against a preprocessed repository of georeferenced satellite imagery. The system must deliver geolocation accuracy within five meters, ensuring mission success in challenging operational scenarios.
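For illustration only, the sketch below shows one way the core matching step could be implemented: extracting features from an onboard camera frame and registering them against a georeferenced satellite tile to produce a latitude/longitude estimate. The GeoTile structure, the choice of ORB features, and all parameter values are assumptions made for this sketch, not a prescribed design.

```python
# Minimal sketch of satellite-reference feature matching, assuming a
# precomputed database of ORB descriptors for georeferenced, north-up
# satellite tiles. GeoTile and all thresholds are illustrative.
import cv2
import numpy as np

class GeoTile:
    """A georeferenced satellite tile: feature data plus pixel->lat/lon mapping."""
    def __init__(self, keypoints_xy, descriptors, origin_latlon, meters_per_px):
        self.keypoints_xy = keypoints_xy    # Nx2 pixel coordinates in the tile
        self.descriptors = descriptors      # Nx32 ORB descriptors (uint8)
        self.origin_latlon = origin_latlon  # (lat, lon) of tile pixel (0, 0)
        self.meters_per_px = meters_per_px  # ground sample distance

def estimate_position(frame_gray, tile, min_inliers=15):
    """Match one onboard frame against one tile; return (lat, lon) or None."""
    orb = cv2.ORB_create(nfeatures=2000)
    kps, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc, tile.descriptors)
    if len(matches) < min_inliers:
        return None
    src = np.float32([kps[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([tile.keypoints_xy[m.trainIdx] for m in matches]).reshape(-1, 1, 2)
    # Robustly register the frame to the tile; RANSAC rejects bad matches.
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None or int(mask.sum()) < min_inliers:
        return None
    # Project the frame center into tile pixel coordinates, then to lat/lon.
    h, w = frame_gray.shape
    center = cv2.perspectiveTransform(np.float32([[[w / 2, h / 2]]]), H)[0, 0]
    lat0, lon0 = tile.origin_latlon
    # ~111,320 m per degree of latitude; longitude scaled by cos(latitude).
    lat = lat0 - center[1] * tile.meters_per_px / 111_320.0
    lon = lon0 + center[0] * tile.meters_per_px / (111_320.0 * np.cos(np.radians(lat0)))
    return lat, lon
```

A fielded system would search many candidate tiles (seeded by the last known position and dead reckoning) and fuse successive fixes over time; the registration step above is where the five-meter geolocation requirement is ultimately met or missed.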
DESCRIPTION: This topic seeks to develop a software-only visual position and navigation capability using computer vision, tailored for deployment on commercial off-the-shelf (COTS) drones operating in GPS-denied environments. The desired solution should leverage existing cameras, storage, and computational resources on these drones to provide accurate, real-time navigation and positioning without the need for additional hardware.
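As a rough feasibility check on the existing-storage constraint, the back-of-the-envelope calculation below sizes a precomputed feature repository for a notional area of operations; every number in it (coverage area, imagery resolution, features per tile) is an illustrative assumption rather than a topic requirement.

```python
# Back-of-the-envelope sizing for an onboard satellite feature repository.
# All values are illustrative assumptions, not specified by the topic.
AREA_KM2 = 100 * 100          # 100 km x 100 km area of interest
GSD_M = 1.0                   # ground sample distance of source imagery, m/pixel
TILE_PX = 1024                # square tile edge in pixels
FEATURES_PER_TILE = 2000      # ORB keypoints retained per tile
BYTES_PER_FEATURE = 32 + 8    # 32-byte ORB descriptor + 2 float32 pixel coords

tile_edge_m = TILE_PX * GSD_M
tiles = AREA_KM2 * 1_000_000 / tile_edge_m**2
repo_bytes = tiles * FEATURES_PER_TILE * BYTES_PER_FEATURE
print(f"{tiles:.0f} tiles, ~{repo_bytes / 1e9:.1f} GB of feature data")
# ~9,537 tiles and ~0.8 GB under these assumptions.
```

Under these assumptions the repository occupies well under a gigabyte, consistent with the premise that COTS drone storage suffices without additional hardware.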
PHASE I: To substantiate that the proposer's technology is at an acceptable stage to award a Direct to Phase 2 (D2P2) contract, a previously completed feasibility study is expected. This study should have demonstrated the technology's ability to address the topic's key requirements, including compatibility with a wide range of COTS drones; terrain feature detection and matching; data security and resilience against cyber threats; and validation of the approach through simulations and field tests.
PHASE II: Develop, demonstrate, and validate the software-only visual position and navigation capability described above on representative COTS drones operating in GPS-denied environments, leveraging the platforms' existing cameras, storage, and computational resources to provide accurate, real-time navigation and positioning without additional hardware.
Deploying a visual navigation system on COTS drones significantly enhances the operational capabilities of the Air Force by providing a resilient alternative to GPS-based navigation. This software solution allows for rapid integration across various drone platforms, eliminating the need for specialized hardware modifications. The capability to maintain accurate positioning and navigation in GPS-denied environments is crucial for reconnaissance, surveillance, and logistics missions, particularly in contested or remote areas. By leveraging existing drone sensors and computing power, the proposed technology ensures cost-effective scalability and operational flexibility.
The proposed solution must be compatible with a wide range of COTS drones, utilizing their onboard cameras and computational resources to minimize additional weight and power consumption. The system should employ machine learning and computer vision techniques to achieve terrain feature detection and matching. It must be capable of operating under diverse environmental conditions, including urban canyons, dense foliage, and varied lighting. Additionally, the software should provide easy integration through an API, supporting rapid deployment and updates, and ensure data security and resilience against cyber threats. The solution should demonstrate feasibility through simulations and field tests, showcasing the system's performance and reliability in relevant operational scenarios, as well as integration with the Android Tactical Assault Kit (ATAK).
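ATAK consumes position reports as Cursor-on-Target (CoT) events, so one plausible integration path is for the navigation software to publish its vision-derived fixes as CoT messages on the local tactical network. The sketch below assumes ATAK's default situational-awareness multicast endpoint and an illustrative unit ID; a real integration would follow the receiving TAK network's configuration.

```python
# Minimal sketch of pushing a vision-derived fix to ATAK as a CoT event
# over UDP. The uid, event type, how code, and endpoint are illustrative.
import socket
from datetime import datetime, timedelta, timezone

def cot_event(uid, lat, lon, hae_m, ce_m, stale_s=60):
    """Build a CoT XML position event; ce_m is the estimated circular error (m)."""
    now = datetime.now(timezone.utc)
    fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
    # type "a-f-A-M-F-Q" denotes a friendly UAV in the CoT type taxonomy.
    return (
        f'<event version="2.0" uid="{uid}" type="a-f-A-M-F-Q" how="m-g" '
        f'time="{now.strftime(fmt)}" start="{now.strftime(fmt)}" '
        f'stale="{(now + timedelta(seconds=stale_s)).strftime(fmt)}">'
        f'<point lat="{lat:.7f}" lon="{lon:.7f}" hae="{hae_m:.1f}" '
        f'ce="{ce_m:.1f}" le="9999999.0"/>'
        "<detail/></event>"
    ).encode("utf-8")

# Send one fix to ATAK's default SA multicast group (239.2.3.1:6969).
msg = cot_event("VISNAV-UAS-01", 39.7392, -104.9903, hae_m=1625.0, ce_m=5.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("239.2.3.1", 6969))
```

The CoT ce (circular error) attribute gives the system a natural place to report its estimated accuracy against the five-meter objective with every fix.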
PHASE III DUAL USE APPLICATIONS: The expected Phase III effort for this project would involve further development, testing, and refinement of the software-only visual position and navigation capability using computer vision. This would entail optimizing the software to leverage existing cameras, storage, and computational resources on commercial off-the-shelf (COTS) drones, ensuring compatibility with a wide range of drone platforms. The software would need to employ machine learning and computer vision techniques to achieve terrain feature detection and matching, with the ability to operate under diverse environmental conditions, such as urban canyons, dense foliage, and varied lighting. Additionally, the software should provide easy integration through an API, supporting rapid deployment and updates, and ensure data security and resilience against cyber threats.
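To make the API expectation concrete, the fragment below sketches one possible shape for that integration surface; all names and types are hypothetical, intended only to illustrate the kind of contract an autopilot or mission system might code against.

```python
# Hypothetical navigation API surface; names and fields are illustrative.
from dataclasses import dataclass
from typing import Optional, Protocol
import numpy as np

@dataclass
class PositionFix:
    lat: float          # degrees
    lon: float          # degrees
    hae_m: float        # height above ellipsoid, meters
    ce_m: float         # estimated circular error, meters (topic target: <= 5 m)
    timestamp_s: float  # UNIX time of the source camera frame

class VisualNavigator(Protocol):
    def load_reference(self, tile_db_path: str) -> None:
        """Load the preprocessed satellite feature repository from onboard storage."""
    def update(self, frame: np.ndarray, timestamp_s: float) -> Optional[PositionFix]:
        """Process one camera frame; return a fix, or None if matching fails."""
```

A stable, minimal surface of this kind is what would allow the same software to be deployed across heterogeneous COTS platforms with rapid updates.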
The expected TRL at Phase III entry would be approximately TRL 6-7, indicating that the technology has been demonstrated in a relevant environment and that a prototype is approaching demonstration in an operational environment. This would entail successful integration of the software with various COTS drone platforms, as well as compatibility with the Android Tactical Assault Kit (ATAK).
In terms of transition planning, the project would need to address regulatory compliance, such as ensuring the software adheres to data privacy and security regulations. Additionally, the project would need to consider the development of a business or transition plan, outlining the strategy for commercialization or broader adoption of the technology. This would include identifying potential markets, partners, and customers, as well as a plan for ongoing support, maintenance, and updates to the software. Furthermore, collaboration with drone manufacturers and operators would be crucial to ensure seamless integration and adoption of the technology.
Potential commercial and private industry applications for the proposed technology include precision navigation, reconnaissance, search and rescue, and commercial vision metadata tagging. The technology could guide autonomous drones for surveillance and tactical support in military operations, improving situational awareness and mission success. For first responders, it could monitor and guide autonomous search and rescue equipment, improving safety and efficiency. In commercial vision metadata tagging, location accuracy within a few meters is required for applications such as image geotagging and object tracking. By addressing these needs, the proposed technology has the potential to be a viable solution for various industries, providing a resilient alternative to GPS-based navigation. Improved vision and localization without GPS could also yield significant energy savings, reduce costs, and allow drones to avoid sensitive areas such as airports or to observe altitude restrictions required for legal compliance.
REFERENCES:
1. J. Kim, T. Gregory, J. Freeman and C. M. Korpela, "System-of-Systems for Remote Situational Awareness: Integrating Unattended Ground Sensor Systems with Autonomous Unmanned Aerial System and Android Team Awareness Kit," SPIE Defense + Security, Baltimore, Maryland, United States, 2014, pp. 90750A-90750A-12.
2. F. Cappello, S. Ramasamy and R. Sabatini, "A low-cost and high performance navigation system for small RPAS applications," Aerospace Science and Technology, vol. 58, pp. 529-545, 2016, doi: 10.1016/j.ast.2016.09.017.
3. A. Appleget, J. Watson, J. Gray and C. Taylor, "Navigating a sUAS without GNSS," Inside GNSS, May 29, 2023. [Online]. Available: https://insidegnss.com/navigating-a-suas-without-gnss/
4. J. Kim, K. Lin, S. M. Nogar, D. Larkin and C. M. Korpela, "Detecting and Localizing Objects on an Unmanned Aerial System (UAS) Integrated with a Mobile Device," 2021 International Conference on Computing, Networking and Communications (ICNC), San Diego, CA, USA, 2021, pp. 546-550.
5. M. Uijt de Haag, S. Huschbeck and J. Huff, "sUAS Swarm Navigation using Inertial, Range Radios and Partial GNSS," 2019 IEEE/AIAA 38th Digital Avionics Systems Conference (DASC), San Diego, CA, USA, 2019, pp. 1-8, doi: 10.1109/DASC43721.2019.9091029.
6. La and M. Matson, "ATAK Integration through ROS for Autonomous Air-ground Team," 2021 IEEE International Systems Conference (SysCon), Vancouver, BC, Canada, 2021, pp. 1-5, doi: 10.1109/SysCon48628.2021.9476676.
KEYWORDS: sUAS, Alt-PNT, Computer Vision on sUAS