
Improved Weather Sensor Analysis Algorithms via Machine Learning

Description:

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Trusted AI and Autonomy

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual-use items. Applicants must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s), in accordance with the Announcement. Applicants are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Air Force Weather develops, tests, fields, modernizes, and sustains fixed and deployed ground-based weather sensor systems at locations around the world. Recent and near-term upgrades to tactical and fixed-base sensors include added digital sky cameras, higher-resolution in-situ sensors, and data aggregation in a cloud-based platform. As an example of the current exploitation gap, the digital sky cameras are exploited only through manual, human-visual processes, and only informally. Employing machine learning to build upon existing weather sensor algorithms and to create new ones has great potential to provide additional local-area environmental intelligence, to streamline its production, and to increase the fidelity with which environmental impacts on planning and execution of operations are understood. This intelligence is cumulative and adds to a global understanding of the environment, including the accuracy and fidelity of regional and global weather models, both physics-based and machine learning-based.

DESCRIPTION:

What limitations and constraints must this solution operate under (e.g., nuclear certification)? Processing of sensor data at the local level is limited to non-server compute and embedded firmware processors. Processes and a technology stack will need to be established to optimally aggregate sensed data for machine learning training.

What is the minimum desired Technology Readiness Level (TRL)? TRL 3 (analytical and experimental critical function and/or characteristic proof of concept).

What resources are available (e.g., government data, additional funding, government equipment)? The AF Weather Virtual Private Cloud (AFW VPC) Continuous Integration/Continuous Delivery (CI/CD) tools and processes can be used for software development and deployment. The AFW VPC also hosts an MLOps platform that can be used for data curation, experimentation, model training, and model metrics. The Weather Engineering Facility at Hanscom AFB, MA, hosts all types of AF Weather ground-based sensors and can be leveraged for systems engineering and for evaluating the employment of machine learning algorithms. The government will supply additional supporting data, if available, upon request.
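As a purely illustrative sketch of the sky-camera exploitation gap described above, the Python snippet below shows one way a compact convolutional network could encode a digital sky-camera image into a cloud-cover (okta) class. The architecture, image size, and nine-class okta scheme are assumptions made for illustration only; they are not specified by this topic.

```python
# Minimal sketch: a compact CNN that maps one sky-camera image to a cloud-cover
# class (0-8 oktas). Layer sizes, input resolution, and the okta labeling are
# illustrative assumptions, not requirements of the topic.
import torch
import torch.nn as nn

class SkyCoverNet(nn.Module):
    """Small classifier sized with non-server (edge) inference in mind."""
    def __init__(self, num_classes: int = 9):  # 9 okta categories (0-8)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)     # (N, 64) feature vector
        return self.classifier(x)           # (N, num_classes) logits

if __name__ == "__main__":
    model = SkyCoverNet()
    frame = torch.randn(1, 3, 224, 224)     # stand-in for one RGB sky-camera frame
    logits = model(frame)
    print("predicted okta class:", logits.argmax(dim=1).item())
```

A deliberately small network is used here because the constraints above limit local processing to non-server compute; a larger backbone trained in the AFW VPC could serve the same role if it can be reduced for edge inference.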
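Because local processing is limited to non-server compute and embedded firmware processors, a fielded model would likely need to be compressed for edge inference. The following is a minimal sketch of one possible path, assuming a PyTorch-trained model: dynamic int8 quantization followed by TorchScript export. The stand-in model and file name are hypothetical.

```python
# Minimal sketch: shrinking a trained sensor-analysis model for the non-server,
# embedded compute constraint described above. The model, target runtime, and
# file name are assumptions for illustration only.
import torch
import torch.nn as nn

# Stand-in for a trained model (e.g., the SkyCoverNet sketch above).
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 9),
)
model.eval()

# Quantize Linear weights to int8 to reduce memory and compute footprint.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Trace to TorchScript so inference can run without a Python training stack.
example = torch.randn(1, 3, 224, 224)
scripted = torch.jit.trace(quantized, example)
scripted.save("sensor_model_int8.pt")
print("saved quantized TorchScript model to sensor_model_int8.pt")
```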
PHASE I: This topic is slated to compete as a Direct-to-Phase-II (D2P2) topic with no Phase I SBIR portion. Therefore, documentation and a feasibility demonstration of using machine learning to generate additional observational data from weather sensors, beyond current capabilities (such as the visual observation encoding techniques mentioned above), is paramount for consideration. Employing machine learning to augment the collection and fidelity of gathered data is also desired. Develop a conceptual design and approach for using machine learning to exploit the newer sensing capabilities and data. Deliverables for consideration include a report or presentation demonstrating the conceptual design, the machine learning implementation, and the benefit to current weather observation techniques for Phase II consideration.

PHASE II: Develop and demonstrate a proof-of-concept prototype system based on the preliminary research and designs presented for consideration.

PHASE III DUAL USE APPLICATIONS: Operationalize the prototype for existing tactical and fixed-base site sensor data.

REFERENCES:
1. Weather Machine Learning Platform (WxMLP): https://m.facebook.com/NextGenFed/photos/nextgen-was-selected-to-brief-at-the-recent-air-force-research-laboratory-afrl-a/4272594109440865/
2. AF Weather Web Services: https://weather.af.mil/

KEYWORDS: weather; observation; observing; modeling; environment; data; generation; Machine Learning; ML; Artificial Intelligence; AI; ground-based; sensor; cloud-based; exploitation; digital; visibility; sky; camera