
Computational Analysis of Event Camera Imagery for Propellant Testing


Funding Agency

DOD

ARMY

Year: 2025

Topic Number: A254-003

Solicitation Number: 25.4

Tagged as: SBIR, BOTH

Solicitation Status: Open

NOTE: The solicitations and topics listed on this site are copies from the various SBIR agency solicitations and are not necessarily the latest and most up-to-date. For this reason, you should use the agency link listed below, which will take you directly to the appropriate agency server, where you can read the official version of this solicitation and download the appropriate forms and rules.

View Official Solicitation

Release Schedule

  1. Release Date
    November 6, 2024

  2. Open Date
    October 2, 2024

  3. Due Date(s)

  4. Close Date
    November 20, 2024

Description

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Advanced Computing and Software; Advanced Materials

OBJECTIVE: The objective of this effort is to develop a hardware and software solution to capture, process, and measure key parameters of chaotic, high-speed propellant development experiments in challenging real-world lighting conditions. This effort is important because it will lead to advancements in propelling charge design that will enable greater range for artillery weapon systems in development while simultaneously increasing both the lethality and the safety of Soldiers utilizing propellants designed with this technology.

DESCRIPTION:

• Develop a new event camera hardware/software system that can capture and display propellant testing data at a temporal resolution of 0.01 ms or finer over a 0.5-second interval in high dynamic range lighting conditions (100 dB or more).

• Develop software that can fuse event camera data with high-speed camera data and utilize supervised and unsupervised AI/ML tools to perform complex 3D image analysis to assess and evaluate initial ignition phenomena and combustion propagation in a translucent acrylic combustion case.

• Develop software with a graphical user interface that can visualize and generate reports of the assessment and evaluation of propellant testing.

PHASE I: This topic is only accepting Direct to Phase II (DP2) proposals for a cost up to $2,000,000 for a 24-month period of performance. Proposers interested in submitting a DP2 proposal must provide documentation to substantiate that scientific and technical merit and feasibility equivalent to a Phase I project have been met. Documentation can include data, reports, specific measurements, success criteria of a prototype, etc.

(DIRECT TO) PHASE II:

1. Demonstrate the use of event camera(s) to capture imagery of a rapidly changing, chaotic event and the production of metrics that support the analysis of the event. Examples of event camera data collection for events occurring in high dynamic range lighting conditions are highly desirable.

2. Demonstrate the use of high-speed video with sampling rates greater than or equal to 5,000 (Threshold) / 10,000 (Objective) frames per second to capture imagery of a rapidly changing event and the production of metrics that support the analysis of the event. Examples of tools developed for efficient and effective processing of high-speed video to generate metrics for analysis are highly desirable.

3. Demonstrate the fusing of different types of sensor technologies (e.g., electro-optical and infrared sensors) to utilize the strengths of each sensor technology to produce an enhanced data product not attainable from a single sensor type alone.

4. Demonstrate the use of AI/ML processes to effectively and efficiently extract data from video-based imagery. Examples must include the development of the AI/ML tools, the efficiencies gained, and the process flow that a qualified user with similar resources would need to follow to obtain similar results.

PHASE III DUAL USE APPLICATIONS:

• Autonomous Vehicles/Drones & Industrial: self-driving cars and commercial drone surveillance.

• Scientific Research: studying high-speed phenomena in fields like fluid dynamics or particle physics, and analyzing rapid human movements (biomechanics) for scientific study.

• Medical Imaging & Consumer/Mobile Electronics: utilizing low-light and high dynamic range imaging to improve image quality in minimally invasive procedures or through HDR technology on personal mobile devices.

• Robotics & AR/VR: 3D object detection and tracking.

REFERENCES:

1. Nico Messikommer et al., "Data-driven Feature Tracking for Event Cameras," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, 2023. https://arxiv.org/abs/2211.12826

2. Daniel Gehrig et al., "Low-latency automotive vision with event cameras," Nature, May 2024. https://www.nature.com/articles/s41586-024-07409-w

3. H. Rebecq et al., "High speed and high dynamic range video with an Event Camera," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019. https://rpg.ifi.uzh.ch/docs/TPAMI19_Rebecq.pdf

KEYWORDS: Event Camera; Artificial Intelligence; Sensor Fusion
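The sensor-fusion requirement above hinges on one recurring step: event cameras emit an asynchronous stream of per-pixel events, while high-speed video arrives at a fixed frame rate, so the two must be time-aligned before joint analysis. The sketch below is a minimal, hypothetical illustration of that step only, not part of the solicitation; the function names, 10 µs bin width (matching the 0.01 ms figure above), and list-of-lists frame layout are all assumptions. Events are summed by polarity into fixed-interval "event frames," and each timestamp is matched to the nearest high-speed video frame.

```python
from bisect import bisect_left

def accumulate_events(events, width, height, bin_us=10):
    """Sum event polarities per pixel within fixed time bins.

    events: list of (timestamp_us, x, y, polarity) tuples, sorted by time.
    Returns one 2D frame (height x width) per bin_us-wide interval.
    """
    if not events:
        return []
    t0 = events[0][0]
    frames = {}
    for t_us, x, y, pol in events:
        idx = (t_us - t0) // bin_us
        frame = frames.setdefault(idx, [[0] * width for _ in range(height)])
        frame[y][x] += 1 if pol else -1  # ON events add, OFF events subtract
    return [frames.get(i, [[0] * width for _ in range(height)])
            for i in range(max(frames) + 1)]

def nearest_video_frame(frame_times_us, t_us):
    """Index of the high-speed video frame closest in time to t_us."""
    i = bisect_left(frame_times_us, t_us)
    if i == 0:
        return 0
    if i == len(frame_times_us):
        return len(frame_times_us) - 1
    before, after = frame_times_us[i - 1], frame_times_us[i]
    return i if after - t_us < t_us - before else i - 1

# Example: four events over 25 us on a 3x3 sensor, paired with
# 5,000 fps video (one frame every 200 us).
events = [(0, 1, 1, 1), (4, 1, 1, 1), (12, 0, 0, 0), (23, 2, 2, 1)]
frames = accumulate_events(events, width=3, height=3)
print(len(frames))            # 3 bins: [0,10), [10,20), [20,30) us
print(frames[0][1][1])        # 2: two ON events at pixel (1,1) in bin 0
video_times = [0, 200, 400]   # 5,000 fps timestamps in microseconds
print(nearest_video_frame(video_times, 150))  # 1
```

In a real system the binned event frames and the matched video frames would feed a downstream fusion or AI/ML stage; this sketch only shows the temporal-alignment bookkeeping that makes such fusion possible.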