
Unmanned Underwater Vehicle (UUV) Sensor Data Transformation Tool


OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Trusted AI and Autonomy

OBJECTIVE: Develop a software tool that transforms data collected by one sensor into realistic synthetic data representative of a different sensor.

DESCRIPTION: Modern platforms may carry a variety of sensors, such as sonars of varying frequencies, cameras, and magnetic sensors. Often, target information is collected by one or more sensors, but not by all of them. This can drive up the number of test runs against targets required to ensure that each sensor, and its associated algorithms, has an opportunity to collect target information in a variety of environmental conditions. The Navy seeks an innovative tool to transform sensor data and metadata from a given system or frequency into realistic synthetic data from another sensor. For example, the transformation tool should be capable of transforming real data collected by a side scan sonar (SSS) into synthetic data representative of other sensor modalities, such as an SSS of a different frequency, a forward-looking sonar (FLS), or an electro-optical camera. In addition, the tool should be able to reconstruct a synthetic target in different orientations, with varying degrees of burial, and with adjacent imagery in varying bottom types (e.g., complex, noncomplex). The information created with this synthetic data generation tool will be used to develop and train automatic target recognition (ATR) algorithms. Generated sensor data should use complex physics-based models and represent objects in subsea environments, including mine-like objects. Realistic synthetic data will improve ATR performance and operator responses, reduce operator uncertainty, and support better decision-making. Machine Learning (ML) synthesis tools can enable the development of realistic synthetic sonar for use in simulations. ML approaches are widely used in image and video processing applications, but a limiting factor is the availability of training data.
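The idea of transforming one sensor's output into another's can be illustrated with a deliberately simplified sketch. The snippet below remaps a side scan sonar intensity image from one frequency to another using a toy frequency-squared absorption model; the coefficient, the range geometry, and the function name are illustrative assumptions, not the physics-based models the solicitation calls for.

```python
def transform_sss_frequency(image, f_src_khz, f_dst_khz, range_m_per_px=0.1):
    """Toy remap of an SSS intensity image from one frequency to another.

    Assumes a simplified absorption model in which attenuation (dB/m) grows
    with frequency squared; a real tool would use full physics-based
    propagation and scattering models, not this illustrative coefficient.
    """
    alpha_src = 1e-4 * f_src_khz ** 2  # illustrative absorption, dB/m
    alpha_dst = 1e-4 * f_dst_khz ** 2
    out = []
    for row in image:
        new_row = []
        for px, intensity in enumerate(row):
            r = px * range_m_per_px  # crude slant range for this pixel
            # Remove the source-frequency two-way loss, apply the
            # target-frequency loss (difference expressed in dB).
            delta_db = 2.0 * r * (alpha_src - alpha_dst)
            new_row.append(intensity * 10 ** (delta_db / 10.0))
        out.append(new_row)
    return out
```

Transforming to a higher frequency attenuates far-range pixels more strongly, while the zero-range pixel is unchanged; a same-frequency "transform" is the identity, which makes the sketch easy to sanity-check.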
High-quality synthesis approaches that utilize ML can also provide an alternate means of creating the large volumes of training data needed to 'teach' a deep learning algorithm. Sensors of interest include acoustic, optical, and magnetic sensors. Solutions must also be compatible with, and easily applied within, Navy Expeditionary user displays and interfaces for conducting in-situ mission monitoring and post-mission analysis. The proposer will analyze sensors and data formats, and develop data transformation solutions capable of running on Nvidia Graphics Processing Units (GPUs), ensuring compatibility with user interfaces employed in legacy Navy Expeditionary UUV systems. Synthetically generated images and data should be quantifiably similar to real data produced by the target sensor in terms of acoustic and optical reflectivity, and magnetic moment. That is, synthetic data scored as 'similar' should have ATR outcomes representative of real sensor data. ATR performance will be measured in Phase II. Methodologies and metrics for similarity scoring are encouraged as components of validity test proposals.

PHASE I: Develop a concept for an innovative software solution capable of generating synthetic sensor data and metadata. During the Phase I base effort, the Government will provide a list of commercial sensor types, representative calibration targets, and display/metadata for sensors of interest to enable analysis of data structures and determination of data transformation feasibility and limits. Demonstrate the feasibility of the concept by confirming that a real data set can be transformed into synthetic data as if collected by other sensor modalities, including different sonar frequencies and types (e.g., FLS, gap-filling sonar (GFS), real aperture sonar (RAS), and synthetic aperture sonar (SAS)), magnetometers, and optical sensors.
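As one hedged illustration of the similarity scoring the topic invites, the sketch below compares the first-order intensity distributions of a real and a synthetic image via histogram intersection. The function name and binning are assumptions for illustration; a fielded metric would also need to capture texture, speckle statistics, and, per the topic, agreement of ATR outcomes.

```python
def histogram_intersection(real, synth, bins=16):
    """Score similarity of two intensity images (values in [0, 1]).

    Returns a value in [0, 1]; 1.0 means identical intensity histograms.
    This is a first-order sketch only: it ignores spatial structure,
    texture, and sensor-specific statistics.
    """
    def hist(img):
        flat = [v for row in img for v in row]
        h = [0] * bins
        for v in flat:
            h[min(int(v * bins), bins - 1)] += 1  # clamp v == 1.0 to top bin
        n = len(flat)
        return [c / n for c in h]

    # Histogram intersection: sum of per-bin minima of the two distributions.
    return sum(min(a, b) for a, b in zip(hist(real), hist(synth)))
```

An image scored against itself yields 1.0, and two images with disjoint intensity ranges yield 0.0, which gives a quick sanity check on any candidate metric before it is tied to ATR outcomes.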
The Phase I Option, if exercised, will include initial design specifications and a capabilities description to build a prototype in Phase II.

PHASE II: Develop, demonstrate, and deliver a software prototype system capable of creating synthetic sensor data and metadata from the sensors identified in Phase I for testing and evaluation; the prototype system will be compatible with, and suited to, future integration as a module of the Common Operator Interface, Navy (COIN)/NEXUS user interface. Develop the prototype sensor transformation tool. Synthetic data generated with this tool will be evaluated against baseline ATR algorithms to determine whether it meets the Navy performance goals described in the Phase II SOW. Use operationally representative data for the demonstration. Identify performance and technical requirements to be met during evaluation. Prepare a Phase III development plan to transition the technology for Navy and other potential commercial use. In Phase II, develop and demonstrate performance of a prototype software module incorporating the proposer's technical solution for synthetic data generation for sonar, optical, and magnetic moment data. The transformed data set build time threshold is 3:1, with an objective of 24 hours or less, for distribution to fleet operators or other programs.

PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the successfully matured technology as a software module of the Common Operator Interface, Navy (COIN)/NEXUS user interface, as a component of Navy Expeditionary UUV systems. The contractor will provide technical support to troubleshoot, refine, and adapt the Phase II prototype deliverable as the Navy conducts the full range of testing and evaluation of the module as a software upgrade to the UUV systems.
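The evaluation criterion that synthetic data should produce ATR outcomes representative of real data can be framed as a per-target outcome comparison. The sketch below is a hypothetical scoring helper under that framing, not a prescribed Navy metric: it assumes an ATR has been run on matched real and synthetic versions of the same target inventory.

```python
def atr_outcome_parity(real_detections, synth_detections):
    """Compare ATR detection outcomes on matched real vs. synthetic data.

    Each argument is a list of booleans (detected / missed) over the same
    target inventory. Returns (real_pd, synth_pd, agreement), where
    agreement is the fraction of targets with matching per-target outcomes.
    A high-fidelity synthesis tool should drive agreement toward 1.0.
    """
    assert len(real_detections) == len(synth_detections)
    n = len(real_detections)
    real_pd = sum(real_detections) / n    # probability of detection, real set
    synth_pd = sum(synth_detections) / n  # probability of detection, synthetic
    agreement = sum(r == s for r, s in
                    zip(real_detections, synth_detections)) / n
    return real_pd, synth_pd, agreement
```

Reporting agreement alongside the two detection rates distinguishes a tool whose synthetic data merely matches the aggregate detection rate from one whose data drives the same per-target decisions.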
Navy testing and evaluation will include a comparative analysis of real-world data sets collected by the Navy, in which actual representative targets have been placed, against synthetically generated data sets that add the targets to the environment. ATR algorithms will be run on both data sets to verify the effectiveness of the synthetic data generation tool. Refinements by the contractor during Phase III may include, but are not limited to, software certification for cybersecurity compliance and the addition or improvement of features and attributes to enhance user interfaces. Application of the product may reasonably be expected to extend to commercial contexts such as automated object recognition for autonomous underwater vehicles (AUVs) and remotely operated vehicles (ROVs) engaged in maritime salvage and field inspection for the oil and gas industry.

REFERENCES:
1. Walls, Brad. "Using the Unreal Engine as a High Fidelity Simulation for Data Creation." The Fifth Annual Workshop on Naval Applications of Machine Learning, Underline Science Inc., 23 March 2021.
2. Ilisescu, Corneliu et al. "Responsive Action-based Video Synthesis." Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, May 6-11, 2017, pp. 6569-6580.
3. Wang, Ting-Chun et al. "Video to Video Synthesis." NIPS Proceedings, 2018.
4. You, Xinge et al. "Kernel Learning for Dynamic Texture Synthesis." IEEE Transactions on Image Processing, Vol. 25, No. 10, October 2016.
5. Dosovitskiy, Alexey and Brox, T. "Generating Images with Perceptual Similarity Metrics Based on Deep Networks." NIPS Proceedings, 2016.

KEYWORDS: Mine Countermeasures; MCM; Synthetic Data; Software; Unmanned Undersea Vehicles; Mines; Navy Expeditionary; Collection of target information on UUVs