
Automated Environmental and Biological Threat Identification System

Description:

TECHNOLOGY AREA(S): Info Systems, Bio Medical 

OBJECTIVE: Develop a handheld platform for real-time identification of a wide range of insect, plant, and reptile (e.g., snake) species that may be found in DoD areas of operation. PROPOSALS ACCEPTED: Phase I and DP2. Please see the 17.1 DoD Program Solicitation and the DARPA 17.1 Direct to Phase II Instructions for DP2 requirements and proposal instructions. 

DESCRIPTION: There is a DoD need to provide warfighters with tools to support identification of various environmental and biological threats that may be found in their areas of operation. Early identification of potentially harmful insects, plants, and reptiles in the surrounding environment could help improve safety and protect the physical health of DoD-associated personnel. Among other things, such identification tools could help stop or reduce disease outbreaks or infestations and direct warfighters away from poisonous plants such as the common Toxicodendron radicans (poison ivy). Complicating current efforts is the fact that the number of species of interest in a given area is immense, and few experts, if any, have the training to correctly identify all potential threats across varied geographic regions. No lightweight, accessible, commonly used solution to the identification problem currently exists. This SBIR topic will support the development of an automated visual recognition and identification system that will (1) provide the image processing capability necessary to characterize user-submitted pictures of insects, plants, and reptiles, (2) correctly identify dangers, and (3) provide users with relevant and sufficient information to allow for informed decision-making. Importantly, the image processing algorithms and information databases will be contained on a small (e.g., thumb-drive-sized) device that interfaces directly with smartphones or similarly ubiquitous technology with image-acquisition capabilities, so that identification can take place on the spot in environments without internet connectivity. In addition to this remote capability, devices will also link wirelessly to a public, web-accessible master database that synchronizes with all devices in the field when connections are available. The master database of field-imaged data will be annotated by subject matter experts who can provide the necessary information. As the database grows with contributions from users, machine learning techniques will be used to improve identification capabilities. Work in this area will benefit from recent advances in machine learning, image processing, and visual bioinformatics that allow for rapid and automated insect, plant, and reptile identification.
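As a concrete illustration of the offline, on-device identification step described above, the following minimal Python sketch shows how a pretrained convolutional classifier could score a user-submitted photograph without any network connection. The model file (species_classifier.pt), label file (species_labels.json), and the 0.90 confidence threshold are illustrative assumptions, not specifications from this topic:

    # Minimal sketch of on-device species identification, assuming a
    # convolutional classifier has been trained offline and exported to
    # the device. Model path, label file, and threshold are hypothetical.
    import json

    import torch
    from PIL import Image
    from torchvision import transforms

    # Standard ImageNet-style preprocessing; a fielded pipeline would match
    # whatever normalization the deployed model was trained with.
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def identify(image_path: str, model: torch.nn.Module, labels: list[str],
                 threshold: float = 0.90) -> tuple[str, float]:
        """Classify a user-submitted photograph; return (species, confidence)."""
        image = Image.open(image_path).convert("RGB")
        batch = preprocess(image).unsqueeze(0)  # add batch dimension
        with torch.no_grad():
            probs = torch.softmax(model(batch), dim=1).squeeze(0)
        confidence, index = probs.max(dim=0)
        # Report an explicit "unknown" below the threshold rather than an
        # overconfident label.
        if confidence.item() < threshold:
            return "unknown - refer to database", confidence.item()
        return labels[index.item()], confidence.item()

    if __name__ == "__main__":
        # Hypothetical artifacts shipped on the thumb-drive-sized device.
        model = torch.jit.load("species_classifier.pt").eval()
        labels = json.load(open("species_labels.json"))
        print(identify("field_photo.jpg", model, labels))

Falling back to an explicit "unknown" result reflects the topic's emphasis on giving users relevant and sufficient information for informed decision-making rather than a forced, possibly wrong, identification.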

PHASE I: Create a small memory storage device that interfaces with common imaging-capable smartphones or similarly sized battlefield equipment, and an associated app that provides access to the device camera and read/write capability. Build the infrastructure necessary to support a web-accessible database of plant, insect, and reptile images. Ensure that the mobile app and storage device can synchronize with the master database, and demonstrate this synchronizing capability by uploading/downloading images from/to various devices. Populate the initial database with existing images from reputable university and museum collections. Create the necessary image processing algorithms and demonstrate positive identification (>90% success rate) of plants, insects, and reptiles from additional high-quality, complete photographs not in the original dataset. Establish what information would be relevant to users and provide it in an easily distilled format. Phase I deliverables will include detailed designs of the memory storage device and a working prototype; app source code; algorithms; a web-accessible, curated database of plant, insect, and reptile images; and a final report that includes a detailed and clear description of the algorithms implemented, justifications for choices made with respect to user-relevant information, demonstration test data, and preliminary performance results. For topic SB171-002 ONLY, DARPA will accept proposals for work and cost up to $225,000 for Phase I. The preferred structure is a $175,000, 12-month base period and a $50,000, 4-month option period. Alternative structures may be accepted if sufficient rationale is provided.
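The device-to-master-database synchronization step could be prototyped along the lines sketched below. This is a hedged sketch only: the REST endpoint (https://example.mil/api), the local SQLite schema (captures, annotations, and meta tables), and all field names are hypothetical placeholders, not part of the solicitation:

    # Illustrative sketch of the device/master-database synchronization
    # step. Endpoint URL, database schema, and field names are assumptions.
    import sqlite3

    import requests

    API_BASE = "https://example.mil/api"  # hypothetical master-database endpoint

    def sync(db_path: str) -> None:
        """Push locally captured images, then pull expert annotations."""
        conn = sqlite3.connect(db_path)
        cur = conn.cursor()

        # 1. Upload any field images captured while offline.
        cur.execute("SELECT id, image, lat, lon FROM captures WHERE uploaded = 0")
        for row_id, image_blob, lat, lon in cur.fetchall():
            resp = requests.post(f"{API_BASE}/images",
                                 files={"image": image_blob},
                                 data={"lat": lat, "lon": lon},
                                 timeout=30)
            resp.raise_for_status()
            cur.execute("UPDATE captures SET uploaded = 1 WHERE id = ?", (row_id,))

        # 2. Download expert annotations newer than the last sync marker.
        last = cur.execute("SELECT value FROM meta WHERE key = 'last_sync'").fetchone()
        resp = requests.get(f"{API_BASE}/annotations",
                            params={"since": last[0] if last else 0}, timeout=30)
        resp.raise_for_status()
        for record in resp.json():
            cur.execute(
                "INSERT OR REPLACE INTO annotations (id, species, notes) VALUES (?, ?, ?)",
                (record["id"], record["species"], record["notes"]))
        conn.commit()
        conn.close()

Separating the upload and download passes lets the device make partial progress over intermittent battlefield connectivity: images already pushed are marked as uploaded and are not resent on the next attempt.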

PHASE II: Expand the image library by compiling additional images taken from various locations. Create machine learning algorithms that improve the identification success rate to >95%. Demonstrate positive identification of the same sample species under a range of lighting conditions and backgrounds. Demonstrate successful identification using photographs of samples that are incomplete (e.g., insect samples with wings missing or legs broken). Characterize the robustness of the device to various environmental conditions (e.g., heat, water). Required Phase II deliverables will include additional source code for machine learning algorithms and other software components added since Phase I; an expanded database with geographic information included (if not previously included); and a final report that includes a description of any changes made to the database, demonstration of the success rate improvement, and characterization of device performance under the specified environmental conditions. The report should also include a discussion of the potential to expand the scope of the technology to cover fish, birds, and other wildlife that, due to their sensitivity to physical, chemical, and biological threats, may provide indicators of harmful environmental conditions.
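One plausible route to the lighting- and occlusion-robustness goals above is training-time data augmentation. The short sketch below uses standard torchvision transforms: ColorJitter approximates varied lighting conditions, and RandomErasing crudely simulates incomplete specimens (e.g., missing wings or broken legs). All hyperparameter values are illustrative assumptions:

    # Sketch of a training-time augmentation pipeline aimed at the Phase II
    # robustness goals; hyperparameters are illustrative, not specified.
    from torchvision import transforms

    train_transforms = transforms.Compose([
        transforms.Resize(256),
        transforms.RandomCrop(224),
        # Approximate varied field lighting conditions.
        transforms.ColorJitter(brightness=0.5, contrast=0.4, saturation=0.3),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
        # Occlude a random patch to mimic damaged or partially visible samples.
        transforms.RandomErasing(p=0.5, scale=(0.02, 0.2)),
    ])

Training against such perturbations is a common way to keep accuracy from degrading on imperfect field photographs, complementing the expanded image library collected from varied locations.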

PHASE III: A successful mobile platform for real-time insect, plant, and reptile species identification has significant potential to transition rapidly to the commercial sector for use in DoD and industrial applications. Users in various environments and in a wide variety of roles, including Environmental Science/Engineering Officers in the field, stand to benefit from the support such a platform will provide in assessing potential environmental and biological risks.

REFERENCES: 

1: Larios, N., Deng, H., Zhang, W., Sarpola, M., Yuen, J., Paasch, R., ... & Shapiro, L. G. (2008). Automated insect identification through concatenated histograms of local appearance features: feature vector generation and region detection for deformable objects. Machine Vision and Applications, 19(2).

2: Sarpola, M. J., Paasch, R. K., Mortensen, E. N., Dietterich, T. G., Lytle, D. A., Moldenke, A. R., & Shapiro, L. G. (2008). An aquatic insect imaging system to automate insect classification. Transactions of the ASABE, 51(6), 2217-2225.

3: Wang, J., Lin, C., Ji, L., & Liang, A. (2012). A new automatic identification system of insect images at the order level. Knowledge-Based Systems, 33, 102-110.

4: Yang, H. P., Ma, C. S., Wen, H., Zhan, Q. B., & Wang, X. L. (2015). A tool for developing an automatic insect identification system based on wing outlines. Scientific Reports, 5.

5: Additional Topic Briefing, November 30, 2016 (uploaded in SITIS on 11/30/16).

 

KEYWORDS: Automated Visual Identification, Insect Identification, Plant Identification, Medical Entomology, Pattern Recognition, Machine Learning, Bioinformatics 
