Standoff Detection of Hidden Objects and Personnel In and Around Foliage

Description:

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Advanced Computing and Software, Integrated Sensing and Cyber, Integrated Network Systems-of-Systems, Human-Machine Interfaces

 

OBJECTIVE: Autonomous standoff detection of hidden objects and personnel in or around foliage at 50 feet to 250 feet.

 

DESCRIPTION: This Topic seeks technology capabilities to autonomously detect hidden objects and personnel in and around foliage and roadsides at standoff distances of 50 to 250 feet and provide a warning.  Current commercial screening technologies include millimeter wave, terahertz sensors, magnetometers, x-rays, and in some cases neutron scattering.  These technologies are effective at detecting target objects but are designed for near-field detection (inches to feet).  There is interest in detecting and tracking target objects at standoff distances of 50 to 250 feet for "agile nodes" such as expeditionary airfields, survivable command and control, and agile support.  The purpose of autonomy is to facilitate maneuver, enhance force protection, and reduce the cognitive and training burden on operators.  Autonomous detection and alarming reduce cognitive burden on operators by preventing screen fatigue and highlighting suspicious objects in a scene.  Autonomous software can reduce training demands by supporting and assisting the operator during system start-up and operation and by suggesting courses of action in response to a given alarm.  Autonomy also enhances Force Protection by allowing the operator to be located at a distance greater than 300 to 450 feet; the operator does not have to stay next to the sensor to see information and alarms.  This Topic does not include leave-behind components such as point and vibration sensors.

 

PHASE I: Demonstrate detection of metal and plastic objects of varying sizes and shapes, approximately the size of soup cans, gallon paint cans, and small manhole covers, as well as personnel, at distances of 50, 100, 150, 200, and 250 feet measured along the ground from the sensor position to the target.  If the sensor is positioned in the air or on a post (for example, 30 feet up), drop a line to the ground to establish the starting point.  The objects and personnel should be placed in and around different types of foliage: spring, summer, and fall brush; roadside brush; and trees.  Collect sufficient target data to develop and demonstrate feasibility of target object detection, classification, and tracking using machine learning, artificial intelligence, and signal-processing innovations.  Develop and deliver a sensor design that can be used to build a Phase II experimental prototype sensor operable in a field experiment by a Government scientist, engineer, and Soldier.  False alarms should be considered.  Develop an approach that can be used to characterize system performance for detection and false alarms.  One example, but not the only option, would be a randomized or semi-randomized experimental design and test matrix that can be executed within the budget boundaries of Phase I and that provides data sufficient for preliminary, limited receiver operating characteristic (ROC) curve(s) demonstrating feasibility of the sensor design concept.  The purpose is to start thinking about false alarm states and mitigation.  The Phase I deliverable should include both the sensor design and the experimental data that supports the design and mitigates Phase II risk.  Offerings of market surveys with later down-selection will be considered non-responsive.
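For offerors new to ROC analysis, the performance characterization asked for above can be sketched in a few lines: sweep a detection threshold over labeled trial outcomes and record the true-positive rate (probability of detection) against the false-positive rate (false-alarm rate) at each setting. The scores and labels below are illustrative placeholders, not real sensor data, and the helper names are the author's own, not part of any required toolchain.

```python
def roc_curve(scores, labels):
    """Empirical ROC from field-trial results.

    scores -- detector confidence per trial (higher = more target-like)
    labels -- 1 if a target object was actually present, else 0
    Returns (fpr, tpr) point lists traced from threshold high to low.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    # Sort trials by descending score; lowering the threshold admits
    # one more trial at a time.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    fpr, tpr = [0.0], [0.0]
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        fpr.append(fp / neg)
        tpr.append(tp / pos)
    return fpr, tpr

# Illustrative trials: 4 target-present, 4 target-absent.
scores = [0.9, 0.8, 0.7, 0.3, 0.6, 0.4, 0.2, 0.1]
labels = [1,   1,   1,   1,   0,   0,   0,   0]
fpr, tpr = roc_curve(scores, labels)

# Area under the curve (trapezoid rule) as a single summary number.
auc = sum((fpr[i + 1] - fpr[i]) * (tpr[i + 1] + tpr[i]) / 2
          for i in range(len(fpr) - 1))
```

A randomized test matrix simply determines which (distance, foliage type, object) combinations generate the labeled trials fed into such an analysis; the curve itself is what lets the Government compare detection probability against false-alarm rate across candidate designs.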

 

PHASE II: Build and demonstrate a smart prototype sensor, based on the design and algorithms developed in Phase I, that can be operated by Government scientists, engineers, and Soldiers for the purpose of participating in an Army Expeditionary Warrior Experiment (AEWE) or equivalent user experiment.  Collect target data in sufficient quantity to develop and demonstrate machine learning and artificial intelligence that scan, detect, classify, locate, and track target objects and personnel, such that receiver operating characteristic (ROC) curves or similar statistical analysis can be developed to characterize system performance.  The Phase II smart sensor should issue a visual alarm on a screen that an operator can see.  The screen may be either a monitor attached to the sensor or a remote screen; one example, but not the only option, would be a cell phone.  The prototype should demonstrate covert autonomous standoff detection from an agile node, at 50 to 250 feet, of a variety of metal shapes and personnel in and around foliage.  Examples of an agile node include expeditionary airfields, survivable command and control, and covert agile support.  The prototype should demonstrate preliminary feasibility for operation on the move from a vehicle traveling 1 to 20 miles per hour.  Using multiple sensors to scan the surrounding area is acceptable.  Innovations in machine learning and artificial intelligence may be used to scan, detect, classify, locate, and track target objects and personnel.  The Phase II deliverable should be a prototype demonstration in the contractor’s facilities and a Warfighter experiment such as an AEWE or equivalent.  The Phase II prototype sensor should be delivered “in place” to the Government, meaning the prototype remains in the Contractor’s facility but is accessible for future work by the Government.

 

PHASE III DUAL USE APPLICATIONS: Further research and development during Phase III efforts will be directed toward refining the final deployable equipment and procedures.  Design modifications based on results from tests conducted during Phase III will be incorporated into the system.  Manufacturability specific to Counter-Improvised Explosive Device (C-IED) Program Concepts of Operations (CONOPS) and end-user requirements will be examined.

 

REFERENCES:

  1. David A. Andrews, Stuart William Harmer, Nicholas J. Bowring, Nacer D. Rezgui, and Matthew J. Southgate. "Active millimeter wave sensor for standoff concealed threat detection." IEEE Sensors Journal 13, no. 12 (2013): 4948-4954.
  2. Zhongmin Wang, Tianying Chang, and Hong-Liang Cui. "Review of active millimeter wave imaging techniques for personnel security screening." IEEE Access 7 (2019): 148336-148350.
  3. Boris Kapilevich and Moshe Einat. "Detecting hidden objects on human body using active millimeter wave sensor." IEEE Sensors Journal 10, no. 11 (2010): 1746-1752.
  4. Federico García-Rial, Daniel Montesano, Ignacio Gómez, Carlos Callejero, Francis Bazus, and Jesús Grajal. "Combining commercially available active and passive sensors into a millimeter-wave imager for concealed weapon detection." IEEE Transactions on Microwave Theory and Techniques 67, no. 3 (2018): 1167-1183.
  5. Bram van Berlo, Amany Elkelany, Tanir Ozcelebi, and Nirvana Meratnia. "Millimeter wave sensing: A review of application pipelines and building blocks." IEEE Sensors Journal 21, no. 9 (2021): 10332-10368.
  6. Roger Appleby, Duncan A. Robertson, and David Wikner. "Millimeter wave imaging: a historical review." In Passive and Active Millimeter-Wave Imaging XX, vol. 10189, p. 1018902. SPIE, 2017.
  7. Ting Liu, Yao Zhao, Yunchao Wei, Yufeng Zhao, and Shikui Wei. "Concealed object detection for activate millimeter wave image." IEEE Transactions on Industrial Electronics 66, no. 12 (2019): 9909-9917.
  8. Jeffrey A. Nanzer. Microwave and Millimeter-Wave Remote Sensing for Security Applications. Artech House, 2012; Boris Y. Kapilevich, Stuart W. Harmer, and Nicholas J. Bowring. Non-Imaging Microwave and Millimetre-Wave Sensors for Concealed Object Detection. CRC Press, 2014.
  9. A. Huizing, M. Heiligers, B. Dekker, J. de Wit, L. Cifola, and R. Harmanny. "Deep Learning for Classification of Mini-UAVs Using Micro-Doppler Spectrograms in Cognitive Radar." IEEE Aerospace and Electronic Systems Magazine, vol. 34, no. 11, pp. 46-56, Nov. 2019, doi: 10.1109/MAES.2019.2933972.
  10. Abhishek Gupta, Alagan Anpalagan, Ling Guan, and Ahmed Shaharyar Khwaja. "Deep learning for object detection and scene perception in self-driving cars: Survey, challenges, and open issues." Array, Volume

 

KEYWORDS: standoff detection 50-250 feet, open unstructured environment, moving targets, millimeter wave, autonomous identification and tracking of hidden threats and personnel
