
Combined Electro-Optics/Infrared and Radar Sensor System for Detect and Avoid of Non-Cooperative Traffic for Small Unmanned Aerial Systems


RT&L FOCUS AREA(S): General Warfighting Requirements

TECHNOLOGY AREA(S): Air Platforms; Electronics

OBJECTIVE: Develop dual-sensor, electro-optics/infrared (EO/IR) and radar, non-cooperative, traffic sensor concepts that will provide sufficient performance and balanced size, weight, power, and cost (SWaP-C) for small unmanned aerial systems (sUAS) where sufficient performance is unachievable by any single-sensor concept.

DESCRIPTION: Cooperative detect and avoid (DAA) sensors developed for manned aircraft, such as the Traffic Collision Avoidance System (TCAS) and Automatic Dependent Surveillance-Broadcast (ADS-B), are nondevelopmental and available off the shelf. Non-cooperative DAA sensor subsystems, which take the place of a pilot’s eyes, are a new construct whose role and employment have not been previously defined. Airborne Collision Avoidance System Xu (ACAS Xu) is a new DAA technology being developed by the Federal Aviation Administration (FAA) that processes inputs from both cooperative and non-cooperative sensors and provides alerts to the UAS operator to Remain Well Clear (RWC); in the future it will provide automatic maneuvers. Radar is the only non-cooperative DAA sensor currently being procured by the Navy, with Radio Technical Commission for Aeronautics (RTCA) DO-366 addressing radar’s Minimum Operational Performance Standards (MOPS) in the National Airspace System (NAS). No other non-cooperative sensor has a MOPS. Radar development and production costs are high and depend on the sensor’s assigned role and the associated performance requirements; as such, a complete assessment of SWaP-C must be included in the establishment of safety requirements. EO/IR sensors are a desired alternative because of their potentially lower SWaP-C. They are currently being considered for non-cooperative traffic surveillance by RTCA Special Committee 228; however, they face performance challenges in low-visibility conditions and have difficulty estimating the range and range-rate measurements essential for projecting the Closest Point of Approach (CPA) and Time of CPA (TCPA). Both civilian authorities (e.g., the FAA) and the Navy are interested in a dual-sensor EO/IR and radar non-cooperative traffic sensor that provides sufficient performance at lower SWaP-C.
A camera alone is neither sufficient nor suitable for integration with ACAS Xu because of these shortcomings, and a radar capable of doing the job alone would not fit on board a sUAS. What is desired is a lower-performing radar, providing suitable range and bearing information, combined with an EO/IR sensor, such that the pair meets the stringent SWaP-C limitations of sUAS. All airborne hardware should weigh less than 3 lb (1.36 kg) (Threshold) and 12 oz (340 g) (Objective); occupy less than 64 in.³ (1,049 cm³) (Threshold) and 27 in.³ (442 cm³) (Objective) of total space; and draw less than 50 W average power (Threshold) and 25 W average power (Objective).
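The CPA and TCPA quantities referenced above are straightforward to compute once a sensor (or fused sensor pair) supplies a relative position and relative velocity, which is why range and range-rate estimation is so central to the EO/IR shortfall. A minimal constant-velocity sketch (function and variable names are illustrative, not taken from any referenced standard):

```python
import numpy as np

def cpa_tcpa(rel_pos, rel_vel):
    """CPA distance and TCPA under a constant-velocity relative-motion model.

    rel_pos: intruder position relative to ownship (m)
    rel_vel: intruder velocity relative to ownship (m/s)
    Returns (cpa_distance_m, tcpa_s).
    """
    rel_pos = np.asarray(rel_pos, dtype=float)
    rel_vel = np.asarray(rel_vel, dtype=float)
    v2 = rel_vel @ rel_vel
    if v2 == 0.0:                 # no relative motion: range never changes
        return float(np.linalg.norm(rel_pos)), 0.0
    # Time minimizing |rel_pos + t * rel_vel|; clamp at 0 if CPA is in the past
    tcpa = max(0.0, -(rel_pos @ rel_vel) / v2)
    cpa = float(np.linalg.norm(rel_pos + tcpa * rel_vel))
    return cpa, tcpa

# Example geometry: intruder 2 km ahead with a 50 m lateral offset,
# closing at 100 m/s -> TCPA 20 s, CPA 50 m
cpa, tcpa = cpa_tcpa([2000.0, 50.0], [-100.0, 0.0])
```

Note that an EO/IR sensor alone observes bearing well but constrains `rel_pos` and `rel_vel` only weakly along the line of sight, which is exactly the gap the radar range/range-rate channel fills.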

Critical evaluation criteria include the ability to provide sufficient tracking range and accuracy for an RQ-7 Shadow or RQ-21 Blackjack to avoid midair collisions and near midair collisions with other aircraft such as a Lancair Evolution, Cessna TTx, or Cessna 150. In general, radars provide highly accurate range and range-rate information, but their angular resolution is inferior to that of EO/IR sensors. A dual-sensor system for sUAS must operate in lower-altitude (<10,000 ft), overland environments, which challenge radar systems because slow-speed traffic may not separate well from clutter and sources of false alarms. EO/IR systems likewise suffer their own false-alarm problems, and their performance is highly dependent on atmospheric conditions. An effective dual-sensor system must detect and track targets in a range of atmospheric conditions, manage false alarms and clutter effects, and provide accuracy high enough to predict and avoid collisions. Such a system must consider multisensor data fusion approaches, multiband imaging for all-weather operation, algorithms for mitigating false alarms and enhancing detection, sensor resource management (SRM), and feature-aided target characterization and tracking.
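The complementary-accuracy point above (radar: precise range; EO/IR: precise bearing) can be illustrated with a simplified 2-D fusion of one range and one bearing measurement into a Cartesian position estimate whose covariance reflects each sensor's strength. This is only a sketch under assumed independent Gaussian errors and a first-order linearization; the function name and interface are illustrative:

```python
import numpy as np

def fuse_range_bearing(radar_range, sigma_r, eo_bearing, sigma_b):
    """Combine a radar range (m, std-dev sigma_r) with an EO/IR bearing
    (rad, std-dev sigma_b) into a 2-D Cartesian position and covariance.

    Uses the Jacobian of the polar-to-Cartesian map to propagate the
    per-sensor measurement variances (unscented or particle methods would
    be used in practice for large bearing uncertainty).
    """
    c, s = np.cos(eo_bearing), np.sin(eo_bearing)
    pos = np.array([radar_range * c, radar_range * s])
    # Jacobian of (r, b) -> (x, y)
    J = np.array([[c, -radar_range * s],
                  [s,  radar_range * c]])
    R_polar = np.diag([sigma_r**2, sigma_b**2])
    P = J @ R_polar @ J.T   # Cartesian measurement covariance
    return pos, P

# Target at 1 km, dead ahead: 5 m range error, 1 mrad bearing error
pos, P = fuse_range_bearing(1000.0, 5.0, 0.0, 0.001)
```

In the example, the cross-range uncertainty is sigma_b scaled by range (about 1 m at 1 km for 1 mrad), while the down-range uncertainty stays at the radar's 5 m, showing why the pair outperforms either sensor alone; a tracker such as the PHD or labeled multi-Bernoulli filters in the references would consume these fused measurements.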

PHASE I: Design, develop, and demonstrate feasibility of dual-sensor detection, tracking, and false-alarm mitigation algorithms for expected operational environments and conditions. The Phase I effort will include prototype plans to be developed under Phase II.

PHASE II: Based on Phase I results, mature the candidate concept(s) through more detailed, high-fidelity analyses and further development of the dual-sensor detection, tracking, and false-alarm mitigation algorithms for expected operational environments and conditions. Examine sensor-integration concepts suitable for candidate sUAS. Assess hardware, software, and firmware impacts of accommodating the dual-sensor system onboard candidate sUAS. Identify critical technical challenges and perform the necessary analysis and, as required, experimentation to understand the associated risk. The Phase II deliverable must provide a dual-sensor concept of sufficient detail to support the fabrication of a prototype demonstrator system.

PHASE III DUAL USE APPLICATIONS: Complete development, perform final testing, integrate, and transition the final solution to Navy airborne platforms. The dual-sensor system is also suitable for use on commercial small unmanned aircraft.

REFERENCES:

  1. Geyer, C.; Singh, S. and Chamberlain, L. “Avoiding collisions between aircraft: State of the art and requirements for UAVs operating in civilian airspace.” Carnegie Mellon University, January 2008.
  2. van der Horst, R. and Hogema, J. “Time-to-Collision and Collision Avoidance Systems” Proceedings of the 6th Workshop of ICTCT “Pedestrian Problems”. Prague, Czech Republic, October 26-28, 1994.
  3. Lai, J.; Ford, J.J.; Mejias, L. and O’Shea, P. “Characterization of sky-region morphological-temporal airborne collision detection.” Journal of Field Robotics, Volume 30, Issue 2, March 2013, pp. 171-193.
  4. Hottel, H.C. “A simple model for estimating the transmittance of direct solar radiation through clear atmospheres.” Solar Energy, Volume 18, Issue 2, 1976, pp. 129-134.
  5. Wang, K.; Dickinson, R. E. and Liang, S. “Clear sky visibility has decreased over land globally from 1973 to 2007.” Science, Volume 323, Issue 5920, March 13, 2009, pp. 1468-1470.
  6. Chen, C.C. “R-1694-PR Attenuation of electromagnetic radiation by haze, fog, clouds, and rain.” Rand, April 1975.  
  7. Jaruwatanadilok, S.; Ishimaru, A. and Kuga, Y. “Optical imaging through clouds and fog.” IEEE Transactions on Geoscience and Remote Sensing, Volume 41, Issue 8, August 18, 2003, pp. 1834-1843.
  8. Krapels, K.A.; Driggers, R.G.; Vollmerhausen, R.H.; Kopeika, N.S. and Halford, C.E. “Atmospheric turbulence modulation transfer function for infrared target acquisition modeling.” Optical Engineering, Volume 40, Issue 9. September 2001.
  9. Kokhanovsky, A. “Optical properties of terrestrial clouds”. Earth-Science Reviews, Volume 64, Issue 3, February 2004, pp. 189-241.
  10. Pizer, S.M.; Amburn, E.P.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Greer, T.; ter Haar Romeny, B.; Zimmerman, J.B. and Zuiderveld, K. “Adaptive histogram equalization and its variations.” Computer Vision, Graphics, and Image Processing, Volume 39, Issue 3, September 1987, pp. 355-368.
  11. Narasimhan, S.G. and Nayar, S.K. “Vision and the Atmosphere.” International Journal of Computer Vision, Volume 48, Issue 3, July 2003, pp. 233-254.
  12. Schechner, Y.Y.; Narasimhan, S.G. and Nayar, S.K. “Polarization-based vision through haze.” Applied optics, Volume 42, Issue 3, 2003, pp. 511-525.
  13. Casasent, D. and Ye, A. “Detection filters and algorithm fusion for ATR.” IEEE transactions on image processing: a publication of the IEEE Signal Processing Society, Volume 6, Issue 1, 1997, pp. 114-125.
  14. Fasano, G.; Accardo, D.; Tirri, A.E.; Moccia, A. and De Lellis, V. “Morphological filtering and target tracking for vision-based UAS sense and avoid.” Conference session in Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, May 27-30, 2014, pp. 430-440.
  15. Qi, S.; Ma, J.; Tao, C.; Yang, C. and Tian, J. “A robust directional saliency-based method for infrared small-target detection under various complex backgrounds.” IEEE Geoscience and Remote Sensing Letters, Volume 10, Issue 3, May 2013, pp. 495-499.
  16. Wan, M.; Gu, G.; Cao, E.; Hu, X.; Qian, W. and Ren, K. “In-frame and inter-frame information based infrared moving small target detection under complex cloud backgrounds.” Infrared Physics & Technology, Volume 76, pp. 455-467.
  17. Hu, S.; Goldman, G.H. and Borel-Donohue, C.C. “Detection of unmanned aerial vehicles using a visible camera system.” Applied Optics, Volume 56, Issue 3, 2017, pp. B214–B221.
  18. Cheraghi, S.A. and Sheikh, U.U. “Moving object detection using image registration for a moving camera platform.” Paper presentation in 2012 IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, November 23-25, 2012.
  19. Lefébure, M. and Cohen, L.D. “Image Registration, Optical Flow and Local Rigidity.” Journal of Mathematical Imaging and Vision, Volume 14, Issue 2, March 2001, pp. 131-147.
  20. Farneback, G. “Very high accuracy velocity estimation using orientation tensors, parametric motion, and simultaneous segmentation of the motion field.” Poster presentation in Proceedings of the Eighth IEEE International Conference on Computer Vision, Vancouver, British Columbia, Canada, Volume 1, July 7-14, 2001, pp. 171-177.  
  21. Kennedy, H.L. “Multidimensional digital filters for point-target detection in cluttered infrared scenes.” Journal of Electronic Imaging, Volume 23, Issue 6, 063019, December 17, 2014.
  22. Carnie, R.; Walker, R. and Corke, P. “Image processing algorithms for UAV “sense and avoid”.” Conference session in Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA) 2006, Orlando, FL.
  23. Jackson, J.A.; Boskovic, J. and Diel, D. “System-level Performance Analysis of a Closed-loop Sense and Avoid System for Unmanned Vehicles in the NAS.” AIAA Information Systems-AIAA Infotech @ Aerospace, Grapevine, TX, January 9-13, 2017.
  24. Mahler, R. “The multisensor PHD filter: I. General solution via multitarget calculus.” Signal Processing, Sensor Fusion, and Target Recognition XVIII, 7336. SPIE Defense, Security, and Sensing, Orlando, FL, May 11, 2009.  
  25. Ristic, B.; Clark, D. and Vo, B-N. “Improved SMC implementation of the PHD filter.” Proceedings of the 13th International Conference on Information Fusion, Edinburgh, United Kingdom, 2010, pp. 1-8.
  26. Reuter, S.; Vo, B-T.; Vo, B-N. and Dietmayer, K. “The Labeled Multi-Bernoulli Filter.” IEEE Transactions on Signal Processing, Volume 62, Issue 12, May 14, 2014, pp. 3246-3260.
  27. Dey, D.; Geyer, C.; Singh, S. and Digioia, M. “Passive, long-range detection of aircraft: Towards a field deployable sense and avoid system.” In Howard, A. and Iagnemma, K. (Eds.), Field and Service Robotics, Springer Tracts in Advanced Robotics, Volume 62. Springer, 2010.
  28. Weinert, A.; Harkleroad, E.P.; Griffith, J.D.; Edwards, M. W. and Kochenderfer, M.J. “Uncorrelated Encounter Model of the National Airspace System, Version 2.0 (Report number ATC-404).” Massachusetts Institute of Technology Lincoln Laboratory, August 19, 2013.