Combined Electro-Optics/Infrared and Radar Sensor System for Detect and Avoid of Non-Cooperative Traffic for Small Unmanned Aerial Systems

Description:

RT&L FOCUS AREA(S): General Warfighting Requirements

TECHNOLOGY AREA(S): Air Platforms; Electronics

OBJECTIVE: Develop dual-sensor, electro-optics/infrared (EO/IR) and radar, non-cooperative traffic sensor concepts that provide sufficient performance and balanced size, weight, power, and cost (SWaP-C) for small unmanned aerial systems (sUAS), where sufficient performance is unachievable by any single-sensor concept.

DESCRIPTION: Detect and avoid (DAA) cooperative sensors developed for manned aircraft, for example, the Traffic Collision Avoidance System (TCAS) and Automatic Dependent Surveillance-Broadcast (ADS-B), are nondevelopmental, off-the-shelf items. DAA non-cooperative sensor subsystems, which take the place of a pilot’s eyes, are a new construct whose role and employment have not been previously defined. Airborne Collision Avoidance System Xu (ACAS Xu) is a new DAA technology being developed by the Federal Aviation Administration (FAA) that processes inputs from both cooperative and non-cooperative sensors and provides alerts to the UAS operator to Remain Well Clear (RWC); in the future it will provide automatic maneuvers. Radar is the only non-cooperative DAA sensor currently being procured by the Navy, with Radio Technical Commission for Aeronautics (RTCA) DO-366 defining radar’s Minimum Operational Performance Standards (MOPS) in the National Airspace System (NAS). No other non-cooperative sensor has a MOPS. Radar development and production costs are high and depend on the sensor’s assigned role and the associated performance requirements. As such, a complete assessment of SWaP-C must be included in the establishment of safety requirements.

EO/IR sensors are a desired alternative because of their potentially lower SWaP-C. They are currently being considered for non-cooperative traffic surveillance as part of RTCA Special Committee 228; however, they have performance challenges in low-visibility conditions and difficulty estimating the range and range-rate measurements that are essential for projecting Closest Point of Approach (CPA) and Time of CPA (TCPA). Both civilian authorities (e.g., the FAA) and the Navy are interested in a dual-sensor EO/IR and radar non-cooperative traffic sensor that provides sufficient performance at lower SWaP-C. A camera alone is neither sufficient nor suitable for integration with ACAS Xu because of these shortcomings, and a radar capable of doing the job alone would not fit on board an sUAS. What is desired is a lower-performing radar that provides suitable range and bearing information, combined with an EO/IR sensor, within the stringent SWaP-C limitations of sUAS. All airborne hardware should weigh less than 3 lb (1.36 kg) (Threshold) and 12 oz (340.2 g) (Objective); consume less than 64 in.³ (0.00105 m³) (Threshold) and 27 in.³ (0.000442 m³) (Objective) of total space; and draw less than 50 W average power (Threshold) and 25 W average (Objective).
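For context, CPA and TCPA are typically projected from the intruder’s relative position and velocity under a constant-velocity assumption, which is why range and range rate are essential measurements. The Python sketch below illustrates the geometry; the function name and input conventions are illustrative assumptions, not part of any referenced standard.

```python
import numpy as np

def cpa_tcpa(rel_pos, rel_vel):
    """Project closest point of approach under a constant-velocity assumption.

    rel_pos: intruder position minus ownship position, meters, shape (3,).
    rel_vel: intruder velocity minus ownship velocity, m/s, shape (3,).
    Returns (tcpa_s, cpa_m): time to CPA in seconds, miss distance in meters.
    """
    rel_pos = np.asarray(rel_pos, dtype=float)
    rel_vel = np.asarray(rel_vel, dtype=float)
    speed_sq = rel_vel @ rel_vel
    if speed_sq < 1e-9:  # negligible relative motion: range is constant
        return 0.0, float(np.linalg.norm(rel_pos))
    tcpa = max(0.0, -(rel_pos @ rel_vel) / speed_sq)  # clamp if CPA is in the past
    cpa = float(np.linalg.norm(rel_pos + tcpa * rel_vel))
    return tcpa, cpa

# Example: intruder 2 km ahead with 100 m lateral offset, 80 m/s closure rate
print(cpa_tcpa([2000.0, 100.0, 0.0], [-80.0, 0.0, 0.0]))  # ~(25.0 s, 100.0 m)
```

An angle-only sensor cannot form the relative position and velocity vectors without range, which is precisely the gap the radar leg of the dual-sensor system is intended to fill.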

Critical evaluation criteria include the ability to provide sufficient tracking range and accuracy for an RQ-7 Shadow or RQ-21 Blackjack to avoid midair collisions and near midair collisions with other aircraft such as a Lancair Evolution, Cessna TTx, or Cessna 150. In general, radars provide highly accurate range and range-rate information, but their angular resolution is inferior to that of EO/IR sensors. A dual-sensor system for sUAS must operate in lower-altitude (<10,000 ft), overland environments, which challenge radar systems because slow-speed traffic may not separate well from clutter and other sources of false alarms. Likewise, EO/IR systems suffer their own false-alarm problems, and their performance is highly dependent on atmospheric conditions. An effective dual-sensor system must detect and track targets across a range of atmospheric conditions, manage false alarms and clutter effects, and provide accuracy high enough to predict and avoid collisions. Such a system should consider multisensor data fusion approaches, multiband imaging for all-weather operation, algorithms for mitigating false alarms and enhancing detection, sensor resource management (SRM), and feature-aided target characterization and tracking.
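One measurement-level fusion pattern consistent with these complementary accuracies pairs the radar’s precise range with the EO/IR sensor’s precise angles to form a single Cartesian position fix whose covariance reflects each sensor’s strength. The sketch below is a minimal illustration under assumed noise values (sigma_r, sigma_ang) and an assumed frame convention; a fielded system would additionally handle sensor registration, time alignment, and track-level fusion.

```python
import numpy as np

def fuse_range_bearing(r, az, el, sigma_r=5.0, sigma_ang=1e-3):
    """Combine radar range (m) with EO/IR azimuth/elevation (rad) into a
    Cartesian position fix and a first-order (linearized) covariance.

    sigma_r:   assumed radar range standard deviation, meters.
    sigma_ang: assumed EO/IR angular standard deviation, radians.
    """
    # Spherical-to-Cartesian (x east, y north, z up; az from north, el above horizon)
    ce, se = np.cos(el), np.sin(el)
    ca, sa = np.cos(az), np.sin(az)
    pos = r * np.array([ce * sa, ce * ca, se])

    # Jacobian of pos with respect to (r, az, el) for error propagation
    J = np.array([
        [ce * sa,  r * ce * ca, -r * se * sa],
        [ce * ca, -r * ce * sa, -r * se * ca],
        [se,       0.0,          r * ce],
    ])
    R = np.diag([sigma_r**2, sigma_ang**2, sigma_ang**2])
    cov = J @ R @ J.T  # tight in angle (EO/IR), tight in range (radar)
    return pos, cov
```

The resulting covariance is small along the line of sight (radar range) and small across it (EO/IR angles), which is the practical payoff of the dual-sensor pairing.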

PHASE I: Design, develop, and demonstrate feasibility of dual-sensor detection, tracking, and false-alarm mitigation algorithms for expected operational environments and conditions. The Phase I effort will include prototype plans to be developed under Phase II.
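As one example of the kind of false-alarm mitigation a Phase I demonstration might include, an M-of-N confirmation rule promotes a candidate detection to a track only after M hits in the last N frames, suppressing single-frame clutter spikes. The class name and threshold values below are illustrative assumptions, not a prescribed design.

```python
from collections import deque

class MOfNConfirmer:
    """Confirm a candidate track only after m detections in the last n frames."""

    def __init__(self, m=3, n=5):
        self.m = m
        self.history = deque(maxlen=n)  # rolling window of hit/miss flags

    def update(self, detected):
        """Feed one frame's hit/miss flag; return True once the track confirms."""
        self.history.append(bool(detected))
        return sum(self.history) >= self.m

# Example: a one-frame clutter spike never confirms; a persistent target does.
c = MOfNConfirmer(m=3, n=5)
print([c.update(h) for h in [1, 0, 0, 0, 0, 1, 1, 1]])
# [False, False, False, False, False, False, False, True]
```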

PHASE II: Based on Phase I results, mature the candidate concept(s) through more detailed, high-fidelity analyses and further development of the dual-sensor detection, tracking, and false-alarm mitigation algorithms for expected operational environments and conditions. Examine sensor-integration concepts suitable for candidate sUAS. Assess the hardware, software, and firmware impacts of accommodating the dual-sensor system on board candidate sUAS. Identify critical technical challenges and perform the analysis and, as required, experimentation needed to understand the associated risk. The Phase II deliverable must provide a dual-sensor concept in sufficient detail to support fabrication of a prototype demonstrator system.

PHASE III DUAL USE APPLICATIONS: Complete development, perform final testing, and integrate and transition the final solution to Navy airborne platforms. The dual-sensor system is also suitable for use on commercial small unmanned aircraft.

REFERENCES:

  1. Geyer, C.; Singh, S. and Chamberlain, L. “Avoiding collisions between aircraft: State of the art and requirements for UAVs operating in civilian airspace.” Carnegie Mellon University, January 2008. https://www.ri.cmu.edu/pub_files/2008/3/CMU-RI-TR-08-03.pdf
  2. van der Horst, R. and Hogema, J. “Time-to-Collision and Collision Avoidance Systems.” Proceedings of the 6th Workshop of ICTCT “Pedestrian Problems,” Prague, Czech Republic, October 26-28, 1994. https://www.ictct.net/wp-content/uploads/07-Prague-1994/07-Proceedings.pdf
  3. Lai, J.; Ford, J.J.; Mejias, L. and O’Shea, P. “Characterization of sky-region morphological-temporal airborne collision detection.” Journal of Field Robotics, Volume 30, Issue 2, March 2013, pp. 171-193. https://doi.org/10.1002/rob.21443
  4. Hottel, H.C. “A simple model for estimating the transmittance of direct solar radiation through clear atmospheres.” Solar Energy, Volume 18, Issue 2, 1976, pp. 129-134. https://doi.org/10.1016/0038-092X(76)90045-1
  5. Wang, K.; Dickinson, R. E. and Liang, S. “Clear sky visibility has decreased over land globally from 1973 to 2007.” Science, Volume 323, Issue 5920, March 13, 2009, pp. 1468-1470. https://doi.org/10.1126/science.1167549
  6. Chen, C.C. “Attenuation of electromagnetic radiation by haze, fog, clouds, and rain (Report R-1694-PR).” RAND Corporation, April 1975. https://www.rand.org/content/dam/rand/pubs/reports/2006/R1694.pdf
  7. Jaruwatanadilok, S.; Ishimaru, A. and Kuga, Y. “Optical imaging through clouds and fog.” IEEE Transactions on Geoscience and Remote Sensing, Volume 41, Issue 8, August 18, 2003, pp. 1834-1843. https://doi.org/10.1109/TGRS.2003.813845
  8. Krapels, K.A.; Driggers, R.G.; Vollmerhausen, R.H.; Kopeika, N.S. and Halford, C.E. “Atmospheric turbulence modulation transfer function for infrared target acquisition modeling.” Optical Engineering, Volume 40, Issue 9. September 2001. https://doi.org/10.1117/1.1390299
  9. Kokhanovsky, A. “Optical properties of terrestrial clouds.” Earth-Science Reviews, Volume 64, Issue 3, February 2004, pp. 189-241. https://doi.org/10.1016/S0012-8252(03)00042-4
  10. Pizer, S.M.; Amburn, E.P.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Greer, T.; ter Haar Romeny, B.; Zimmerman, J.B. and Zuiderveld, K. “Adaptive histogram equalization and its variations.” Computer Vision, Graphics, and Image Processing, Volume 39, Issue 3, September 1987, pp. 355-368. https://doi.org/10.1016/S0734-189X(87)80186-X
  11. Narasimhan, S.G. and Nayar, S.K. “Vision and the Atmosphere.” International Journal of Computer Vision, Volume 48, Issue 3, July 2003, pp. 233-254. https://doi.org/10.1023/A:1016328200723
  12. Schechner, Y.Y.; Narasimhan, S.G. and Nayar, S.K. “Polarization-based vision through haze.” Applied Optics, Volume 42, Issue 3, 2003, pp. 511-525. https://doi.org/10.1364/ao.42.000511
  13. Casasent, D. and Ye, A. “Detection filters and algorithm fusion for ATR.” IEEE Transactions on Image Processing, Volume 6, Issue 1, 1997, pp. 114-125. https://doi.org/10.1109/83.552101
  14. Fasano, G.; Accardo, D.; Tirri, A.E.; Moccia, A. and De Lellis, V. “Morphological filtering and target tracking for vision-based UAS sense and avoid.” Conference session in Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, May 27-30, 2014, pp. 430-440. https://doi.org/10.1109/ICUAS.2014.6842283
  15. Qi, S.; Ma, J.; Tao, C.; Yang, C. and Tian, V. “A robust directional saliency-based method for infrared small-target detection under various complex backgrounds.” IEEE Geoscience and Remote Sensing Letters, Volume 10, Issue 3, May 2013, pp. 495-499. https://doi.org/10.1109/LGRS.2012.2211094
  16. Wan, M.; Gu, G.; Cao, E.; Hu, X.; Qian, W. and Ren, K. “In-frame and inter-frame information based infrared moving small target detection under complex cloud backgrounds.” Infrared Physics & Technology, Volume 76, 2016, pp. 455-467. https://doi.org/10.1016/j.infrared.2016.04.003
  17. Hu, S.; Goldman, G.H. and Borel-Donohue, C.C. “Detection of unmanned aerial vehicles using a visible camera system.” Applied Optics, Volume 56, Issue 3, 2017, pp. B214–B221. https://doi.org/10.1364/AO.56.00B214
  18. Cheraghi, S.A. and Sheikh, U.U. “Moving object detection using image registration for a moving camera platform.” Paper presentation in 2012 IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, November 23-25, 2012. https://doi.org/10.1109/ICCSCE.2012.6487170
  19. Lefébure, M. and Cohen, L.D. “Image Registration, Optical Flow and Local Rigidity.” Journal of Mathematical Imaging and Vision, Volume 14, Issue 2, March 2001, pp. 131-147. https://doi.org/10.1023/A:1011259231755
  20. Farneback, G. “Very high accuracy velocity estimation using orientation tensors, parametric motion, and simultaneous segmentation of the motion field.” Poster presentation in Proceedings of the Eighth IEEE International Conference on Computer Vision, Vancouver, British Columbia, Canada, Volume 1, July 7-14, 2001, pp. 171-177. https://doi.org/10.1109/ICCV.2001.937514  
  21. Kennedy, H.L. “Multidimensional digital filters for point-target detection in cluttered infrared scenes.” Journal of Electronic Imaging, Volume 23, Issue 6, 063019, December 17, 2014. https://doi.org/10.1117/1.JEI.23.6.063019
  22. Carnie, R.; Walker, R. and Corke, P. “Image processing algorithms for UAV ‘sense and avoid’.” Conference session in Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA), Orlando, FL, 2006. https://doi.org/10.1109/ROBOT.2006.1642133
  23. Jackson, J.A.; Boskovic, J. and Diel, D. “System-level Performance Analysis of a Closed-loop Sense and Avoid System for Unmanned Vehicles in the NAS.” AIAA Information Systems-AIAA Infotech @ Aerospace, Grapevine, TX, January 9-13, 2017. https://doi.org/10.2514/6.2017-0912
  24. Mahler, R. “The multisensor PHD filter: I. General solution via multitarget calculus.” Signal Processing, Sensor Fusion, and Target Recognition XVIII, Proceedings of SPIE Volume 7336, SPIE Defense, Security, and Sensing, Orlando, FL, May 11, 2009. https://doi.org/10.1117/12.818024
  25. Ristic, B.; Clark, D. and Vo, B-N. “Improved SMC implementation of the PHD filter.” 2010 13th International Conference on Information Fusion, Edinburgh, United Kingdom, July 2010, pp. 1-8. https://doi.org/10.1109/ICIF.2010.5711922
  26. Reuter, S.; Vo, B-T.; Vo, B-N. and Dietmayer, K. “The Labeled Multi-Bernoulli Filter.” IEEE Transactions on Signal Processing, Volume 62, Issue 12, May 14, 2014, pp. 3246-3260. https://doi.org/10.1109/TSP.2014.2323064
  27. Dey, D.; Geyer, C.; Singh, S. and Digioia, M. “Passive, long-range detection of aircraft: Towards a field deployable sense and avoid system.” In Howard, A. and Iagnemma, K. (Eds.), Field and Service Robotics, Springer Tracts in Advanced Robotics, Volume 62. Springer, 2010. https://doi.org/10.1007/978-3-642-13408-1_11
  28. Weinert, A.; Harkleroad, E.P.; Griffith, J.D.; Edwards, M. W. and Kochenderfer, M.J. “Uncorrelated Encounter Model of the National Airspace System, Version 2.0 (Report number ATC-404).” Massachusetts Institute of Technology Lincoln Laboratory, August 19, 2013. https://www.researchgate.net/publication/285407021_Uncorrelated_Encounter_Model_of_the_National_Airspace_System_Version_20