
Visible Electro-Optical (EO) System and LIDAR Fusion for Low Cost Perception by Autonomous Ground Vehicles

Description:

OBJECTIVE: Develop a low-cost perception/classification system for the joint exploitation of LIDAR and passive multi-spectral data obtained across the visible spectrum, employing self-calibrating algorithms, for use in autonomous ground vehicles.

DESCRIPTION: Unmanned Ground Vehicles (UGVs) are an important part of the Navy's ongoing technology strategy. The developing autonomy capabilities of today's UGVs are driving requirements for sophisticated sensing, which may include coordinated sensors. Passive electro-optical visible (EO) image-based sensing provides a detailed picture of the environment, but that picture can be difficult to interpret in terms of local terrain structure. Spatially sparse range information can substantially aid interpretation. Range information can be inferred from the visible EO scene by stereoscopic or algorithmic methods. Alternatively, LIDAR sensors can provide a point cloud of distance and intensity information in real time, creating the ranging information needed to augment the EO-derived scene. While LIDAR, multi-spectral visible sensors, and their associated perception algorithms are maturing, methods for fusing information from these sensors, and for their self-calibration, are generally less well developed. An integrated approach to fusion demands precise calibration to recover both the internal characteristics of the sensors (intrinsic parameters) and the position and orientation of the sensors with respect to the overall system (extrinsic parameters). The resulting calibration procedures, using state-of-the-art methods, are complicated and time consuming. Manual calibration significantly complicates the fielding and maintenance of UGVs, and failure to maintain calibration in deployed systems can lead to poor performance in the field.

This topic seeks to develop methods that fuse the features extracted by a low-cost visible multi-spectral (fewer than 10 bands) EO system with LIDAR-derived sparse point clouds of distance and intensity information. A fusion algorithm that reduces the required density of the LIDAR point cloud is expected to lower the cost of the combined EO/LIDAR perception system. This topic also seeks to reduce dependence on in-the-field calibration. Sensor modules must be field-replaceable without requiring explicit calibration, and must tolerate prolonged use under harsh conditions without requiring explicit recalibration. When vibration, mechanical damage, or maintenance procedures change the sensor calibration, the system must be "self-healing" so that the loss of calibration is corrected without intervention from military personnel.

PHASE I: Design a concept for a low-cost (<$30,000 at a 1,000-unit annual production rate) modular, self-calibrating fused LIDAR/EO perception system for an autonomous ground vehicle. The system shall provide perception of the environment sufficient for an autonomous vehicle to perform mission-level adaptation in response to real-world contingencies, across multiple terrain types and environments, without human intervention. The architecture shall strive to minimize the combined cost of the perception suite through optimal allocation of requirements and exploitation of the benefits of a true data fusion scheme.
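
The intrinsic and extrinsic parameters described above are what tie a LIDAR point cloud to pixel locations in the EO image. The following is a minimal sketch of that relationship, assuming a simple pinhole camera model with no lens distortion; the function name, variable names, and calibration values are illustrative only and are not part of the topic.

    import numpy as np

    def project_lidar_to_image(points_lidar, K, R, t):
        """Project Nx3 LIDAR points into pixel coordinates.

        K    : 3x3 intrinsic matrix (focal lengths, principal point).
        R, t : extrinsic rotation (3x3) and translation (3,) taking LIDAR
               coordinates into the camera frame: X_cam = R @ X_lidar + t.
        Returns pixel coordinates and depths for points in front of the camera.
        """
        pts_cam = points_lidar @ R.T + t
        pts_cam = pts_cam[pts_cam[:, 2] > 0.0]   # keep points in front of camera
        pix_h = pts_cam @ K.T                    # homogeneous pixel coordinates
        pix = pix_h[:, :2] / pix_h[:, 2:3]       # perspective divide by depth
        return pix, pts_cam[:, 2]

    # Illustrative calibration values only.
    K = np.array([[500.0,   0.0, 320.0],
                  [  0.0, 500.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                    # identity rotation for simplicity
    t = np.array([0.0, -0.1, 0.2])   # assumed small lever arm between sensors
    cloud = np.array([[1.0, 0.5, 5.0], [-0.5, 0.2, 8.0]])
    pix, depth = project_lidar_to_image(cloud, K, R, t)

Self-calibration, in these terms, amounts to recovering K, R, and t (or corrections to them) from the data itself rather than from a manual procedure.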
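
One way a fusion algorithm can reduce the required LIDAR point density is to densify the sparse depth samples using the EO image as a guide, so that interpolated depth edges follow image edges. Below is a minimal joint-bilateral-style sketch of that idea, assuming a grayscale guide image with intensities in [0, 1]; all names and parameter values are illustrative assumptions, not a prescribed approach.

    import numpy as np

    def densify_sparse_depth(gray, sparse_uv, sparse_depth,
                             sigma_s=8.0, sigma_i=0.1, radius=12):
        """Splat sparse LIDAR depths into a dense map, guided by the image.

        Each sparse sample (u, v) = (column, row) contributes to nearby
        pixels with a weight combining spatial proximity and intensity
        similarity, so depth tends not to bleed across image edges.
        """
        h, w = gray.shape
        dense = np.zeros((h, w))
        weight = np.zeros((h, w))
        for (u, v), d in zip(sparse_uv, sparse_depth):
            u0, v0 = int(round(u)), int(round(v))
            if not (0 <= u0 < w and 0 <= v0 < h):
                continue                          # sample falls outside the image
            y0, y1 = max(0, v0 - radius), min(h, v0 + radius + 1)
            x0, x1 = max(0, u0 - radius), min(w, u0 + radius + 1)
            ys, xs = np.mgrid[y0:y1, x0:x1]
            w_s = np.exp(-((xs - u)**2 + (ys - v)**2) / (2.0 * sigma_s**2))
            w_i = np.exp(-(gray[y0:y1, x0:x1] - gray[v0, u0])**2
                         / (2.0 * sigma_i**2))
            dense[y0:y1, x0:x1] += w_s * w_i * d
            weight[y0:y1, x0:x1] += w_s * w_i
        valid = weight > 1e-6
        dense[valid] /= weight[valid]
        return dense, valid

The sparser the LIDAR, the more such a scheme leans on the image, which is the cost trade this topic asks offerors to exploit.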
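
The "self-healing" requirement implies detecting and correcting calibration drift without calibration targets. One common family of targetless approaches scores candidate extrinsics by how well projected LIDAR points at depth discontinuities line up with intensity edges in the image. The sketch below illustrates the idea with a greedy search over translation only; it is an assumption-laden illustration, and a fielded system would optimize all six extrinsic degrees of freedom and validate against logged data.

    import numpy as np

    def alignment_score(points, K, R, t, grad_mag):
        """Sum of image-gradient magnitude at projected LIDAR edge points.

        points   : Nx3 LIDAR points pre-selected at depth discontinuities.
        grad_mag : per-pixel gradient magnitude, e.g. np.hypot(*np.gradient(gray)).
        Well-aligned extrinsics place these points on image edges, so higher is better.
        """
        pts = points @ R.T + t
        pts = pts[pts[:, 2] > 0.0]
        pix = pts @ K.T
        pix = pix[:, :2] / pix[:, 2:3]
        h, w = grad_mag.shape
        u = np.round(pix[:, 0]).astype(int)
        v = np.round(pix[:, 1]).astype(int)
        ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        return grad_mag[v[ok], u[ok]].sum()

    def refine_translation(points, K, R, t0, grad_mag, step=0.02, iters=20):
        """Greedy coordinate search over the translation; illustration only."""
        t = t0.copy()
        best = alignment_score(points, K, R, t, grad_mag)
        for _ in range(iters):
            improved = False
            for axis in range(3):
                for delta in (step, -step):
                    cand = t.copy()
                    cand[axis] += delta
                    s = alignment_score(points, K, R, cand, grad_mag)
                    if s > best:
                        best, t, improved = s, cand, True
            if not improved:
                step *= 0.5                      # shrink the search when stuck
        return t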
PHASE II: Build a demonstration sensor system based on the Phase I design, with attention to power constraints and the use of operationally appropriate embedded hardware. Demonstrate effective self-calibration on logged data, with results comparable to those obtained using manual calibration routines. Conduct static experiments to demonstrate perception performance, first in structured lab conditions and then in more complex, unstructured terrestrial environments. Conduct in-the-field testing to demonstrate self-calibration and on-the-fly recovery from changes in sensor calibration.

PHASE III: Demonstrate a robust capability (without the use of GPS) of an autonomous unmanned ground vehicle, using a "low cost" sensor suite composed of fused LIDAR and visible EO sensors, conducting a resupply mission in a militarily relevant manner while executing complex and doctrinally correct behaviors.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: Unmanned vehicles are becoming increasingly important for agriculture, mining, and other private sector applications. Many of these applications will benefit directly from robust, modular, self-calibrating sensor technology.

REFERENCES:
1. L. Matthies, A. Kelly, and G. Tharp. Obstacle detection for unmanned ground vehicles: a progress report. In G. Giralt and G. Hirzinger, editors, Robotics Research: The Seventh International Symposium (ISRR '95), pages 475-486. Springer-Verlag, 1996.
2. G. A. Blackburn. Remote sensing of forest pigments using airborne imaging spectrometer and LIDAR imagery. Remote Sensing of Environment, 82(2/3):311-321, Oct. 2002.
3. J. Neira, J. D. Tardos, J. Horn, and G. Schmidt. Fusing range and intensity images for mobile robot localization. IEEE Transactions on Robotics and Automation, 15(1):76-84, 1999.
4. R. Manduchi, A. Castano, A. Talukder, and L. Matthies. Obstacle detection and terrain classification for autonomous off-road navigation. Autonomous Robots, 18(1):81-102, 2005. DOI: 10.1023/B:AURO.0000047286.62481.1d.
5. J.-Y. Bouguet. Camera Calibration Toolbox for Matlab. http://www.vision.caltech.edu/bouguetj/calib_doc/index.html. Accessed August 30, 2010.
6. J. P. Underwood, A. Hill, T. Peynot, and S. J. Scheding. Error modeling and calibration of exteroceptive sensors for accurate mapping applications. Journal of Field Robotics, 27(1):2-20, 2010.
7. F. Mirzaei and S. Roumeliotis. A Kalman filter-based algorithm for IMU-camera calibration. IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, 2007.