
Terrain-Dependent Driving Control for Medical Robots and Mobility Assist Devices

Description:

OBJECTIVE: Develop autonomous terrain classification and driving control systems that enable medical robots and mobility assist devices to safely negotiate various types of terrain. Applications would include casualty assessment/extraction robots, chem/bio-hazard detection robots, and electric-powered wheelchairs.

DESCRIPTION: The military is currently developing several robotic platforms for casualty triage/assessment, casualty extraction, medical resupply, and chemical/biological hazard detection that must navigate hazardous terrain such as steep inclines, ice, mud, and loose gravel. These conditions can cause wheel/track slippage, sinkage, or, in the worst case, overturn, resulting in the end of the mission. Hazardous terrain may also threaten electric-powered wheelchairs (EPW) and other mobility aids for wounded warriors and disabled veterans. A driving control system that can reliably detect hazardous terrain and then implement safe driving control strategies would therefore be of great benefit to the military.

To accomplish these objectives, two systems will need to be developed: (a) a terrain sensing and classification system and (b) a terrain-dependent driving control system. The terrain classifier would identify the type of terrain (sand, grass, gravel, mud) based on visual and/or tactile sensory input. The robot control system would then make appropriate adjustments to driving parameters such as speed, acceleration, turning radius, and braking to avoid loss of control. Additional sensors may need to be retrofitted to the robot's drive system to enable closed-loop control.

Terrain classification falls into two basic approaches: visual and tactile. Many autonomous ground vehicles developed for the military use vision-based approaches employing cameras and LIDAR [1]. While visual methods have been effectively showcased in events such as the DARPA Grand Challenge [2] and even on the surface of Mars [3], their performance can be greatly diminished by wet or snowy conditions, superficial ground coverings, and shadows. Vision approaches may also have difficulty distinguishing between surfaces that look similar but have different texture, such as dirt versus mud. Alternatively, tactile approaches employ vibration sensors and measurements for slip estimation. Vibration methods are robust with respect to climatic conditions and ground coverings and naturally sense the terrain roughness that directly affects driving control [4]. However, they cannot "look" ahead like cameras to anticipate dramatic changes in conditions, such as an icy patch on the road. Slip estimation is often used to distinguish between wet and dry conditions, but implementation is challenging [5,6]. The most effective terrain classification system might therefore combine both visual and tactile approaches.
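As a rough illustration of the tactile approach, the sketch below (Python; the sampling rate, frequency bands, and terrain centroids are hypothetical placeholders, not values from the cited work) extracts band-energy features from a window of vertical-acceleration samples and assigns a terrain label with a simple nearest-centroid rule. In practice the centroids would be learned from labeled vibration logs collected on each surface, and a fielded system would fuse these tactile features with visual cues.

import numpy as np

# Hypothetical terrain centroids in feature space (log band energies),
# assumed to be learned offline from labeled vibration logs.
TERRAIN_CENTROIDS = {
    "pavement": np.array([-2.0, -3.5, -4.0]),
    "gravel":   np.array([-0.5, -1.0, -2.5]),
    "grass":    np.array([-1.5, -2.0, -3.0]),
    "mud":      np.array([-1.0, -2.8, -3.8]),
}

def band_energies(accel_z, fs=500.0, bands=((0, 10), (10, 50), (50, 150))):
    """Log power in a few frequency bands of a vertical-acceleration window."""
    accel_z = np.asarray(accel_z, dtype=float) - np.mean(accel_z)  # remove gravity/DC offset
    spectrum = np.abs(np.fft.rfft(accel_z)) ** 2
    freqs = np.fft.rfftfreq(len(accel_z), d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(spectrum[mask].sum() + 1e-9))
    return np.array(feats)

def classify_terrain(accel_window, fs=500.0):
    """Nearest-centroid terrain label for one vibration window."""
    feats = band_energies(accel_window, fs)
    return min(TERRAIN_CENTROIDS,
               key=lambda k: np.linalg.norm(feats - TERRAIN_CENTROIDS[k]))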
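Slip estimation, in its simplest longitudinal form, compares the speed implied by the wheel encoders with an independent ground-speed estimate (for example, from visual odometry). The short sketch below uses the standard slip-ratio definition; the wheel radius, measurements, and threshold are placeholder values for illustration only.

def slip_ratio(wheel_omega, wheel_radius, ground_speed):
    """Longitudinal slip ratio in [0, 1]: 0 = pure rolling, 1 = spinning in place."""
    wheel_speed = wheel_omega * wheel_radius             # speed implied by the encoder
    denom = max(abs(wheel_speed), abs(ground_speed), 1e-6)
    return abs(wheel_speed - ground_speed) / denom

# Example: the encoder reports 2.0 rad/s on a 0.3 m wheel, but visual odometry
# measures only 0.35 m/s over the ground -> slip ratio of about 0.42, suggesting
# a low-traction surface such as mud or ice (the 0.3 threshold is a placeholder).
if slip_ratio(2.0, 0.3, 0.35) > 0.3:
    print("High slip detected: switch to low-traction driving rules")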
Off-road driving experts have compiled substantial lists of terrain-dependent driving rules to improve driving safety and efficiency on potentially hazardous surfaces such as snow, ice, mud, and loose gravel [7,8,9]. The main strategy behind these rules is to reduce wheel slip and thereby improve traction. The "Terrain Response" system used on the Land Rover LR3 and Freelander commercial vehicles [10] uses low-level, terrain-dependent feedback to prevent slipping via traction control and the anti-lock brake system (ABS). The operator selects one of five modes (everyday; grass/gravel/snow; mud/ruts; sand; rock crawl), which alters settings on the engine throttle, ABS, transmission, differentials, and traction/stability control systems. A substantial body of additional academic research addresses terrain-dependent traction control and ABS [11,12].

The Army has a clear, demonstrated need for terrain-dependent driving control in medical robotics and assistive device applications. Such a system would significantly reduce risk and increase the probability of mission success. Recent advances in sensor technology, particularly in lasers/LIDAR and piezoelectric chips, have made deployment of terrain-dependent driving control not only feasible but also economically viable. This project is an initial step toward that goal.
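By analogy with the mode selection described above, a terrain-dependent driving controller for a small robot or EPW could be sketched as a lookup of conservative limits per terrain class that are clamped onto the commanded motion. The limit values below are illustrative placeholders, not rules taken from the cited off-road driving sources; actual values would come from logged driving-performance data for the target platforms.

from dataclasses import dataclass

@dataclass
class DrivingLimits:
    max_speed: float     # m/s
    max_accel: float     # m/s^2
    max_yaw_rate: float  # rad/s

# Illustrative limits per terrain class (placeholder values only).
LIMITS = {
    "pavement": DrivingLimits(1.8, 0.8, 1.0),
    "grass":    DrivingLimits(1.2, 0.5, 0.7),
    "gravel":   DrivingLimits(1.0, 0.4, 0.5),
    "mud":      DrivingLimits(0.6, 0.3, 0.4),
    "ice":      DrivingLimits(0.4, 0.2, 0.3),
}

def limit_command(terrain, v_cmd, v_prev, yaw_cmd, dt):
    """Clamp a requested (speed, yaw-rate) command to the limits for the current terrain."""
    lim = LIMITS.get(terrain, LIMITS["mud"])              # unknown terrain: be conservative
    v = max(-lim.max_speed, min(lim.max_speed, v_cmd))    # clamp speed
    dv_max = lim.max_accel * dt                           # rate-limit acceleration
    v = max(v_prev - dv_max, min(v_prev + dv_max, v))
    yaw = max(-lim.max_yaw_rate, min(lim.max_yaw_rate, yaw_cmd))
    return v, yaw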
PHASE I: Determine the types of hazardous terrain that must be navigated by medical robots and mobility assist devices such as electric-powered wheelchairs. Identify terrain classification features that will allow safe passage across hazardous terrain. Design a preliminary terrain classification system and identify the sensors required. Conduct a literature survey of driving control rules for off-road vehicles and identify key driving control strategies that could be implemented in medical robotics applications. Develop a research plan for Phase II.

PHASE II: Develop, integrate, and test terrain-dependent Joint Architecture for Unmanned Systems (JAUS) driving control on both a two-wheel drive mobility assist device (e.g., an EPW) and a four-wheel drive medical robot. JAUS documentation and specifications can be found in References 13-15. Collect data to determine how these robots perform on different terrains and surfaces under different conditions, including speed, turning radius, and acceleration. Use the collected data to derive driving rules for different terrains and integrate them into the robot control system. Collect vibration and slip estimation data on a variety of terrains and surfaces at multiple speeds. Tune the terrain classifier for terrain-dependent control, and configure it for multiple speeds and loads. Develop a commercialization plan for Phase III.

PHASE III DUAL USE APPLICATIONS: Assist the Army in transitioning and implementing the terrain classification and terrain-dependent driving control systems in two commercial applications: a four-wheel drive robot and electric-powered wheelchairs. Develop and market an inexpensive commercial version of the terrain classification and driving control systems for deployment on passive wheelchairs.

REFERENCES:
1. A. Angelova, L. Matthies, D. Helmick, and P. Perona, "Fast terrain classification using variable-length representation for autonomous navigation," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 1-8, June 2007.
2. J. Markoff, "Crashes and Traffic Jams in Military Test of Robotic Vehicles," New York Times, Nov. 5, 2007, http://www.nytimes.com/2007/11/05/technology/05robot.htm?_r=1
3. I. Halatci, C. A. Brooks, and K. Iagnemma, "Terrain classification and classifier fusion for planetary exploration rovers," Proceedings of the IEEE Aerospace Conference, pp. 1-11, March 2007.
4. K. Iagnemma and S. Dubowsky, "Terrain estimation for high speed rough terrain autonomous vehicle navigation," Proceedings of the SPIE Conference on Unmanned Ground Vehicle Technology IV, Orlando, Florida, 2002.
5. A. Angelova, L. Matthies, D. Helmick, and P. Perona, "Learning and Prediction of Slip from Visual Information," Journal of Field Robotics, Vol. 24, No. 3, pp. 205-231, 2007.
6. D. Helmick, A. Angelova, and L. Matthies, "Terrain Adaptive Navigation for Planetary Rovers," Journal of Field Robotics, Vol. 26, No. 4, pp. 391-410, 2009.
7. B. DeLong, 4-Wheel Freedom: The Art of Off-Road Driving, Paladin Press, Boulder, CO, 2000.
8. J. Allen, Four-Wheeler's Bible, MotorBooks, St. Paul, MN, 2002.
9. C. Ordonez and E. G. Collins, Jr., "Rut Detection for Mobile Robots," Proceedings of the IEEE 40th Southeastern Symposium on System Theory, New Orleans, LA, pp. 334-337, March 16-18, 2008.
10. D. Vanderwerp, "What Does Terrain Response Do?", http://www.caranddriver.com/features/9026/what-does-terrain-response-do.html, 2005.
11. J. Van Der Burg and P. Blazevic, "Anti-lock braking and traction control concept for all-terrain robotic vehicles," Proceedings of the IEEE International Conference on Robotics and Automation, Albuquerque, New Mexico, 1997.
12. P. Khatun, C. M. Bingham, N. Schofield, and P. H. Mellor, "Application of fuzzy control algorithms for electric vehicle antilock braking/traction control systems," IEEE Transactions on Vehicular Technology, Vol. 52, No. 5, pp. 1356-1364, 2003.
13. Joint Architecture for Unmanned Systems (JAUS) Documentation: http://www.openjaus.com/support/jaus-documents
14. JAUS Transport Specification: http://standards.sae.org/as5669a/
15. Society of Automotive Engineers (SAE) Standard AS5669, JAUS/SDP Transport Specification: http://sae.nufu.eu/std/AS5669