Collision Avoidance in Unmanned Air Vehicles using Novel Sensor Fusion (CAUSe)
Phone: (301) 294-4255
Phone: (301) 294-5221
For small UAS and UAM operations, efficient collision avoidance between aircraft is crucial. These autonomous systems must be equipped with a perception system that can estimate an obstacle's 3D position, size, and type. Such applications require combining sensors that provide 2D information (such as RGB cameras) with sensors that provide depth information (such as LIDAR). The high accuracy required from this combination of sensors leads to high data throughput, and meeting the latency requirements of autonomy demands state-of-the-art processing. Our team proposes to develop algorithms for the perception layer of the autonomy stack that can be implemented with reduced computational load. The key ideas we will bring to bear on this problem come from stochastic control, optimal estimation, and neural networks. Current approaches to perception apply a neural network to each frame, producing classification and bounding-box information. In this effort, we will develop algorithms for sensor fusion as well as for distinguishing far-off objects from noise via anomaly detection. Training and testing will be done using synthetic data in a simulated environment.
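To illustrate the kind of camera/LIDAR fusion described above, the sketch below backprojects a camera detection's pixel coordinates through a pinhole model and blends a predicted depth with a LIDAR range via a scalar Kalman update. This is a minimal illustration, not the project's algorithm; all names and parameter values (focal lengths, noise variances, measurements) are hypothetical.

```python
# Hypothetical sketch of 2D-camera + LIDAR fusion for obstacle localization.
# All numbers here are illustrative, not drawn from the CAUSe effort.

from dataclasses import dataclass

@dataclass
class PinholeCamera:
    fx: float  # focal length in pixels, x
    fy: float  # focal length in pixels, y
    cx: float  # principal point, x
    cy: float  # principal point, y

def backproject(cam: PinholeCamera, u: float, v: float, depth: float):
    """Lift a pixel (u, v) at a known depth to a 3D point in the camera frame."""
    x = (u - cam.cx) / cam.fx * depth
    y = (v - cam.cy) / cam.fy * depth
    return (x, y, depth)

def fuse_depth(prior_depth: float, prior_var: float,
               lidar_depth: float, lidar_var: float):
    """One scalar Kalman update: blend a predicted depth with a LIDAR range.

    The gain weights the LIDAR measurement by the relative confidence
    (inverse variance) of prediction versus measurement.
    """
    gain = prior_var / (prior_var + lidar_var)
    depth = prior_depth + gain * (lidar_depth - prior_depth)
    var = (1.0 - gain) * prior_var
    return depth, var

# Example: a camera detector reports a bounding-box center at (400, 250);
# the tracker predicts ~52 m of range, and LIDAR measures 50 m with much
# lower noise, so the fused depth lands near the LIDAR value.
cam = PinholeCamera(fx=800.0, fy=800.0, cx=320.0, cy=240.0)
depth, var = fuse_depth(prior_depth=52.0, prior_var=25.0,
                        lidar_depth=50.0, lidar_var=1.0)
obstacle_xyz = backproject(cam, 400.0, 250.0, depth)
```

In a full perception stack this association step would run per detection per frame, with the variance propagated forward as the prediction uncertainty for the next update.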
* Information listed above is at the time of submission. *