Road-objects tracking for autonomous driving using lidar and radar fusion
Wael Farag
In this paper, a real-time road-Object Detection and Tracking (LR_ODT) method for autonomous driving based on the fusion of lidar and radar measurement data is proposed. The lidar and radar devices are installed on the ego car, and a customized Unscented Kalman Filter (UKF) is used to fuse their data. Lidars determine objects' positions accurately but are significantly less accurate at measuring their velocities; radars, in contrast, measure objects' velocities accurately but localize them less precisely because of their lower spatial resolution. The proposed fusion approach therefore combines the merits of both sensors to provide precise pose and velocity data for objects moving on the road. The Grid-Based Density-Based Spatial Clustering of Applications with Noise (GB-DBSCAN) algorithm is used to detect objects and estimate their centroids from the raw lidar and radar data. The RANdom SAmple Consensus (RANSAC) algorithm is then used to estimate each object's velocity and to determine its geometric shape. The proposed technique is implemented in C++ and uses highly optimized math and optimization libraries for the best real-time performance. The performance of the UKF fusion is compared to that of Extended Kalman Filter (EKF) fusion, showing the superiority of the former. Simulation studies have been carried out to evaluate the performance of the LR_ODT in tracking bicycles, cars, and pedestrians.
Keywords: sensor fusion, Kalman filter, object detection, object tracking, ADAS, autonomous driving