Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

SAE Technical Paper 2020-01-1029, published 2020-04-14

Autonomous driving in unstructured environments is a significant challenge because information commonly used for localization, such as lane markings, is inconsistent or absent. To reduce localization uncertainty in such environments, data from LiDAR, Radar, Camera, GPS/IMU, and odometry sensors are fused. This paper presents a hybrid localization technique that combines LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU and odometry data, and object lists from Radar, LiDAR, and Camera sensors. An Extended Kalman Filter (EKF) fuses the data in two stages: in the first stage, the SLAM-based vehicle coordinates are fused with the GPS-based positioning; in the second stage, the output of the first stage is fused with the object-based localization. The approach was successfully tested on FEV's Smart Vehicle Demonstrator at FEV's headquarters, a complex test environment containing both dynamic and static objects. The test results show that multi-sensor fusion improves the vehicle's localization compared to GPS/IMU or LiDAR alone.
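To illustrate the two-stage fusion described in the abstract, the sketch below shows a position-only Kalman update applied sequentially to a SLAM pose, a GPS/IMU fix, and an object-based position estimate. It is a minimal sketch, not the paper's implementation: the constant-velocity motion model, all noise covariances, and the measurement values are illustrative assumptions, and with these linear models the EKF reduces to a standard linear Kalman filter (the paper's EKF would linearize a nonlinear vehicle model instead).

```python
# Minimal two-stage fusion sketch (assumed models and noise values,
# not taken from the paper). State: [x, y, vx, vy].
import numpy as np

class PositionEKF:
    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                      # state estimate
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)                        # constant-velocity model (assumed)
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = 0.01 * np.eye(4)                 # process noise (assumed)
        self.H = np.array([[1., 0., 0., 0.],      # measure x, y only
                           [0., 1., 0., 0.]])

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, R):
        """Fuse one position measurement z = [x, y] with covariance R."""
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P


ekf = PositionEKF()
ekf.predict()
# Stage 1: fuse the SLAM-based pose with the GPS/IMU positioning.
ekf.update(z=np.array([10.2, 5.1]), R=0.5 * np.eye(2))   # SLAM pose (example)
ekf.update(z=np.array([10.5, 5.0]), R=2.0 * np.eye(2))   # GPS/IMU fix (example)
# Stage 2: fuse the result with object-based localization derived from
# Radar/LiDAR/Camera object lists.
ekf.update(z=np.array([10.3, 5.2]), R=1.0 * np.eye(2))   # object-based estimate (example)
print(ekf.x[:2])  # fused (x, y) position
```

In this sketch the relative weighting of the three sources is controlled entirely by the assumed measurement covariances R; a less certain source (larger R) pulls the fused estimate less than a more certain one.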

We also recommend:

- Journal Article: Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations (2020-01-0093)
- Technical Paper: Benchmarking the Localization Accuracy of 2D SLAM Algorithms on Mobile Robotic Platforms (2020-01-1021)
- Technical Paper: Lidar Inertial Odometry and Mapping for Autonomous Vehicle in GPS-Denied Parking Lot (2020-01-0103)
