Particle filter meets hybrid octrees: an octree-based ground vehicle localization approach without learning

April 2023
Engineering and Numerical Tools
Articles in international or national peer-reviewed journals
Authors: Vincent VAUCHEY (LINEACT), Yohan DUPUIS (LINEACT), Pierre MERRIAUX (no affiliation), Xavier SAVATIER (no affiliation)
Journal: Applied Intelligence, 7 April 2023

This paper proposes an accurate LiDAR-based outdoor localization method that requires few computational resources, is robust in challenging environments (urban, off-road, seasonal variations), and whose performance is equivalent across two different sensor technologies: scanning LiDAR and flash LiDAR. The method is based on matching LiDAR measurements against a pre-built 3D map. Our contribution lies in the combined use of a particle filter with a hybrid octree, which reduces the memory footprint of the map and significantly decreases the computational load of online localization. The design of the algorithm allows it to run on both CPU and GPU with equivalent performance. We have evaluated our approach on the KITTI dataset and obtained good results compared to the state of the art. This paper also introduces baseline performance on a multi-seasonal dataset we are publicly releasing to the community. We have shown that the same localization algorithm and parameters perform well in urban environments and can be extended to off-road environments. We have also evaluated the robustness of our method when masking angular sectors of the LiDAR field of view, reproducing edge-case scenarios in urban environments where the LiDAR field of view is partially occluded by another vehicle (bus, truck). Finally, experiments have been carried out with two distinct LiDAR technologies: scanning and flash. The performance achieved with the flash LiDAR is close to that of the scanning LiDAR despite different resolutions and sensing modalities. The positioning performance is strong, with 10 cm translational and 0.12° angular RMSE for both technologies. We validated our approach in an off-road environment using a front-facing field of view with only 768 LiDAR points.
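The core idea described above (weighting particle poses by how well the LiDAR scan matches a spatially indexed map) can be illustrated with a minimal sketch. This is not the paper's implementation: the hybrid octree is stood in for by a hash set of occupied voxels (which offers the same O(1) occupancy lookup the map structure is used for here), the state is simplified to a planar pose, and all names and parameter values are hypothetical.

```python
import math
import random

VOXEL = 0.5  # illustrative voxel edge length in meters (assumption, not from the paper)

def voxel_key(p):
    # Quantize a 3D point to its voxel index.
    return tuple(int(math.floor(c / VOXEL)) for c in p)

def build_map(points):
    # Stand-in for the pre-built hybrid octree: a set of occupied voxel keys.
    return {voxel_key(p) for p in points}

def transform(pose, point):
    # pose = (x, y, theta): apply a planar rigid transform to a 3D point.
    x, y, th = pose
    px, py, pz = point
    c, s = math.cos(th), math.sin(th)
    return (x + c * px - s * py, y + s * px + c * py, pz)

def weight(occupied, pose, scan):
    # Particle likelihood: fraction of scan points that land in occupied voxels
    # once the scan is placed at the particle's hypothesized pose.
    hits = sum(1 for p in scan if voxel_key(transform(pose, p)) in occupied)
    return hits / len(scan)

def update(particles, occupied, scan):
    # One measurement update of the particle filter: weight, normalize, resample.
    w = [weight(occupied, pose, scan) for pose in particles]
    total = sum(w) or 1.0
    w = [wi / total for wi in w]
    return random.choices(particles, weights=w, k=len(particles))
```

For example, with a map built from points along a wall and a scan of those same points, a particle at the true pose scores a weight of 1.0 while a particle far from the map scores 0.0, so resampling concentrates particles near the correct pose. Because each particle's weight is an independent sum over scan points, the same loop maps naturally onto either CPU threads or GPU kernels, which is in the spirit of the paper's CPU/GPU-equivalent design.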