DLL: Direct Lidar Localization

Summary

This package presents DLL, a direct map-based localization technique using a 3D LIDAR, intended for aerial robots. DLL implements point cloud to map registration based on non-linear optimization of the distance between the points and the map, thus requiring neither features nor point correspondences. Given an initial pose, the method tracks the pose of the robot by refining the pose predicted by odometry. The method clearly outperforms Monte-Carlo localization methods and achieves precision comparable to other optimization-based approaches while running an order of magnitude faster. The method is also robust to odometry errors.

DLL is fully integrated in the Robot Operating System (ROS). Following the general localization approach of ROS, DLL uses sensor data to compute the transform that best fits the robot odometry TF into the map. Although an odometry system is recommended for fast and accurate localization, DLL also performs well without odometry information if the robot moves smoothly.

[Video: DLL experimental results in different setups]

Software dependencies

There are no hard dependencies except for Google Ceres Solver and ROS:
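
On Ubuntu, both can typically be installed from system packages. The following is a minimal sketch, assuming an Ubuntu setup with rosdep available; package names may differ across distributions:

$ sudo apt install libceres-dev                         # Google Ceres Solver
$ cd catkin_ws
$ rosdep install --from-paths src --ignore-src -r -y    # remaining ROS dependencies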

Hardware requirements

DLL has been tested on a 10th-generation Intel i7 processor with 16 GB of RAM. No graphics card is needed. The optimization is currently configured to be single threaded. You can reduce the processing time by roughly 33% simply by increasing the number of threads used by Ceres Solver.

Compilation

Download this source code into the src folder of your catkin workspace:

$ cd catkin_ws/src
$ git clone https://github.com/robotics-upo/dll

Compile the project:

$ cd catkin_ws
$ catkin_make
$ source devel/setup.bash

How to use DLL

You can find several examples in the launch directory. The module needs the following input information:

  • A map of the environment, provided as a .bt file.
  • An initial position of the robot within the map.
  • The base_link to odom TF. If the sensor is not in the base_link frame, the corresponding TF from the sensor to base_link must also be provided.
  • A 3D point cloud from the sensor. This information can be provided by a 3D LIDAR or a 3D camera.
  • IMU information, used to get the roll and pitch angles. If you don't have an IMU, DLL will take the roll and pitch estimates from odometry as ground truth.

Once launched, DLL will publish a TF between map and odom that aligns the sensor point cloud to the map.
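
For instance, you can start one of the shipped examples and inspect the published transform. A minimal sketch, assuming the airsim1.launch example from the launch directory; tf_echo is a standard ROS tool:

$ roslaunch dll airsim1.launch
$ rosrun tf tf_echo map odom    # prints the map -> odom transform published by DLL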

When a new map is provided, DLL will compute the Distance Field grid. This file is automatically generated on startup if it does not exist. Once generated, it is stored in the same path as the .bt map, so it does not need to be recomputed in future executions.
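
For example, for a map named airsim.bt the companion file is airsim.grid, created next to it. Deleting the .grid file forces DLL to rebuild it on the next startup. A sketch, assuming the default maps directory of the package:

$ ls catkin_ws/src/dll/maps
airsim.bt  airsim.grid
$ rm catkin_ws/src/dll/maps/airsim.grid   # will be regenerated on next launch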

As an example, you can download five datasets from the Service Robotics Laboratory repository (https://robotics.upo.es/datasets/dll/). The example launch files are prepared and configured to work with these bags, and you can see the different parameters of the method in them. Notice that, except for mbzirc.bag, these bags do not include an odometry estimation. For this reason, as an easy workaround, the launch files publish a fake odometry that is the identity matrix. DLL is faster and more accurate when a good odometry is available.
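
A typical session with one of these datasets might look like this. A sketch only: the exact bag URL and the matching launch file name are assumptions, so check the datasets page and the launch directory:

$ cd catkin_ws/src/dll
$ wget https://robotics.upo.es/datasets/dll/mbzirc.bag   # bag name taken from the text above
$ roslaunch dll mbzirc.launch                            # hypothetical launch file name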

Cite

DLL has been accepted for publication in IROS 2021.

F. Caballero and L. Merino. "DLL: Direct LIDAR Localization. A map-based localization approach for aerial robots". IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021.

You can download a preliminary version of the paper from arXiv.

Comments
  • Using Livox Mid-70 gets bad results

    Hi, I use a Livox Mid-70 with wheel odometry and an IMU, but the localization result is not good; the robot pose always "jumps" when running. Any idea how to get a better result (a stable, smooth, continuous path)?

    opened by gongyue666 9
  • Run other datasets

    Hello! I saved a .ot file in dll/maps and set <arg name="map" default="myown.ot" />. But when I run the program, it shows "NULL octomap". Why? Where else do I need to set the path?

    opened by MIke-1118 6
  • Tested the given bag, failed

    Hi, thanks for your great work! I have downloaded the given bag to test DLL, but when I launch the launch file it always shows this error:

    Octomap loaded
    Map size:
      x: 37.2 to 92.75
      y: 41.95 to 95.65
      z: -10.4 to 0.15
    Res: 0.05
    Error opening file /home/whx/study/dll_ws/src/dll/maps/airsim.grid for reading
    Computing 3D occupancy grid. This will take some time...
    [ INFO] [1640669470.668451692, 1614448809.604375476]: Progress: 0.000000 %
    ...
    [bag_player-2] process has finished cleanly
    log file: /home/whx/.ros/log/5879e12a-679f-11ec-9f57-c0e43482dfff/bag_player-2*.log

    I noticed there is a closed issue that discusses this, so I repeated the same test many times, but it didn't work. I hope someone can help me solve the problem.

    Best wishes

    opened by numb0824 2
  • Open map file failed

    Thanks for your great work! I wanted to run your code with just roslaunch dll airsim1.launch, after changing the path to point at the .bag, but I get the error shown in the attached screenshot. Could you help me solve the problem? Thanks.

    opened by huangsiyuan0717 2
  • Transform of input map

    Hello!

    I'd first like to thank you for this work, it's very interesting!

    I have a question regarding the internal representation of the map: looking through the code, I noticed that you subtract the minimum values from each axis of the points. I suppose this is relevant for the method? I got some (obviously) poor results when I assumed the input map and the internal representation were the same.

    I think it would be nice to make this clearer in the readme, or potentially to add a transform between the original map and the internal representation, so that the initial position set in the launch file could be relative to the original map.

    opened by MartinEekGerhardsen 3
Releases (v1.1)
  • v1.1 (Mar 22, 2022)

    Improved memory allocation and solver parameterization

    • Added a use_yaw_increments parameter that uses yaw increments from the IMU since the last LIDAR update as the initial guess for the optimizer. This is a good choice when the robot performs very fast yaw rotations.
    • Added online computation of the grid trilinear interpolation. This reduces the DLL memory requirements by a factor of approximately 7.
    • Added parameters to set the solver's maximum number of iterations and threads.
    • Added a comprehensive message when the .grid file is not found.
  • v1.0 (Mar 22, 2022)

    Initial Commit

    • This version contains the source code related to the IROS paper detailed in the README.
    • Some cleaning has been done to make it simpler to understand.
Owner
Service Robotics Lab
Service Robotics, Autonomous Robot Navigation, Machine Learning, Social Robotics