LeGO-LOAM
A lightweight and ground-optimized lidar odometry and mapping (LeGO-LOAM) system for ROS-compatible UGVs. The system takes in a point cloud from a Velodyne VLP-16 lidar (mounted horizontally) and optional IMU data as inputs. It outputs a 6D pose estimate in real time.

Here is a brief introduction to LeGO-LOAM (laser SLAM, IMU + LiDAR). Built on LOAM, it achieves comparable accuracy while greatly reducing the amount of computation. It uses point cloud segmentation to filter out noise, extracts planar and edge features, solves for the pose with a two-step Levenberg-Marquardt optimization, and adds loop closure detection to reduce drift.
Next, go straight to the code section:
github: https://github.com/RobustFieldAutonomyLab/LeGO-LOAM

First, let's take a look at the code structure. It is very clear: cloud_msgs, launch, include, and src (the core programs). Let's go through them:
-
CMakeLists.txt
...
find_package(GTSAM REQUIRED QUIET)
find_package(PCL REQUIRED QUIET)
find_package(OpenCV REQUIRED QUIET)
...
Dependent libraries: GTSAM, PCL, OpenCV
Here is a brief introduction to GTSAM. GTSAM is a C++ library for smoothing and mapping in robotics and computer vision. It uses factor graphs and Bayesian networks to maximize the posterior probability. If you want to learn more, see reference [2].
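To make the factor-graph idea concrete, here is a tiny pose-graph example written with GTSAM. This is not code from LeGO-LOAM; the keys, poses, and noise values are invented purely for illustration.

// Minimal GTSAM sketch: a 2D pose graph with one prior and two odometry
// constraints, optimized with Levenberg-Marquardt.
#include <gtsam/geometry/Pose2.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>

int main() {
    gtsam::NonlinearFactorGraph graph;

    // Anchor the first pose at the origin.
    auto priorNoise = gtsam::noiseModel::Diagonal::Sigmas(gtsam::Vector3(0.1, 0.1, 0.05));
    graph.add(gtsam::PriorFactor<gtsam::Pose2>(1, gtsam::Pose2(0, 0, 0), priorNoise));

    // Odometry factors between consecutive poses (e.g. from lidar odometry).
    auto odomNoise = gtsam::noiseModel::Diagonal::Sigmas(gtsam::Vector3(0.2, 0.2, 0.1));
    graph.add(gtsam::BetweenFactor<gtsam::Pose2>(1, 2, gtsam::Pose2(2, 0, 0), odomNoise));
    graph.add(gtsam::BetweenFactor<gtsam::Pose2>(2, 3, gtsam::Pose2(2, 0, 0), odomNoise));

    // Rough initial guesses; the optimizer refines them.
    gtsam::Values initial;
    initial.insert(1, gtsam::Pose2(0.1, 0.1, 0.0));
    initial.insert(2, gtsam::Pose2(2.2, -0.1, 0.0));
    initial.insert(3, gtsam::Pose2(4.1, 0.1, 0.0));

    gtsam::Values result = gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize();
    result.print("optimized poses:\n");
    return 0;
}

In LeGO-LOAM this machinery is used in mapOptmization, where odometry and loop closure constraints are added to the graph in a similar way.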
-
launch/run.launch
Nodes: rviz, camera_init_to_map, base_link_to_camera, imageProjection, featureAssociation, mapOptmization, transformFusion
The main nodes are the last four, which correspond to the core programs under the src directory.
-
include/utility.h
Define data structure: point cloud, topic, lidar parameters, image segmentation parameters, feature extraction parameters, optimization parameters, mapping parameters
...
/*
 * A point cloud type that has "ring" channel
 */
struct PointXYZIR
{
    PCL_ADD_POINT4D;
    PCL_ADD_INTENSITY;
    uint16_t ring;
    EIGEN_MAKE_ALIGNED_OPERATOR_NEW
} EIGEN_ALIGN16;

POINT_CLOUD_REGISTER_POINT_STRUCT (PointXYZIR,
    (float, x, x) (float, y, y)
    (float, z, z) (float, intensity, intensity)
    (uint16_t, ring, ring)
)
...
Here, a point cloud type that carries a ring ID is defined. Velodyne lidars provide this channel, but lidars from other manufacturers may not. Unlike laser_id, the ring ID is the line number ordered by elevation angle, counted up from the lowest beam (0-15 for the VLP-16).
...
// Using velodyne cloud "ring" channel for image projection
// (other lidar may have different name for this channel, change "PointXYZIR" below)
extern const bool useCloudRing = false;  // if true, ang_res_y and ang_bottom are not used

// VLP-16
extern const int N_SCAN = 16;            // number of lasers
extern const int Horizon_SCAN = 1800;    // number of points from one laser in one scan
extern const float ang_res_x = 0.2;      // horizontal angular resolution
extern const float ang_res_y = 2.0;      // vertical angular resolution
extern const float ang_bottom = 15.0+0.1;
extern const int groundScanInd = 7;      // number of lowest scan rings used for ground detection
...
The lidar parameters should be adjusted to match your own sensor. If you use data from a dataset such as KITTI, you need to adjust them for the HDL-64E; moreover, because that radar's vertical angular resolution is not uniform, you also need to write your own projection code for the row index (see the sketch below).
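One simple way to handle a non-uniform vertical spacing is to replace the fixed ang_res_y division with a lookup over the per-beam elevation angles. The sketch below is my own illustration, not LeGO-LOAM code, and the beam angles in the usage comment are placeholders rather than real HDL-64E specifications.

// Sketch only: compute rowIdn from a per-beam elevation-angle table instead of
// a fixed vertical resolution.
#include <cmath>
#include <vector>

int rowIndexFromVerticalAngle(float verticalAngleDeg,
                              const std::vector<float>& beamAnglesDeg) {
    // Pick the beam whose nominal elevation angle is closest to the measured one.
    int bestIdx = 0;
    float bestDiff = std::fabs(verticalAngleDeg - beamAnglesDeg[0]);
    for (std::size_t i = 1; i < beamAnglesDeg.size(); ++i) {
        float diff = std::fabs(verticalAngleDeg - beamAnglesDeg[i]);
        if (diff < bestDiff) { bestDiff = diff; bestIdx = static_cast<int>(i); }
    }
    return bestIdx;
}

// Usage (placeholder angle table, ordered from the lowest beam upward):
// std::vector<float> beamAnglesDeg = {-24.9f, -23.6f, /* ... */ 2.0f};
// int rowIdn = rowIndexFromVerticalAngle(verticalAngle, beamAnglesDeg);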
Next, let's dig into the core programs. There are four .cpp files; to keep this from getting too long, we will analyze imageProjection.cpp here.
-
src/imageProjection.cpp
Overall flow: subscribe to the point cloud topic and handle it in a callback -> convert the ROS message to a PCL cloud (preprocessing) -> take one frame of laser data -> project it onto a range image -> remove ground points -> segment the point cloud -> publish the point cloud data -> reset parameters.
Callback processing function subscribed to lidar data:
void cloudHandler(const sensor_msgs::PointCloud2ConstPtr& laserCloudMsg){
    // 1. Convert ros message to pcl point cloud
    copyPointCloud(laserCloudMsg);   // uses pcl library functions
    // 2. Start and end angle of a scan
    findStartEndAngle();             // the angle span of one sweep is computed with atan2;
                                     // pay attention to keeping the result in a reasonable range
    // 3. Range image projection
    projectPointCloud();             // project the lidar data into a 16 x 1800 array
                                     // (size depends on the radar's angular resolution)
    // 4. Mark ground points
    groundRemoval();                 // compute the pitch angle between points on adjacent rings;
                                     // if it is less than 10 degrees, mark them as ground
    // 5. Point cloud segmentation
    cloudSegmentation();             // label the cloud by clustering, then store the segments by label
    // 6. Publish all clouds
    publishCloud();                  // publish the various point cloud outputs
    // 7. Reset parameters for next iteration
    resetParameters();
}
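For context, cloudHandler is the callback registered for the incoming point cloud topic. Below is a minimal self-contained sketch of the same subscribe-and-process pattern; it is not the original ImageProjection class, and the topic name "/velodyne_points" is an assumption (LeGO-LOAM reads it from pointCloudTopic in utility.h).

// Minimal sketch: subscribe to a PointCloud2 topic, convert to PCL, and hand
// the cloud to a processing function.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

void cloudHandler(const sensor_msgs::PointCloud2ConstPtr& msg) {
    pcl::PointCloud<pcl::PointXYZI> cloud;
    pcl::fromROSMsg(*msg, cloud);                 // ROS message -> PCL cloud
    ROS_INFO("received %zu points", cloud.size());
    // ... projection, ground removal, and segmentation would follow here ...
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "cloud_handler_sketch");
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("/velodyne_points", 1, cloudHandler);
    ros::spin();
    return 0;
}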
projectPointCloud(): project the 3D point cloud onto a 2D range image
// find the row and column index in the image for this point
if (useCloudRing == true){
    rowIdn = laserCloudInRing->points[i].ring;
}else{
    verticalAngle = atan2(thisPoint.z, sqrt(thisPoint.x * thisPoint.x + thisPoint.y * thisPoint.y)) * 180 / M_PI;
    rowIdn = (verticalAngle + ang_bottom) / ang_res_y;               // determine the row index
}

horizonAngle = atan2(thisPoint.x, thisPoint.y) * 180 / M_PI;

columnIdn = -round((horizonAngle-90.0)/ang_res_x) + Horizon_SCAN/2;  // determine the column index

range = sqrt(thisPoint.x * thisPoint.x + thisPoint.y * thisPoint.y + thisPoint.z * thisPoint.z);  // determine the depth
(Figure: lidar coordinate system)

atan2(y, x): the returned angle is the angle between the positive x axis and the ray from the origin through the point (x, y). A positive result means a counterclockwise rotation from the x axis; a negative result means a clockwise rotation from the x axis.

(Figure: the atan2 function)
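As a quick sanity check of atan2's quadrant behaviour (plain C++, not LeGO-LOAM code):

#include <cmath>
#include <cstdio>

int main() {
    std::printf("%.1f\n", std::atan2( 1.0,  1.0) * 180.0 / M_PI);  //  45.0 (first quadrant)
    std::printf("%.1f\n", std::atan2( 1.0, -1.0) * 180.0 / M_PI);  // 135.0 (counterclockwise from +x)
    std::printf("%.1f\n", std::atan2(-1.0,  1.0) * 180.0 / M_PI);  // -45.0 (clockwise from +x)
    return 0;
}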
1. Determine the row index, calculated either from the ring id or from the coordinates:

$$\theta_v = \operatorname{atan2}\!\left(z,\ \sqrt{x^2+y^2}\right)$$

$$row = (\theta_v + \theta_{bottom}) / resolution_v$$

2. Determine the column index ($[-\pi, \pi]$ is mapped to $[0, 1800]$; the lidar's forward $+x$ direction ends up at the middle column):

$$\theta_h = \operatorname{atan2}(x,\ y) \cdot 180 / \pi$$

$$col = -\operatorname{round}\!\left((\theta_h - 90) / resolution_h\right) + Horizon\_SCAN / 2$$

3. Determine the depth value (pay attention to the valid range of the data):

$$range = \sqrt{x^2 + y^2 + z^2}$$
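Putting the three steps together, here is a small standalone sketch (my own rewrite, not the exact code from imageProjection.cpp) of how a single point maps onto the 16 x 1800 range image, using the VLP-16 parameters from utility.h:

// Standalone sketch of the range-image projection for one point; returns false
// if the point falls outside the 16 x 1800 image.
#include <cmath>

const int   N_SCAN       = 16;
const int   Horizon_SCAN = 1800;
const float ang_res_x    = 0.2f;
const float ang_res_y    = 2.0f;
const float ang_bottom   = 15.0f + 0.1f;

bool projectPoint(float x, float y, float z,
                  int& rowIdn, int& columnIdn, float& range) {
    // Row from the vertical angle.
    float verticalAngle = std::atan2(z, std::sqrt(x * x + y * y)) * 180.0f / M_PI;
    rowIdn = static_cast<int>((verticalAngle + ang_bottom) / ang_res_y);
    if (rowIdn < 0 || rowIdn >= N_SCAN) return false;

    // Column from the horizontal angle; the forward (+x) direction maps near the centre column.
    float horizonAngle = std::atan2(x, y) * 180.0f / M_PI;
    columnIdn = static_cast<int>(-std::round((horizonAngle - 90.0f) / ang_res_x)) + Horizon_SCAN / 2;
    if (columnIdn >= Horizon_SCAN) columnIdn -= Horizon_SCAN;
    if (columnIdn < 0 || columnIdn >= Horizon_SCAN) return false;

    // Depth value.
    range = std::sqrt(x * x + y * y + z * z);
    return true;
}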
groundRemoval(): ground removal
// groundMat
// -1, no valid info to check if ground or not
//  0, initial value, after validation, means not ground
//  1, ground
for (size_t j = 0; j < Horizon_SCAN; ++j){
    for (size_t i = 0; i < groundScanInd; ++i){   // groundScanInd: number of scan rings expected to hit the ground

        lowerInd = j + ( i )*Horizon_SCAN;
        upperInd = j + (i+1)*Horizon_SCAN;

        if (fullCloud->points[lowerInd].intensity == -1 ||
            fullCloud->points[upperInd].intensity == -1){
            // no info to check, invalid points
            groundMat.at<int8_t>(i,j) = -1;
            continue;
        }

        diffX = fullCloud->points[upperInd].x - fullCloud->points[lowerInd].x;
        diffY = fullCloud->points[upperInd].y - fullCloud->points[lowerInd].y;
        diffZ = fullCloud->points[upperInd].z - fullCloud->points[lowerInd].z;

        angle = atan2(diffZ, sqrt(diffX*diffX + diffY*diffY)) * 180 / M_PI;

        // If the pitch angle between the two points on adjacent scan rings
        // is less than 10 degrees, both are marked as ground points
        if (abs(angle - sensorMountAngle) <= 10){
            groundMat.at<int8_t>(i,j) = 1;
            groundMat.at<int8_t>(i+1,j) = 1;
        }
    }
}
(Figure: ground judgment principle)
If the pitch angle between two vertically adjacent points is less than 10 degrees, they are judged to be ground points.
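A quick numeric check of the criterion (assuming sensorMountAngle = 0): suppose the point on ring i+1 is 0.5 m farther away in the horizontal plane and 0.05 m higher than the point on ring i. Then

$$angle = \operatorname{atan2}(0.05,\ 0.5)\cdot 180/\pi \approx 5.7^\circ \le 10^\circ,$$

so both points are marked as ground.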
cloudSegmentation():
void cloudSegmentation(){
    // segmentation process
    for (size_t i = 0; i < N_SCAN; ++i)
        for (size_t j = 0; j < Horizon_SCAN; ++j)
            if (labelMat.at<int>(i,j) == 0)
                labelComponents(i, j);   // -> labelComponents() clustering operation
    ...
}

// Local feature detection by clustering: plane points and edge points
void labelComponents(int row, int col){
    ...
    while(queueSize > 0){
        // Pop point
        fromIndX = queueIndX[queueStartInd];
        fromIndY = queueIndY[queueStartInd];
        --queueSize;
        ++queueStartInd;
        // Mark popped point
        labelMat.at<int>(fromIndX, fromIndY) = labelCount;
        // Loop through all the neighboring grids of popped grid
        // (the four neighbors: up, down, left, right)
        for (auto iter = neighborIterator.begin(); iter != neighborIterator.end(); ++iter){
            // new index
            thisIndX = fromIndX + (*iter).first;
            thisIndY = fromIndY + (*iter).second;
            // index should be within the boundary
            if (thisIndX < 0 || thisIndX >= N_SCAN)
                continue;
            // at range image margin (left or right side)
            if (thisIndY < 0)
                thisIndY = Horizon_SCAN - 1;
            if (thisIndY >= Horizon_SCAN)
                thisIndY = 0;
            // prevent infinite loop (caused by putting an already examined point back)
            if (labelMat.at<int>(thisIndX, thisIndY) != 0)
                continue;

            // compare the larger depth with the smaller depth
            d1 = std::max(rangeMat.at<float>(fromIndX, fromIndY),
                          rangeMat.at<float>(thisIndX, thisIndY));
            d2 = std::min(rangeMat.at<float>(fromIndX, fromIndY),
                          rangeMat.at<float>(thisIndX, thisIndY));

            if ((*iter).first == 0)
                alpha = segmentAlphaX;
            else
                alpha = segmentAlphaY;

            // plane clustering criterion: the larger the angle, the more the two points lie on a plane
            angle = atan2(d2*sin(alpha), (d1 - d2*cos(alpha)));

            // segmentTheta = 60 degrees
            if (angle > segmentTheta){
                queueIndX[queueEndInd] = thisIndX;
                queueIndY[queueEndInd] = thisIndY;
                ++queueSize;
                ++queueEndInd;

                labelMat.at<int>(thisIndX, thisIndY) = labelCount;
                lineCountFlag[thisIndX] = true;

                allPushedIndX[allPushedIndSize] = thisIndX;
                allPushedIndY[allPushedIndSize] = thisIndY;
                ++allPushedIndSize;
            }
        }
    }

    // check if this segment is valid
    bool feasibleSegment = false;
    if (allPushedIndSize >= 30)
        feasibleSegment = true;         // large cluster (plane-like segment)
    else if (allPushedIndSize >= segmentValidPointNum){
        int lineCount = 0;
        for (size_t i = 0; i < N_SCAN; ++i)
            if (lineCountFlag[i] == true)
                ++lineCount;
        if (lineCount >= segmentValidLineNum)
            feasibleSegment = true;     // smaller cluster spanning enough scan lines (edge-like segment)
    }
    // segment is valid, mark these points
    if (feasibleSegment == true){
        ++labelCount;
    }else{
        // segment is invalid, mark these points
        for (size_t i = 0; i < allPushedIndSize; ++i){
            labelMat.at<int>(allPushedIndX[i], allPushedIndY[i]) = 999999;
        }
    }
}
Starting from the labelMat with ground points removed, the BFS examines, point by point, the 4-neighborhood of each popped point (taking care with the valid range of the neighbor indices). For each neighbor, the larger and smaller of the two range values are compared through the formula below against the threshold $\pi/3$ (60 degrees): if the angle is greater than the threshold, the neighbor is treated as lying on the same plane-like surface and is added to the segment. The arrays store the indices, labelMat stores the label value, and lineCountFlag is set to true for the corresponding scan line; the search continues until no more neighbors satisfy the threshold, then moves on to the segment validity check.
$$\theta_x = \operatorname{atan2}\!\big(d_{min}\sin(\alpha_x),\ d_{max} - d_{min}\cos(\alpha_x)\big)$$

$$\theta_y = \operatorname{atan2}\!\big(d_{min}\sin(\alpha_y),\ d_{max} - d_{min}\cos(\alpha_y)\big)$$

where $\alpha_x$ and $\alpha_y$ are the horizontal and vertical angular resolutions in radians (segmentAlphaX, segmentAlphaY).
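For intuition, plug in some numbers (these values are illustrative, not from the code). With the horizontal resolution $\alpha_x = 0.2^\circ \approx 0.0035\,\mathrm{rad}$ and two horizontally adjacent points with ranges $d_{max} = 10\,\mathrm{m}$ and $d_{min} = 9.99\,\mathrm{m}$,

$$\theta_x = \operatorname{atan2}\big(9.99\sin 0.0035,\ 10 - 9.99\cos 0.0035\big) \approx 74^\circ > 60^\circ,$$

so the two points are grouped into the same segment. If instead $d_{min} = 9.8\,\mathrm{m}$, the angle drops to roughly $10^\circ$ and the points are split into different segments.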
(Figure: plane judgment principle)
Clustering: a segment with more than 30 points is accepted as a cluster (labelCount++). If it has fewer than 30 but at least 5 points, the number of distinct scan lines it spans is counted, and if that is at least 3 it is also kept as a valid cluster. If neither condition holds, the points are labeled 999999 and discarded.
publishCloud(): point cloud publishing
resetParameters(): parameter reset
References:
[1]. Shan T, Englot B. Lego-loam: Lightweight and ground-optimized lidar odometry and mapping on variable terrain[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018: 4758-4765.
[2]. gtsam_tutorial video
[3]. pcl::fromROSMsg & pcl::toROSMsg
[4]. pcl::removeNaNFromPointCloud
[5]. velodyne_pointcloud::PointXYZIR Struct Reference
[6]. atan2