Topic: notes on the pitfalls of jointly calibrating LiDAR-camera extrinsics.
Date: 2023/03/19
Hardware:
- Host: Jetson Orin (Ubuntu 20.04, ROS Noetic)
- LiDAR: Ouster, 32 beams
- Camera: Intel RealSense D435
1. Calibration approach
Popular LiDAR + camera calibration toolkits include Autoware, Apollo, lidar_camera_calibration, and but_velodyne.
Installing Autoware on Ubuntu 20.04 looked rather buggy, so I followed https://github.com/ankitdhall/lidar_camera_calibration instead.
1.1 OpenCV
This step cost me a whole day and has quite a few pitfalls: opencv_contrib must be installed before anything further will work; installing plain OpenCV alone is not enough!
I installed version 3.4.10.
References:
- https://blog.csdn.net/m0_73694897/article/details/129175515?spm=1001.2014.3001.5506
- https://blog.csdn.net/weixin_34910922/article/details/125248118?spm=1001.2014.3001.5506
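A quick way to check whether the contrib modules really made it into your build is sketched below (this assumes you also built the Python bindings, which the C++ calibration package itself does not need):

import cv2
print(cv2.__version__)        # expected: 3.4.10
print(hasattr(cv2, "aruco"))  # True only when the opencv_contrib modules are present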
2. Forcing time synchronization
Running the calibration live, or simply playing the bag back directly, gives this symptom: only the mono8 window appears and no other window pops up!
- Only Mono8 window visible · Issue #94 · ankitdhall/lidar_camera_calibration
- https://github.com/ankitdhall/lidar_camera_calibration/issues/65
- https://github.com/ankitdhall/lidar_camera_calibration/issues/53
因?yàn)檐囀庆o止的,如果時(shí)間戳不完全一致,標(biāo)定的程序是無(wú)法彈出點(diǎn)云窗口的,這個(gè)bug困擾了我很久,在issues中也沒找到也沒有人能給出解決方案,于是我把時(shí)間戳都強(qiáng)行設(shè)置到1970年,解決。
#!/usr/bin/env python3
# Rewrite the bag so that the camera messages share the LiDAR's timestamps;
# otherwise the calibration tool never opens the point-cloud window.
import rosbag

bag_name = 'subset_2023-03-19-15-52-03.bag'  # bag to be modified
out_bag_name = 'change.bag'                  # modified bag
dst_dir = '/aigo/'                           # working directory

with rosbag.Bag(dst_dir + out_bag_name, 'w') as outbag:
    stamp = None
    # topic: the published topic; msg: that topic's message at this point in
    # time; t: the time the message was recorded (not the header stamp).
    # read_messages() could also be restricted to specific topics.
    for topic, msg, t in rosbag.Bag(dst_dir + bag_name).read_messages():
        if topic == '/ouster/points':
            # Remember the LiDAR header stamp and write the cloud at that time.
            stamp = msg.header.stamp
            outbag.write(topic, msg, stamp)
        # read_messages() yields messages in time order, so the most recent
        # LiDAR stamp can simply be reused for the camera topics.
        elif topic == '/camera/color/image_raw' and stamp is not None:
            outbag.write(topic, msg, stamp)
        elif topic == '/camera/depth/image_rect_raw' and stamp is not None:
            outbag.write(topic, msg, stamp)

print("finished")
- Before (original bag timestamps):
- After (rewritten timestamps):
3. Calibration
My LiDAR has 32 beams, so I modified /home/nvidia/wuke/a2b_ws/src/lidar_camera_calibration/include/lidar_camera_calibration/PreprocessUtils.h:
// line 200
// std::vector <std::vector<myPointXYZRID *>> rings(16);
std::vector <std::vector<myPointXYZRID *>> rings(32);
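If you are unsure how many beams the driver actually reports, a quick check is sketched below (my own helper, not part of the package; it assumes the Ouster ROS driver's per-point 'ring' field on /ouster/points):

#!/usr/bin/env python3
# Count the distinct ring ids in one LiDAR message.
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2

def callback(msg):
    rings = {p[0] for p in pc2.read_points(msg, field_names=("ring",), skip_nans=True)}
    print("distinct ring ids:", len(rings), "highest id:", max(rings))
    rospy.signal_shutdown("done")

rospy.init_node("ring_check")
rospy.Subscriber("/ouster/points", PointCloud2, callback, queue_size=1)
rospy.spin()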
- Play the bag:
roscore
cd /aigo/
rosbag play -l --clock change.bag
- Run the calibration:
cd ~/wuke/a2b_ws
source devel/setup.bash
roslaunch lidar_camera_calibration find_transform.launch
4. Calibration results
--------------------------------------------------------------------
After 40 iterations
--------------------------------------------------------------------
Average translation is:
0.00659326
-0.241034
-0.113246
Average rotation is:
0.901867 0.00575031 0.431975
-0.0142371 0.999764 0.0164154
-0.431778 -0.0209546 0.901736
Average transformation is:
0.901867 0.00575031 0.431975 0.00659326
-0.0142371 0.999764 0.0164154 -0.241034
-0.431778 -0.0209546 0.901736 -0.113246
0 0 0 1
Final rotation is:
0.0174841 -0.999826 -0.00654644
0.0208512 0.00691063 -0.999759
0.99963 0.0173434 0.0209684
Final ypr is:
0.873007
-1.54358
0.69106
Average RMSE is: 0.0291125
RMSE on average transformation is: 0.0298826
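As a sanity check on the printout (my own sketch with numpy/scipy, not something the package provides): the "Final rotation" is orthonormal with determinant +1, and its intrinsic z-y-x (yaw-pitch-roll) Euler angles reproduce the printed "Final ypr" values:

import numpy as np
from scipy.spatial.transform import Rotation

R_final = np.array([[0.0174841, -0.999826,  -0.00654644],
                    [0.0208512,  0.00691063, -0.999759 ],
                    [0.99963,    0.0173434,   0.0209684]])

print(np.allclose(R_final @ R_final.T, np.eye(3), atol=1e-3))  # True (orthonormal)
print(np.linalg.det(R_final))                                  # ~ +1
print(Rotation.from_matrix(R_final).as_euler('ZYX'))           # ~ [0.873, -1.544, 0.691]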
4.1 Projecting the LiDAR point cloud into the camera frame
The pose transform above maps points from the LiDAR frame to the camera frame.
I want to transform the LiDAR cloud into the camera frame and compare it against the depth image to see how well they overlap (a small script that applies the calibrated matrix to the cloud directly is sketched at the end of this subsection):
- Convert the D435 depth image into a point cloud (the node below subscribes to the depth image and camera_info and publishes /d435i_point_cloud with frame_id world):
#include <ros/ros.h>
#include <cv_bridge/cv_bridge.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/CameraInfo.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_types.h>
#include <pcl/PCLPointCloud2.h>
#include <pcl/conversions.h>
#include <iostream>

using namespace std;

ros::Publisher pub_point_cloud2;

bool is_K_empty = true;
double K[9];
//     [fx  0 cx]
// K = [ 0 fy cy]
//     [ 0  0  1]

void img_callback(const sensor_msgs::ImageConstPtr &img_msg)
{
    // Step 1: read the depth image
    // ROS_INFO("image format: %s %dx%d", img_msg->encoding.c_str(), img_msg->height, img_msg->width);
    int height = img_msg->height;
    int width = img_msg->width;
    // Reinterpret the raw buffer as 16UC1 data; the unit is millimetres
    unsigned short *depth_data = (unsigned short *)&img_msg->data[0];

    // Step 2: depth image -> point cloud
    sensor_msgs::PointCloud2 point_cloud2;
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
    for (int uy = 0; uy < height; uy++)
    {
        for (int ux = 0; ux < width; ux++)
        {
            float x, y, z;
            z = *(depth_data + uy * width + ux) / 1000.0;
            if (z != 0)
            {
                x = z * (ux - K[2]) / K[0];
                y = z * (uy - K[5]) / K[4];
                pcl::PointXYZ p(x, y, z);
                cloud->push_back(p);
            }
        }
    }

    // Step 3: publish the point cloud
    pcl::toROSMsg(*cloud, point_cloud2);
    point_cloud2.header.frame_id = "world";
    pub_point_cloud2.publish(point_cloud2);
}

void camera_info_callback(const sensor_msgs::CameraInfoConstPtr &camera_info_msg)
{
    // Read the camera intrinsics once
    if (is_K_empty)
    {
        for (int i = 0; i < 9; i++)
        {
            K[i] = camera_info_msg->K[i];
        }
        is_K_empty = false;
    }
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "ros_tutorial_node");
    ros::NodeHandle n;
    // Subscribe to the D435 depth image; the callback turns it into a point cloud and publishes it
    ros::Subscriber sub_img = n.subscribe("/camera/depth/image_rect_raw", 100, img_callback);
    // Subscribe to the depth camera's intrinsics
    ros::Subscriber sub_camera_info = n.subscribe("/camera/depth/camera_info", 1, camera_info_callback);
    pub_point_cloud2 = n.advertise<sensor_msgs::PointCloud2>("/d435i_point_cloud", 1000);

    ROS_INFO("Running ...");
    ros::spin();
    return 0;
}
- Publish the static tf between the two frames (the arguments are x y z yaw pitch roll parent child, so this simply hand-rotates the os_sensor frame into the camera's optical axes):
rosrun tf2_ros static_transform_publisher 0 0 0 1.57 -1.57 0 world os_sensor
- Result:
- Raw:
- After (red is the LiDAR point cloud, white is the depth camera's projection):
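The static tf above only hand-aligns the axes; as a complementary check, below is a minimal sketch (my own, not part of the package) that applies the calibrated matrix directly to the LiDAR cloud and republishes it in the same world frame as the depth cloud. The output topic name /ouster/points_in_camera is made up, and the matrix is the "Average transformation" printed above, taken as the LiDAR-to-camera transform as stated at the start of this subsection.

#!/usr/bin/env python3
# Transform /ouster/points with the calibrated LiDAR->camera matrix and
# republish the cloud so it can be overlaid with /d435i_point_cloud.
import numpy as np
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2
from std_msgs.msg import Header

T = np.array([[ 0.901867,   0.00575031,  0.431975,   0.00659326],
              [-0.0142371,  0.999764,    0.0164154, -0.241034  ],
              [-0.431778,  -0.0209546,   0.901736,  -0.113246  ],
              [ 0.0,        0.0,         0.0,        1.0       ]])

def callback(msg):
    pts = np.array(list(pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True)))
    if pts.size == 0:
        return
    # p_cam = R * p_lidar + t
    pts_cam = pts @ T[:3, :3].T + T[:3, 3]
    header = Header(stamp=msg.header.stamp, frame_id="world")  # same frame as the depth cloud
    pub.publish(pc2.create_cloud_xyz32(header, pts_cam))

rospy.init_node("lidar_to_camera_frame")
pub = rospy.Publisher("/ouster/points_in_camera", PointCloud2, queue_size=1)
rospy.Subscriber("/ouster/points", PointCloud2, callback, queue_size=1)
rospy.spin()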
Reference
[1] https://github.com/ankitdhall/lidar_camera_calibration/wiki/Welcome-to-%60lidar_camera_calibration%60-Wiki!
[2] 3D LiDAR and camera calibration, a detailed tutorial (lidar_camera_calibration), 敢敢のwings, CSDN blog
[3] https://blog.csdn.net/m0_73694897/article/details/129175515?spm=1001.2014.3001.5506
[4] https://blog.csdn.net/weixin_34910922/article/details/125248118?spm=1001.2014.3001.5506
[5] https://blog.csdn.net/weixin_42840360/article/details/119844648
[6] https://blog.csdn.net/Jeff_zjf/article/details/124255916