<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>ROS | JunyiGu(Claude)</title>
    <link>https://junyigu-claude.com/tag/ros/</link>
      <atom:link href="https://junyigu-claude.com/tag/ros/index.xml" rel="self" type="application/rss+xml" />
    <description>ROS</description>
    <generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><lastBuildDate>Wed, 20 Apr 2022 23:52:29 +0300</lastBuildDate>
    <image>
      <url>https://junyigu-claude.com/media/icon_hufa18c17c53de01bd811dfd6e84ad51f6_446369_512x512_fill_lanczos_center_3.png</url>
      <title>ROS</title>
      <link>https://junyigu-claude.com/tag/ros/</link>
    </image>
    
    <item>
      <title>Using Autoware to Calibrate LiDAR and Camera for ISEAUTO shuttle</title>
      <link>https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/</link>
      <pubDate>Wed, 20 Apr 2022 23:52:29 +0300</pubDate>
      <guid>https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/</guid>
      <description>








  





&lt;video controls  &gt;
  &lt;source src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/cam_lidar_project.mp4&#34; type=&#34;video/mp4&#34;&gt;
&lt;/video&gt;
&lt;div style=&#39;text-align: justify&#39;&gt;
&lt;p&gt;Calibrating the sensors is critical work before putting them to any
practical use. Cameras and LiDARs are the two most popular sensors in modern
autonomous vehicles, and which of them is the future of autonomous driving
has been debated for a long time. Tesla is the most iconic example of a company
that openly declares full reliance on the visual system for its
cars. However, most other autonomous vehicles still install
camera and LiDAR sensors together and use both of them as
primary perception sensors. Whenever the two sensors have to collaborate,
knowing the extrinsic relation between the camera and the LiDAR is therefore essential.
In short, we need to know the relative transformation
(translation + rotation) between the camera and the LiDAR. Please note that the
&amp;lsquo;transformation&amp;rsquo; in this scenario is more than the distance between the physical shells
of the two sensors: it must allow matching each LiDAR point to the corresponding
image pixel.
&lt;/p&gt;
&lt;p&gt;For the iseAuto shuttle, the primary LiDAR is a Velodyne VLP-32C and the primary camera is
a FLIR Grasshopper3; the detailed specifications can be found in this &lt;a href=&#34;https://junyigu-claude.com/uploads/object-segmentation-for-autonomous-driving-using-iseauto-data.pdf&#34; target=&#34;_blank&#34;&gt;paper&lt;/a&gt;. 
&lt;/p&gt;
&lt;p&gt;We use Autoware for the camera-LiDAR calibration because it
provides a user-friendly interface built on common ROS visualization
tools: Rviz is used to select the LiDAR points and the Image Viewer is used to select
the corresponding image pixels. This post assumes the reader
has basic knowledge of ROS.&lt;/p&gt;
&lt;p&gt;Detailed instructions for installing and compiling Autoware.ai can be found
&lt;a href=&#34;https://github.com/autowarefoundation/autoware_ai_documentation/wiki/Installation&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;here&lt;/a&gt;.
In this post, all the materials recorded by the iseAuto shuttle that are needed to replicate
the camera-LiDAR extrinsic calibration are available for download.&lt;/p&gt;
&lt;h3 id=&#34;1-recording-the-bag-file-for-camera-lidar-extrinsic-calibration&#34;&gt;1. Recording the bag file for camera-LiDAR extrinsic calibration&lt;/h3&gt;
&lt;p&gt;The first step is to record camera and LiDAR data with a checkerboard.
The checkerboard should be placed at different locations that cover the whole camera view,
from left to right, bottom to top, and near to far. The exact locations
depend on the size of the checkerboard and the resolution of the sensors.&lt;/p&gt;
&lt;p&gt;&lt;a href=&#34;https://www.roboticlab.eu/claude/bags/flir_velodyne_calib.bag&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Here&lt;/a&gt; is the
checkerboard bag file for the iseAuto front camera and LiDAR. The details of the
bag file are as follows:
















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig1_hu8bb5825a2196bb62e119e7faeb67a71c_135805_f592d61a6921b2bf2c224ded25c6f44f.webp 400w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig1_hu8bb5825a2196bb62e119e7faeb67a71c_135805_186eaee2b6f5ff73e176ccd9a39352d6.webp 760w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig1_hu8bb5825a2196bb62e119e7faeb67a71c_135805_1200x1200_fit_q75_h2_lanczos_3.webp 1200w&#34;
               src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig1_hu8bb5825a2196bb62e119e7faeb67a71c_135805_f592d61a6921b2bf2c224ded25c6f44f.webp&#34;
               width=&#34;760&#34;
               height=&#34;214&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;&lt;/p&gt;
&lt;p&gt;Please note that, to reduce the size of the bag file, the 4K-resolution images
were stored as the message type &amp;lsquo;sensor_msgs/CompressedImage&amp;rsquo;, which can be
visualized in Rviz but is not accepted by the Autoware calibration toolkit.
Therefore, the image stream has to be decompressed on the fly.&lt;/p&gt;
&lt;h3 id=&#34;2-intrinsic-calibration-of-the-camera&#34;&gt;2. Intrinsic calibration of the camera&lt;/h3&gt;
&lt;p&gt;Another important issue is that the Autoware camera-LiDAR calibration toolkit requires
the camera&amp;rsquo;s intrinsic information as input, and all the intrinsic parameters
have to be listed following the Autoware standard. Some camera manufacturers pre-calibrate
their cameras at the factory and provide default intrinsic matrices.
In our case, we carried out the intrinsic calibration of our FLIR Grasshopper3 camera before
the camera-LiDAR extrinsic calibration. There are many ways to do intrinsic
calibration for a camera; we chose the
ROS &lt;a href=&#34;https://wiki.ros.org/camera_calibration&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;camera_calibration&lt;/a&gt; package for its
simplicity and user-friendly interface.&lt;/p&gt;
&lt;p&gt;It is recommended to detach the camera from the vehicle and do the intrinsic calibration
in a controlled environment (lab or office). That way a live video stream is
available for calibration, so the checkerboard can be repositioned as needed until
enough data is collected, which promises a more precise result than a pre-recorded bag file.
The process of configuring the FLIR camera ROS driver will be covered in another post.
Below is a screen recording of how we calibrated our FLIR Grasshopper3 camera with the
ROS &lt;a href=&#34;https://wiki.ros.org/camera_calibration&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;camera_calibration&lt;/a&gt; package.&lt;/p&gt;









  





&lt;video controls  &gt;
  &lt;source src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/flir_intri_calib.mp4&#34; type=&#34;video/mp4&#34;&gt;
&lt;/video&gt;
&lt;p&gt;Please read carefully &lt;a href=&#34;https://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;this&lt;/a&gt;
page with the detailed tutorial on calibrating a mono camera.
For the command that starts the calibration node in that tutorial, three points
require extra attention.
First, the &amp;lsquo;&amp;ndash;size&amp;rsquo; flag specifies the number of interior vertex points,
not the number of squares. Second, the &amp;lsquo;&amp;ndash;square&amp;rsquo; flag expects the side length,
not the area, of a square. Third, use the &amp;lsquo;&amp;ndash;no-service-check&amp;rsquo; flag if
there is no corresponding service for the camera.&lt;/p&gt;
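&lt;p&gt;As an illustration of the first two points (the helper function and the board
dimensions below are our own, not part of the tutorial): a board of 9x7 squares with
10 cm sides has 8x6 interior corners, so the flags would be
&amp;lsquo;&amp;ndash;size 8x6 &amp;ndash;square 0.1&amp;rsquo;.&lt;/p&gt;

```python
def checkerboard_flags(squares_cols, squares_rows, side_m):
    """Build the --size/--square flags for the camera_calibration node.

    --size counts interior corners, one fewer than squares per axis;
    --square is the side length of one square in metres, not its area.
    """
    return f"--size {squares_cols - 1}x{squares_rows - 1} --square {side_m}"

# A board of 9x7 squares with 10 cm sides:
print(checkerboard_flags(9, 7, 0.1))  # --size 8x6 --square 0.1
```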
&lt;p&gt;&lt;a href=&#34;https://www.roboticlab.eu/claude/bags/flir_intri_calib.bag&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Here&lt;/a&gt; is the bag file
for the intrinsic calibration of our camera.
Note that this bag file is almost 50 GB because it contains raw 4K images.&lt;/p&gt;
















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig2_hua063a54ca5ac94450620921ef7f8435d_40955_5e0bac655ea22a0991bdf9d0ea25fa04.webp 400w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig2_hua063a54ca5ac94450620921ef7f8435d_40955_c6c1ef4cc252e7426e6f55d30c775770.webp 760w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig2_hua063a54ca5ac94450620921ef7f8435d_40955_1200x1200_fit_q75_h2_lanczos_3.webp 1200w&#34;
               src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig2_hua063a54ca5ac94450620921ef7f8435d_40955_5e0bac655ea22a0991bdf9d0ea25fa04.webp&#34;
               width=&#34;760&#34;
               height=&#34;196&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;The ROS camera_calibration package archives all captured frames together with the
computed intrinsic metrics. The final results are saved in text and yaml formats that
follow the ROS standard; the format has to be converted when these metrics are used
in other applications. &lt;a href=&#34;https://www.roboticlab.eu/claude/sensor_calib/flir_intri_calib_result.tar.gz&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Here&lt;/a&gt;
is the archived file of our camera&amp;rsquo;s intrinsic calibration, which was also used
in the extrinsic calibration with the LiDAR sensor. The figure below shows the intrinsic
calibration result of our FLIR Grasshopper3 camera.&lt;/p&gt;
















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig3_hu9883a222b243742f371ca6cff69bb2ca_75078_e19bc740ecbee436d83fd8c4e562aa32.webp 400w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig3_hu9883a222b243742f371ca6cff69bb2ca_75078_e6c78f4e4f79795790c244dd4ceb28bb.webp 760w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig3_hu9883a222b243742f371ca6cff69bb2ca_75078_1200x1200_fit_q75_h2_lanczos_3.webp 1200w&#34;
               src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig3_hu9883a222b243742f371ca6cff69bb2ca_75078_e19bc740ecbee436d83fd8c4e562aa32.webp&#34;
               width=&#34;690&#34;
               height=&#34;494&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
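&lt;p&gt;The ROS yaml stores each matrix as a flat row-major list under a &amp;lsquo;data&amp;rsquo; key.
A minimal sketch of converting that layout for use in other applications (the function
and the example numbers are illustrative, not our camera&amp;rsquo;s actual values):&lt;/p&gt;

```python
def reshape_row_major(data, rows, cols):
    """Turn the flat row-major 'data' list from a ROS calibration yaml
    into a nested list of lists."""
    assert len(data) == rows * cols
    return [data[r * cols:(r + 1) * cols] for r in range(rows)]

# An illustrative 3x3 camera matrix [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]:
K = reshape_row_major([1000.0, 0.0, 640.0,
                       0.0, 1000.0, 360.0,
                       0.0, 0.0, 1.0], 3, 3)
```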
&lt;h3 id=&#34;3-extrinsic-calibration-of-the-camera-and-lidar&#34;&gt;3. Extrinsic calibration of the camera and LiDAR&lt;/h3&gt;
&lt;p&gt;First, start the &lt;em&gt;roscore&lt;/em&gt; in one terminal.
Although the Autoware camera-LiDAR calibration node will also start a &lt;em&gt;roscore&lt;/em&gt;,
it is recommended to start it standalone to make sure the bag-file playback and
image-decompression commands can be executed.&lt;/p&gt;
&lt;p&gt;Open a second terminal and play the bag file provided in Section 1.&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;rosbag play flir_velodyne_calib.bag -r 0.5 -l --pause
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Here the playback speed is set to 0.5 to make it easier to pick an image frame.
The &amp;lsquo;&amp;ndash;pause&amp;rsquo; flag is compulsory because the image stream has to be paused to select
a pixel and find the corresponding point.&lt;/p&gt;
&lt;p&gt;The next step is to decompress the images on the fly. Open a new terminal and use the command:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;rosrun image_transport republish compressed in:=/front_camera/image_color_rect 
raw out:=/front_camera/image_color_rect
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Please note that the input topic name in the command must be set to
&amp;lsquo;/front_camera/image_color_rect&amp;rsquo; instead of the actual topic name in the bag file
(&amp;rsquo;/front_camera/image_color_rect/compressed&amp;rsquo;). The &amp;lsquo;raw&amp;rsquo; argument is also compulsory
before the output topic name, otherwise the image stream might lag.
It is recommended to visualize and check the decompressed images in Rviz before
starting the Autoware camera-LiDAR calibration node.&lt;/p&gt;
&lt;p&gt;The fourth terminal is for the Rviz visualizer. It is recommended to
pre-configure the window size and other parameters such as the topic name, global
fixed frame, view angle, etc. These settings can be saved as an Rviz
configuration file and reloaded later.
&lt;a href=&#34;https://www.roboticlab.eu/claude/sensor_calib/cam_lidar_cali.rviz&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Here&lt;/a&gt;
is the Rviz configuration file for our bag file. Use the command:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;rviz -d $PATH_TO_FILE/cam_lidar_cali.rviz
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;The last terminal is for the Autoware camera-LiDAR calibration node.
Please note that Autoware.ai must be sourced in this shell, and make sure
that the camera-LiDAR calibration package is available, as
shown in the figure below.&lt;/p&gt;
















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig4_hu8c18c1acb299f1fbb3fbb5ed53c4d6cc_28755_199f1517c6fdd8a83eb873d148f36ee6.webp 400w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig4_hu8c18c1acb299f1fbb3fbb5ed53c4d6cc_28755_5ba7846191fc5d2d4799a2b451b23279.webp 760w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig4_hu8c18c1acb299f1fbb3fbb5ed53c4d6cc_28755_1200x1200_fit_q75_h2_lanczos_3.webp 1200w&#34;
               src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig4_hu8c18c1acb299f1fbb3fbb5ed53c4d6cc_28755_199f1517c6fdd8a83eb873d148f36ee6.webp&#34;
               width=&#34;760&#34;
               height=&#34;60&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;The camera&amp;rsquo;s intrinsic-information yaml file and the
topic name of the image stream have to be specified. The camera&amp;rsquo;s intrinsic information has to be in
Autoware format, which can be downloaded
&lt;a href=&#34;https://www.roboticlab.eu/claude/sensor_calib/front_flir_camera_intrinsic_autoware.yaml&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;here&lt;/a&gt;.
The command is:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;roslaunch autoware_camera_lidar_calibrator camera_lidar_calibration.launch intrinsics_file:=/home/claude/Dev/iseauto_sensor_ws/src/iseauto/config/fromt_camera_lidar_calib/front_flir_camera_intrinsic_autoware.yaml image_src:=/front_camera/image_color_rect
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Now the point cloud can be visualized in Rviz, and the image stream in the newly opened
image viewer. The resolution of our images is 4240x2824. It is not recommended to
scale down the image viewer during calibration, so a
high-resolution monitor might be needed; the alternative is to stack two
monitors vertically in the computer&amp;rsquo;s display settings so that the checkerboard
is always visible.&lt;/p&gt;
&lt;p&gt;The calibration procedure is to first select a pixel of the checkerboard in the
image viewer, then use the &amp;lsquo;Publish Point&amp;rsquo; function in Rviz to select the corresponding
LiDAR point. The coordinates of the selected pixel and point are shown in
the terminal of the Autoware calibrator node. Repeat this procedure until there are
9 pixel-point pairs; the calibrator will then calculate the extrinsic matrices
automatically, and the result is shown in the terminal and saved as a yaml
file in the user&amp;rsquo;s home folder. It is recommended to pick the pixel-point pairs
with the checkerboard at different locations, to cover the views from left
to right and from near to far. The video below demonstrates the calibration
procedure. Only three pixel-point pairs are made in the video to keep it
short.&lt;/p&gt;









  





&lt;video controls  &gt;
  &lt;source src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/cam_lidar_calib.mp4&#34; type=&#34;video/mp4&#34;&gt;
&lt;/video&gt;
&lt;p&gt;In practice, the whole calibration procedure might have to be repeated
several times to get a more precise result.
The camera-LiDAR extrinsic matrices can be verified by projecting
LiDAR points onto the images.&lt;/p&gt;
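&lt;p&gt;A minimal sketch of that check, assuming a plain pinhole camera model with an
intrinsic matrix K and extrinsics (R, t); the function and the numbers below are
illustrative, not our actual calibration result:&lt;/p&gt;

```python
def project_lidar_point(p, R, t, K):
    """Project one LiDAR point into the image: x_c = R p + t, then pinhole."""
    # Rigid transform of the point into the camera frame.
    xc = [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
    if xc[2] <= 0:
        return None  # behind the camera, not visible
    # Pinhole projection: u = fx * x / z + cx, v = fy * y / z + cy.
    u = K[0][0] * xc[0] / xc[2] + K[0][2]
    v = K[1][1] * xc[1] / xc[2] + K[1][2]
    return u, v

# Illustrative intrinsics and identity extrinsics:
K = [[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 0.0]
print(project_lidar_point([1.0, 0.0, 5.0], R, t, K))  # (840.0, 360.0)
```

If the extrinsics are correct, points projected this way should land on the image
of the object they were measured from (e.g. the checkerboard).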
&lt;p&gt;&lt;a href=&#34;https://www.roboticlab.eu/claude/sensor_calib/20210920_225652_autoware_lidar_camera_calibration.yaml&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Here&lt;/a&gt;
are the camera-LiDAR extrinsic matrices that were saved by the Autoware calibrator.
These values were used in our &lt;a href=&#34;https://junyigu-claude.com/uploads/object-segmentation-for-autonomous-driving-using-iseauto-data.pdf&#34; target=&#34;_blank&#34;&gt;work&lt;/a&gt; for projecting LiDAR points onto the camera plane.&lt;/p&gt;
















&lt;figure  &gt;
  &lt;div class=&#34;d-flex justify-content-center&#34;&gt;
    &lt;div class=&#34;w-100&#34; &gt;&lt;img alt=&#34;&#34; srcset=&#34;
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig5_hu58a43cf58f637a424c1f11581e4f92dd_105187_c3bd8fa487d653f648aa00d04243d96f.webp 400w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig5_hu58a43cf58f637a424c1f11581e4f92dd_105187_ad521371a26b811d0486bf313bb7e756.webp 760w,
               /post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig5_hu58a43cf58f637a424c1f11581e4f92dd_105187_1200x1200_fit_q75_h2_lanczos_3.webp 1200w&#34;
               src=&#34;https://junyigu-claude.com/post/using-autoware-to-calibrate-lidar-and-camera-for-iseauto/fig5_hu58a43cf58f637a424c1f11581e4f92dd_105187_c3bd8fa487d653f648aa00d04243d96f.webp&#34;
               width=&#34;760&#34;
               height=&#34;501&#34;
               loading=&#34;lazy&#34; data-zoomable /&gt;&lt;/div&gt;
  &lt;/div&gt;&lt;/figure&gt;
&lt;/div&gt;
</description>
    </item>
    
  </channel>
</rss>
