Note: This tutorial assumes that you have completed the previous tutorial: RGB-D Handheld Mapping With a Kinect.
(!) Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.

Stereo Handheld Mapping

Description: This tutorial shows how to use rtabmap_ros out-of-the-box with a stereo camera in mapping mode or localization mode.

Tutorial Level: BEGINNER

Next Tutorial: Remote Mapping

Note

For the Zed stereo camera, see the RGB-D Mapping tutorial instead, as the node already publishes a depth image. However, if you want to use the left and right images, skip to Section 2.2 below (the images should already be rectified by zed_ros_wrapper).

Bring-up the camera

Here are the main steps to configure your camera to be used with RTAB-Map.

  1. I assume that your stereo camera node publishes the raw left/right image and camera_info messages, and that the header.frame_id of these messages is /camera_link.

  2. Your camera should be calibrated; if not, you can do it by following this tutorial.

  3. Rectify the images using stereo_image_proc. This will generate the rectified left/right image topics (a sketch of the command is shown after these steps).

  4. The image messages are in the /camera_link optical frame (x-axis right, y-axis down and z-axis forward), but RTAB-Map needs them in the /world convention (x-axis forward, y-axis left, z-axis up). To do that, we use a static_transform_publisher to add a transform between /base_link and /camera_link:

         <arg name="pi/2" value="1.5707963267948966" />
         <arg name="optical_rotate" value="0 0 0 -$(arg pi/2) 0 -$(arg pi/2)" />
         <node pkg="tf" type="static_transform_publisher" name="camera_base_link"
              args="$(arg optical_rotate) base_link camera_link 100" />
  5. We are now ready to start RTAB-Map.
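
A minimal sketch of how steps 3 and 4 can be run and verified from a terminal, assuming the camera publishes under the /stereo_camera namespace (adjust the namespace and frame names to match your driver):

  # Run stereo rectification in the camera namespace:
  $ ROS_NAMESPACE=stereo_camera rosrun stereo_image_proc stereo_image_proc

  # Verify that the rectified topics are published:
  $ rostopic hz /stereo_camera/left/image_rect

  # Verify that the static transform between base_link and camera_link is published:
  $ rosrun tf tf_echo base_link camera_link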

Bring-up example for a Bumblebee2 camera

The camera1394stereo node is used. Note that this node uses /stereo_camera as the camera frame and puts the messages in the stereo_camera namespace. The camera was calibrated using this tutorial.

<launch>
  <node pkg="camera1394stereo" type="camera1394stereo_node" name="camera1394stereo_node">
    <param name="video_mode" value="format7_mode3" />
    <param name="format7_color_coding" value="raw16" />
    <param name="bayer_pattern" value="bggr" />
    <param name="bayer_method" value="" />
    <param name="stereo_method" value="Interlaced" />
    <param name="camera_info_url_left" value="" />
    <param name="camera_info_url_right" value="" />
  </node>
    
  <!-- Rotate the camera frame. -->
  <arg name="pi/2" value="1.5707963267948966" />
  <arg name="optical_rotate" value="0 0 0 -$(arg pi/2) 0 -$(arg pi/2)" />
  <node pkg="tf" type="static_transform_publisher" name="camera_base_link"
        args="$(arg optical_rotate) base_link stereo_camera 100" />  
    
  <!-- Use directly the "stereo_camera" namespace instead of remapping each message. -->
  <group ns="/stereo_camera" >   
    <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc"/>
  </group>
</launch>
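
To quickly verify that rectification and disparity look reasonable with the launch file above, you can view the stereo pair with image_view (this assumes the /stereo_camera namespace used in the launch file):

  $ rosrun image_view stereo_view stereo:=/stereo_camera image:=image_rect_color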

Bring-up example with Zed/Zed-Mini/Zed2 cameras

  • Make sure you have this pull request to avoid synchronization issues (Updated April 29 2020)

  • Launch:
    • $ roslaunch zed_wrapper zed_no_tf.launch
      # To increase the publishing rate from the default 15 Hz:
      $ rosrun dynamic_reconfigure dynparam set zed_node pub_frame_rate 30
      $ rosrun dynamic_reconfigure dynparam set zed_node pub_frame_rate 30
      # The command is called twice because the first call doesn't always seem to take effect.
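
To verify that the new publishing rate is applied, you can check the topic frequency (the topic name below assumes the zed_node topics used later in this tutorial; adjust it to your setup):

  $ rostopic hz /zed_node/left/image_rect_color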

Bring-up example with MYNT EYE S camera

Issues:

  • There is an issue with the right camera info P(0,3) value, which is in millimeters instead of meters. Just after this line, add:

    • camera_info->P.at(3)/=1000.0;
  • There is an issue with the IMU frame (if you want to use the IMU): the quaternion here should be 0 0 -0.7071068 0.7071068 instead of 0 0 0 1.

Launch:

  • $ roslaunch mynt_eye_ros_wrapper mynteye.launch
    $ rosrun tf static_transform_publisher 0 0 0 -1.5707963267948966 0 -1.5707963267948966  base_link mynteye_link 100 
    
    // verify baseline is correct:
    $ rostopic echo /mynteye/right_rect/camera_info/P 
    [367.96412028336476, 0.0, 363.54740142822266, -44.33232450213832, 0.0, 367.96412028336476, 215.1128387451172, 0.0, 0.0, 0.0, 1.0, 0.0]
    // The fourth value should be around -44, not -44332!
    
    // To use IMU:
    $ rosrun imu_filter_madgwick imu_filter_node \
        _use_mag:=false \
        _publish_tf:=false \
        _world_frame:="enu" \
        /imu/data_raw:=/mynteye/imu/data_raw \
        /imu/data:=/rtabmap/imu
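
To confirm that the filtered IMU messages are republished on the topic expected by rtabmap.launch (as remapped above):

    $ rostopic hz /rtabmap/imu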

Bring-up example with RealSense cameras

With D400 cameras, it is recommended to disable the IR emitter after starting them below, otherwise visual odometry will get confused by the IR dots from the projector, which do not move relative to the camera:

rosrun rqt_reconfigure rqt_reconfigure
# select Camera->stereo_module, set emitter_enabled to OFF
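
Alternatively, the emitter can be disabled from a terminal with dynamic_reconfigure; this is a sketch assuming the default /camera namespace of rs_camera.launch (a value of 0 corresponds to OFF):

  $ rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0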
  • D415/D435:
    • $ roslaunch realsense2_camera rs_camera.launch enable_infra1:=true enable_infra2:=true
  • D435i:
    • # ISSUE: Use unite_imu_method:="copy" if imu topics keep stopping
      $ roslaunch realsense2_camera rs_camera.launch enable_infra1:=true enable_infra2:=true unite_imu_method:=linear_interpolation enable_gyro:=true enable_accel:=true enable_sync:=true
      
      $ rosrun imu_filter_madgwick imu_filter_node \
          _use_mag:=false \
          _publish_tf:=false \
          _world_frame:="enu" \
          /imu/data_raw:=/camera/imu \
          /imu/data:=/rtabmap/imu
  • T265, see this post.

Bring-up example with DepthAI OAK-D

  • Stereo IR Mode:
    • $ roslaunch depthai_examples stereo_inertial_node.launch depth_aligned:=false
      
      # Estimate quaternion of the IMU data
      $ rosrun imu_filter_madgwick imu_filter_node \
         imu/data_raw:=/stereo_inertial_publisher/imu \
         imu/data:=/stereo_inertial_publisher/imu/data  \
         _use_mag:=false \
         _publish_tf:=false

Mapping mode

There are two choices for online visualization: rtabmapviz or rviz. With the RVIZ plugin rtabmap/MapCloud, the clouds can be incrementally added to RVIZ. With rtabmapviz, the interface should look like the one shown in the tutorials of the standalone version. Following the launch file of the Bumblebee bring-up example above:

  • BumbleBee2:

     $ roslaunch rtabmap_ros stereo_mapping.launch \
       stereo_namespace:="/stereo_camera" \
       rtabmap_args:="--delete_db_on_start" \
       rviz:=false \
       rtabmapviz:=true
  • Zed camera (increase disparity range):

     $ roslaunch rtabmap_ros rtabmap.launch \
       rtabmap_args:="--delete_db_on_start --Vis/CorFlowMaxLevel 5 --Stereo/MaxDisparity 200" \
       stereo_namespace:=/zed_node \
       right_image_topic:=/zed_node/right/image_rect_color \
       stereo:=true \
       frame_id:=base_link \
       imu_topic:=/zed_node/imu/data \
       wait_imu_to_init:=true
       # Remove wait_imu_to_init if you don't have a Zed with an IMU.
  • MYNT EYE S:

    roslaunch rtabmap_ros rtabmap.launch \
       stereo:=true \
       left_image_topic:=/mynteye/left_rect/image_rect \
       right_image_topic:=/mynteye/right_rect/image_rect \
       left_camera_info_topic:=/mynteye/left_rect/camera_info \
       right_camera_info_topic:=/mynteye/right_rect/camera_info \
       frame_id:=base_link \
       rtabmap_args:="-d" \
       wait_imu_to_init:=true \
       imu_topic:=/rtabmap/imu
    // Remove wait_imu_to_init and imu_topic if IMU is not used.
  • RealSense D415/D435 cameras:

     $ roslaunch rtabmap_ros rtabmap.launch \
       rtabmap_args:="--delete_db_on_start" \
       left_image_topic:=/camera/infra1/image_rect_raw \
       right_image_topic:=/camera/infra2/image_rect_raw \
       left_camera_info_topic:=/camera/infra1/camera_info \
       right_camera_info_topic:=/camera/infra2/camera_info \
       stereo:=true
  • RealSense D435i camera (stereo IR):

     $ roslaunch rtabmap_ros rtabmap.launch \
       rtabmap_args:="--delete_db_on_start" \
       left_image_topic:=/camera/infra1/image_rect_raw \
       right_image_topic:=/camera/infra2/image_rect_raw \
       left_camera_info_topic:=/camera/infra1/camera_info \
       right_camera_info_topic:=/camera/infra2/camera_info \
       stereo:=true \
       wait_imu_to_init:=true \
       imu_topic:=/rtabmap/imu
  • RealSense D435i camera (IR-Depth):

     $ roslaunch rtabmap_ros rtabmap.launch \
       rtabmap_args:="--delete_db_on_start" \
       rgb_topic:=/camera/infra1/image_rect_raw \
       depth_topic:=/camera/depth/image_rect_raw \
       camera_info_topic:=/camera/infra1/camera_info \
       wait_imu_to_init:=true \
       imu_topic:=/rtabmap/imu
  • DepthAI OAK-D camera (Stereo IR):

     $ roslaunch rtabmap_ros rtabmap.launch \
        args:="--delete_db_on_start" \
        stereo:=true \
        left_image_topic:=/stereo_inertial_publisher/left/image_rect \
        right_image_topic:=/stereo_inertial_publisher/right/image_rect \
        left_camera_info_topic:=/stereo_inertial_publisher/left/camera_info \
        right_camera_info_topic:=/stereo_inertial_publisher/right/camera_info \
        imu_topic:=/stereo_inertial_publisher/imu/data \
        frame_id:=oak-d_frame \
        approx_sync:=true \
        approx_sync_max_interval:=0.001 \
        wait_imu_to_init:=true

The rtabmap argument "--delete_db_on_start" is used to start mapping from a clean database. You can also reset the memory from rtabmapviz using the "Edit->Delete memory" action, or by using this service:

  •  $ rosservice call /rtabmap/reset

  • If the background of the map turns RED, it is because the odometry is lost. Visit the Lost Odometry (RED screens!) section on this page to learn why this happens! To recover the odometry, move the camera back to the last position it had before the screen turned RED and it should be able to recompute the odometry. Another way is to reset the odometry using the menu option "Edit->Reset odometry" (note that a new map is created each time the odometry is reset).
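
The same odometry reset can also be triggered from a terminal by calling the corresponding service (also used in the Localization section below):

  $ rosservice call /rtabmap/reset_odom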

Not synchronized stereo messages

If your camera messages are not synchronized (e.g., the left and right images don't have exactly the same timestamp header.stamp), you should add this argument: "approximate_sync:=true". However, stereo processing with images that are not exactly synchronized gives poor results, so it would be better to configure the camera driver node so that it sends left and right images with the same timestamp.
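
To check whether the left and right timestamps actually match, you can echo the image headers (the topic names below assume the /stereo_camera namespace used above):

  $ rostopic echo -n1 /stereo_camera/left/image_raw/header/stamp
  $ rostopic echo -n1 /stereo_camera/right/image_raw/header/stamp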

Use external odometry

If you want to use external odometry (e.g., odometry from your robot or from another stereo odometry approach like viso2_ros or fovis_ros), you can set these arguments: "visual_odometry:=false odom_topic:=/my_odometry".
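
For example, with the Bumblebee setup above and a hypothetical odometry topic /my_odometry (replace it with your robot's actual odometry topic):

  $ roslaunch rtabmap_ros stereo_mapping.launch \
    stereo_namespace:="/stereo_camera" \
    visual_odometry:=false \
    odom_topic:=/my_odometry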

Where is the map saved?

~/.ros/rtabmap.db

RTAB-Map provides a tool to browse data in the database:

$ rtabmap-databaseViewer ~/.ros/rtabmap.db
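
If you want to keep a map for later use, a simple approach is to copy the database before starting a new mapping session (plain shell; the destination name is just an example):

  $ cp ~/.ros/rtabmap.db ~/rtabmap_stereo_session1.db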

Localization mode

Switching online from Mapping to Localization

In localization mode, a large enough map (>30 locations) must already have been created (using stereo_mapping.launch above).

In rtabmapviz (GUI), click on "Localization" in the "Detection" menu. By moving the camera over locations already in the map, RTAB-Map will localize itself by finding a loop closure. Once a loop closure is found, the odometry is corrected and the current cloud will be aligned to the map.

You can play with the relocalization behaviour by resetting the odometry. You can use the Detection->"Reset odometry" action from the GUI, or call the service from a terminal:

  • $ rosservice call /rtabmap/reset_odom

Starting in Localization mode from a saved map

If you followed the steps in the Mapping mode section above, just kill the stereo_mapping.launch terminal, then relaunch it with the localization:=true argument and without the --delete_db_on_start argument. This will reuse the map saved by default in ~/.ros/rtabmap.db:

  • Launch (rtabmapviz): stereo_mapping.launch

     $ roslaunch rtabmap_ros stereo_mapping.launch \
       stereo_namespace:="/stereo_camera" \
       localization:=true
    • Zed camera (increase disparity range):

       $ rosrun tf static_transform_publisher 0 0 0 -1.5707963267948966 0 -1.5707963267948966 camera_link zed_initial_frame 100
       $ rosrun tf static_transform_publisher 0 0 0 0 0 0 zed_current_frame ZED_left_camera  100
       $ roslaunch rtabmap_ros stereo_mapping.launch \
         stereo_namespace:=/camera \
         right_image_topic:=/camera/right/image_rect_color \
         frame_id:=camera_link \
         rtabmap_args:=" --Vis/CorFlowMaxLevel 5 --Stereo/MaxDisparity 200" \
         localization:=true
  • Launch (rviz):
     $ roslaunch rtabmap_ros stereo_mapping.launch \
       stereo_namespace:="/stereo_camera" \
       rviz:=true \
       rtabmapviz:=false \
       localization:=true

In the GUI, do Edit->"Download map" to download the map from the rtabmap node. At the start, you should see that the currently displayed cloud is not aligned to the map; this is normal because the odometry is reset on each start. By moving the camera over locations already in the map, RTAB-Map will localize itself by finding a loop closure. Once a loop closure is found, the odometry is corrected and the map will appear with the current cloud aligned to it.

Coordinate frames

RTAB-Map follows the standard REP 105 for coordinate frames. At initialization, the origin of the "odom" coordinate frame is set to the location where the camera is at that moment. The TF odom->camera_link will always be relative to this origin. Once RTAB-Map has relocalized itself in the prebuilt map, the TF map->odom will change from an identity transform to the transform from the prebuilt map's origin to the new odom coordinate system.
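
You can inspect these transforms with the standard tf tools, for example:

  $ rosrun tf tf_echo map odom
  $ rosrun tf tf_echo odom camera_link
  $ rosrun tf view_frames    # generates frames.pdf with the full TF tree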
