####################################
## FILL ME IN
####################################
## for a custom note with links:
## note =
## for the canned note of "This tutorial assumes that you have completed the
## previous tutorials:" just add the links
## note.0 = [[http://wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping|RGB-D Hand-Held Mapping With a Kinect]]
## descriptive title for the tutorial
## title = Setup RTAB-Map on Your Robot!
## multi-line description to be displayed in search
## description = This tutorial shows multiple RTAB-Map configurations that can be used on your robot.
## the next tutorial description (optional)
## next =
## links to next tutorial (optional)
## next.0.link = [[http://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot|Mapping and Navigation with Turtlebot]]
## next.1.link =
## what level user is this tutorial for
## level = IntermediateCategory
## keywords =
####################################

<<IncludeCSTemplate(TutorialCSHeaderTemplate)>>
<<TableOfContents(4)>>

== Introduction ==

The robot must be equipped with at least a Kinect-like sensor. I '''highly recommend''' calibrating your Kinect-like sensor following [[openni_launch/Tutorials/IntrinsicCalibration|this guide]]. If you want to use a 2D laser, the Kinect's clouds must be aligned with the laser scans. In the next sections, I present some possible configurations depending on the robot, with example launch files.

'''Recommended robot configuration''':

 * A 2D laser which outputs <<MsgLink(sensor_msgs/LaserScan)>> messages.
 * Odometry (IMU, wheel encoders, ...) which outputs <<MsgLink(nav_msgs/Odometry)>> messages.
 * A calibrated Kinect-like sensor compatible with the [[openni_launch]], [[openni2_launch]] or [[freenect_launch]] ROS packages.

These examples are based on what I did for [[https://introlab.3it.usherbrooke.ca/mediawiki-introlab/index.php/AZIMUT|AZIMUT3]]. The robot is equipped with a Kinect and a URG-04LX, and odometry is provided by the wheel encoders.

{{attachment:azimut.png|AZIMUT3|width="400"}}

== Bring-up your robot ==

 * If your robot has odometry: the robot should publish its odometry as <<MsgLink(nav_msgs/Odometry)>> messages.
 * If you have a laser: launch the laser node; it should publish <<MsgLink(sensor_msgs/LaserScan)>> messages. Here we use the [[hokuyo_node]] node.
 * Launch OpenNI with '''depth_registration:=true''':
 {{{
$ roslaunch openni_launch openni.launch depth_registration:=true
}}}
 * TF transforms must be provided (using the robot's [[robot_state_publisher|URDF]] or manually in a launch file using [[tf#static_transform_publisher|static transform publishers]]). The main useful frames are "/odom", "/base_link", "/base_laser_link" and "/camera_link".

=== Kinect + Odometry + 2D laser ===

In this configuration, I assume that you have a wheeled robot constrained to the XY plane, with rotation only on `yaw` (around the z-axis).

{{attachment:setupA.png|Kinect + Odometry + 2D laser|width="800"}}

{{{
}}}

Let's break down this launch file:

{{{
}}}

For convenience, we put the [[rtabmap_slam#rtabmap|rtabmap]] node in the `rtabmap` namespace, because the [[rtabmap_slam#rtabmap|rtabmap]] node provides services that can conflict with services from other nodes.

{{{
}}}

In this example, because the [[rtabmap_slam#rtabmap|rtabmap]] node synchronizes topics coming from different sensors, we use the [[rtabmap_sync#rtabmap_sync.2Frgbd_sync|rgbd_sync]] nodelet to make sure that our image topics are correctly synchronized together before feeding them to [[rtabmap_slam#rtabmap|rtabmap]]. The output `rgbd_image` is a [[http://docs.ros.org/api/rtabmap_msgs/html/msg/RGBDImage.html|rtabmap_msgs/RGBDImage]] message.
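As an illustration only, this synchronization part could look like the following sketch, placed inside the `rtabmap` group mentioned above. The camera topic names are assumptions based on the default [[openni_launch]] names; adjust them to your driver:

{{{
<!-- Synchronize RGB image, registered depth image and camera_info
     into a single rgbd_image topic -->
<node pkg="nodelet" type="nodelet" name="rgbd_sync" args="standalone rtabmap_sync/rgbd_sync">
  <remap from="rgb/image"       to="/camera/rgb/image_rect_color"/>
  <remap from="depth/image"     to="/camera/depth_registered/image_raw"/>
  <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
  <param name="approx_sync" value="true"/>
</node>
}}}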
The parameter `approx_sync` should be true when the camera topics are not already synchronized by the camera node, as is the case here with the freenect or openni2 drivers for the Kinect For Xbox 360. `approx_sync` should be false with camera drivers for the Kinect v2, ZED or RealSense, as they publish RGB and depth topics already synchronized (same stamp).

{{{
}}}

The `--delete_db_on_start` argument makes [[rtabmap_slam#rtabmap|rtabmap]] delete the database (located by default in `~/.ros/rtabmap.db`) when starting. If you want the robot to continue mapping from a previous mapping session, remove `--delete_db_on_start`.

{{{
}}}

The `frame_id` should be a fixed frame on the robot.

{{{
}}}

By default, `subscribe_depth` is true. However, in this setup we use RGB-D image input instead, so `subscribe_depth` is set to false and `subscribe_rgbd` is set to true. Since we have a 2D lidar, set `subscribe_scan` to true. If we had a 3D lidar publishing [[http://docs.ros.org/api/sensor_msgs/html/msg/PointCloud2.html|sensor_msgs/PointCloud2]] messages, we would set `subscribe_scan_cloud` to true instead and remap the corresponding `scan_cloud` topic instead of `scan`.

 * When `subscribe_rgbd=true`, the `rgbd_image` input topic should be set.
 * When `subscribe_scan=true`, the `scan` input topic should be set.

{{{
}}}

Set the required input topics. Note that `rgbd_image` doesn't have a leading slash, which means it subscribes to `rgbd_image` in its namespace, which would be `/rtabmap/rgbd_image` in this case.

{{{
}}}

The queue size set here is used for the synchronization of the input topics above. The [[rtabmap_slam#rtabmap|rtabmap]] node synchronizes `/base_controller/odom`, `/base_scan` and `/rtabmap/rgbd_image` in a single callback. The higher the value, the more the publishing rates of the topics can differ from each other. On AZIMUT3, `/base_controller/odom` is published at 50 Hz, `/base_scan` at 10 Hz, and the images at 30 Hz.

{{{
}}}

This section sets RTAB-Map's parameters (the same ones as in the Preferences dialog of the standalone version). To list all of RTAB-Map's parameters along with their descriptions, execute this command:

{{{
$ rosrun rtabmap_slam rtabmap --params
}}}

Here is a brief overview of the main parameters set here:

 * `RGBD/NeighborLinkRefining`: Correct odometry using ICP on the input laser scans.
 * `RGBD/ProximityBySpace`: Find local loop closures based on the robot's position in the map. This is useful when the robot, for example, comes back in the opposite direction: with the camera facing backward, global loop closures cannot be found, so using the position and the laser scans previously added to the map, we find the transform using ICP.
 * `RGBD/AngularUpdate`: The robot should rotate at least this much for the map to be updated (if not 0).
 * `RGBD/LinearUpdate`: The robot should move at least this much for the map to be updated (if not 0).
 * `RGBD/OptimizeFromGraphEnd`: When set to false (the default), on loop closures the graph is optimized from the first pose in the map, and the TF `/map -> /odom` changes when this happens. When set to true, the graph is optimized from the latest node added to the map instead of the first: the last pose keeps its value and all the previous poses are corrected according to it (so `/odom` and `/map` always match together).
 * `Grid/FromDepth`: If true, the occupancy grid is created from the cloud generated by the depth camera. If false, the occupancy grid is created from the laser scans.
 * `Reg/Force3DoF`: Force 3DoF registration: `roll`, `pitch` and `z` won't be estimated.
 * `Reg/Strategy`: We choose ICP to refine, using the laser scans, the global loop closures found with visual features.
 * `Icp/VoxelSize`: Scans are downsampled to 5 cm voxels before doing ICP.
 * `Icp/MaxCorrespondenceDistance`: Maximum distance between points during ICP registration.
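For illustration, RTAB-Map parameters like these are passed to the node as string parameters inside its `<node>` tag, as in the sketch below. The values shown are assumptions for a setup like this one, not tuned values; any parameter listed by `--params` can be set the same way:

{{{
<!-- RTAB-Map's own parameters are always passed as strings -->
<param name="RGBD/NeighborLinkRefining" type="string" value="true"/>
<param name="RGBD/ProximityBySpace"     type="string" value="true"/>
<param name="RGBD/AngularUpdate"        type="string" value="0.01"/>
<param name="RGBD/LinearUpdate"         type="string" value="0.01"/>
<param name="RGBD/OptimizeFromGraphEnd" type="string" value="false"/>
<param name="Grid/FromDepth"            type="string" value="false"/>
<param name="Reg/Force3DoF"             type="string" value="true"/>
<param name="Reg/Strategy"              type="string" value="1"/>
<param name="Icp/VoxelSize"             type="string" value="0.05"/>
<param name="Icp/MaxCorrespondenceDistance" type="string" value="0.1"/>
}}}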
{{{
}}}

Close the group.

=== Kinect + Odometry + Fake 2D laser from Kinect ===

{{attachment:setupA2.jpg|Kinect + Odometry + Fake 2D laser from Kinect|width="800"}}

If you don't have a laser and you want to create a 2D occupancy grid map (for which laser scans are required), you can simulate a 2D laser with the depth image from the Kinect using the [[depthimage_to_laserscan]] ros-pkg. This was the configuration used for the [[https://github.com/introlab/rtabmap/wiki/IROS-2014-Kinect-Challenge|IROS 2014 Kinect Contest]]. It could also be used with a robot like the [[http://www.turtlebot.com/|TurtleBot]]. Unlike with a real lidar as in the example above, I don't recommend setting `Reg/Strategy` to 1 (for ICP), because the field of view of the camera is too small for good ICP registrations.

{{{
}}}

=== Kinect + 2D laser ===

{{attachment:setupB.png|Kinect + 2D laser|width="800"}}

If you don't have odometry, you can create one using the 2D laser scans and ICP. Once you have this new odometry node, you can follow the same steps as [[http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot#Kinect_.2B-_Odometry_.2B-_2D_laser|above]] (with odometry). To generate odometry from laser scans, take a look at the [[laser_scan_matcher]] or [[hector_slam]] packages. '''NEW''': RTAB-Map now has its own `icp_odometry` node.

Here is an example of how [[hector_slam]] can be integrated with [[rtabmap_slam#rtabmap|rtabmap]]: [[https://github.com/introlab/rtabmap_ros/blob/master/rtabmap_demos/launch/demo_hector_mapping.launch|demo_hector_mapping.launch]]. You can try it with this bag: [[https://docs.google.com/uc?id=0B46akLGdg-uadXhLeURiMTBQU28&export=download|demo_mapping.bag]] (you should remove the "/odom" TF from the bag first):

{{{
$ rosbag filter demo_mapping.bag demo_mapping_no_odom.bag 'topic != "/tf" or topic == "/tf" and m.transforms[0].header.frame_id != "/odom"'
$ roslaunch rtabmap_demos demo_hector_mapping.launch hector:=true
$ rosbag play --clock demo_mapping_no_odom.bag
}}}

For an example using `rtabmap_odom/icp_odometry`, set the `hector` argument to false above. For the best results, build rtabmap with the [[https://github.com/ethz-asl/libpointmatcher|libpointmatcher]] dependency.

=== 2D laser only ===

{{attachment:setupLaserOnly.png|2D laser|width="800"}}

This is the same example as [[http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot#Kinect_.2B-_2D_laser|above]] with Kinect+Laser, but without a camera. If you have only a lidar without wheel odometry, you can try this setup. Obviously, without a camera you lose the ability to detect global loop closures or globally localize using vision. If odometry doesn't drift too much, proximity detections can still be found to correct odometry over time.
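A rough sketch of such a laser-only setup with `icp_odometry` could look like this (the frame and topic names are assumptions reusing those from earlier sections; the linked demo_hector_mapping.launch below is the tested reference):

{{{
<group ns="rtabmap">
  <!-- Lidar odometry: registers consecutive scans with ICP and
       publishes odom in this namespace -->
  <node pkg="rtabmap_odom" type="icp_odometry" name="icp_odometry">
    <remap from="scan" to="/base_scan"/>
    <param name="frame_id" type="string" value="base_link"/>
  </node>

  <!-- SLAM node subscribing only to scans and the odometry above -->
  <node pkg="rtabmap_slam" type="rtabmap" name="rtabmap" args="--delete_db_on_start">
    <param name="frame_id"        type="string" value="base_link"/>
    <param name="subscribe_depth" type="bool"   value="false"/>
    <param name="subscribe_rgb"   type="bool"   value="false"/>
    <param name="subscribe_scan"  type="bool"   value="true"/>
    <remap from="scan" to="/base_scan"/>
    <param name="Reg/Strategy" type="string" value="1"/> <!-- ICP -->
  </node>
</group>
}}}

Since `icp_odometry` publishes `odom` in the same `rtabmap` namespace, the `rtabmap` node picks it up without an explicit odometry remap in this sketch.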
We will again use [[https://docs.google.com/uc?id=0B46akLGdg-uadXhLeURiMTBQU28&export=download|demo_mapping.bag]] without wheel odometry and the same launch file ([[https://github.com/introlab/rtabmap_ros/blob/master/rtabmap_demos/launch/demo_hector_mapping.launch|demo_hector_mapping.launch]]), but with the argument "camera:=false" to disable the camera stuff:

{{{
$ rosbag filter demo_mapping.bag demo_mapping_no_odom.bag 'topic != "/tf" or topic == "/tf" and m.transforms[0].header.frame_id != "/odom"'
$ roslaunch rtabmap_demos demo_hector_mapping.launch hector:=true camera:=false
$ rosbag play --clock demo_mapping_no_odom.bag
}}}

For an example using `rtabmap_odom/icp_odometry`, set the `hector` argument to false above. For the best results, build rtabmap with the [[https://github.com/ethz-asl/libpointmatcher|libpointmatcher]] dependency.

=== 2D laser + Wheel Odometry as guess ===

{{attachment:setupLaserWheelOdometry.png|2D laser and wheel odometry as guess|width="800"}}

This is the same example as [[http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot#A2D_laser_only|above]] with laser only, but here we use the wheel odometry as a guess for `icp_odometry`. This makes `icp_odometry` more robust in corridor-like environments with short-range lidars. For this example, we will use the original [[https://docs.google.com/uc?id=0B46akLGdg-uadXhLeURiMTBQU28&export=download|demo_mapping.bag]] with wheel odometry. Here we should set "hector:=false" and "odom_guess:=true" with the launch file [[https://github.com/introlab/rtabmap_ros/blob/master/rtabmap_demos/launch/demo_hector_mapping.launch|demo_hector_mapping.launch]]:

{{{
$ roslaunch rtabmap_demos demo_hector_mapping.launch hector:=false camera:=false odom_guess:=true
$ rosbag play --clock demo_mapping.bag
}}}

Obviously, without a camera you lose the ability to detect global loop closures or globally localize using vision. If odometry doesn't drift too much, proximity detections can still be found to correct odometry over time. The same example can be run with a camera for comparison by setting "camera:=true". For the best results, build rtabmap with the [[https://github.com/ethz-asl/libpointmatcher|libpointmatcher]] dependency.

=== Kinect + Odometry ===

{{attachment:setupC.png|Kinect + Odometry|width="800"}}

In this configuration, I assume that you have a robot that is not constrained to a single plane (like a UAV): it can move in `XYZ` with `roll`, `pitch` and `yaw` rotations.

{{{
}}}

=== Kinect ===

{{attachment:setupD.png|Kinect|width="800"}}

If you can't have reliable odometry, you can map using only RTAB-Map, at the cost of "lost odometry" (like the [[https://github.com/introlab/rtabmap/wiki/Kinect-mapping#lostodometry|RED screens in the standalone version]]). The robot can detect this "lost" state when a '''null''' odometry message is sent by the [[rtabmap_odom#rgbd_odometry|rgbd_odometry]] node.

{{{
}}}

Note that we set `approx_sync` to false for the `rtabmap` node to make sure it uses exactly the `odom` computed from the same `rgbd_image` topic.

=== Stereo A ===

{{attachment:setupE.png|Stereo A|width="800"}}

RTAB-Map can also be used with a stereo camera. As shown in the picture above, you will need to install the [[viso2_ros]] and [[stereo_image_proc]] nodes.

{{{
}}}

Note that we set `approx_sync` to false for the `rtabmap` node to make sure it uses exactly the `odom` computed from the same image topics.
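As a rough sketch only (the exact remappings are assumptions; see the [[rtabmap_ros/Tutorials/StereoOutdoorMapping|stereo outdoor mapping]] tutorial for a complete, tested launch file), the pipeline could be wired like this:

{{{
<!-- Rectify the raw stereo images -->
<node ns="stereo_camera" pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc"/>

<!-- Stereo visual odometry from viso2_ros -->
<node pkg="viso2_ros" type="stereo_odometer" name="stereo_odometer">
  <remap from="stereo" to="stereo_camera"/>
  <remap from="image"  to="image_rect"/>
</node>

<group ns="rtabmap">
  <node pkg="rtabmap_slam" type="rtabmap" name="rtabmap" args="--delete_db_on_start">
    <param name="frame_id"         type="string" value="base_link"/>
    <param name="subscribe_stereo" type="bool"   value="true"/>
    <param name="subscribe_depth"  type="bool"   value="false"/>
    <param name="approx_sync"      type="bool"   value="false"/>
    <remap from="left/image_rect"   to="/stereo_camera/left/image_rect"/>
    <remap from="right/image_rect"  to="/stereo_camera/right/image_rect"/>
    <remap from="left/camera_info"  to="/stereo_camera/left/camera_info"/>
    <remap from="right/camera_info" to="/stereo_camera/right/camera_info"/>
    <remap from="odom"              to="/stereo_odometer/odometry"/>
  </node>
</group>
}}}

With `subscribe_stereo` set to true, the `rtabmap` node synchronizes the rectified left/right images and camera infos itself.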
=== Stereo B ===

{{attachment:setupE2.png|Stereo B|width="800"}}

The page [[rtabmap_ros/Tutorials/StereoOutdoorMapping|StereoOutdoorMapping]] shows a working demonstration that you can try with the provided rosbag. For actual mapping with your stereo camera, your camera driver should publish image messages like these (the camera namespace can be different from `stereo_camera`):

{{{
$ rostopic list
/stereo_camera/left/image_raw
/stereo_camera/left/camera_info
/stereo_camera/right/image_raw
/stereo_camera/right/camera_info
}}}

Assuming that the camera driver provides the TF `stereo_camera` and that the left and right images are synchronized (note `approx_sync` set to false), the corresponding launch file could look like this:

{{{
}}}

For visualization, I recommend trying the [[rtabmap_ros/Tutorials/StereoOutdoorMapping|stereo outdoor mapping]] tutorial to see what is going on with [[rviz]]. To use [[rtabmap_viz#rtabmap_viz|rtabmap_viz]], you can add the node under the `rtabmap` namespace above:

{{{
}}}

== Navigation ==

The [[rtabmap_slam#rtabmap|rtabmap]] node uses the laser scans to create a 2D occupancy grid map that can be used by a planner (see the `grid_map` topic). If you don't have laser scans, the [[rtabmap_slam#rtabmap|rtabmap]] node can create a 2D occupancy grid map from the projection of the Kinect or stereo point clouds on the ground (see the `proj_map` topic). Visit the [[rtabmap_ros/Tutorials/StereoOutdoorNavigation|StereoOutdoorNavigation]] page for an example of creating such maps.

== Remote visualization ==

=== rtabmap_viz ===

To make rtabmap_viz communicate easily with the rtabmap node, launch it in the same namespace as rtabmap, so that all topics and services can be directly connected without remappings:

{{{
$ export ROS_NAMESPACE=rtabmap
$ rosrun rtabmap_viz rtabmap_viz _frame_id:=base_link
}}}

=== rviz ===

When RTAB-Map's ros-pkg is built, the [[rtabmap_rviz_plugins#Map_Cloud_Display|rtabmap_rviz_plugins/MapCloud]] plugin can be selected in RVIZ for visualization of the constructed 3D map cloud:

{{{
$ rosrun rviz rviz
}}}

== Remote visualization: bandwidth efficiency with RVIZ ==

=== rviz ===

{{attachment:bandwidthRviz.png|rviz|width="800"}}

Use the following for rviz:

{{{
}}}

The [[rtabmap_util#point_cloud_xyzrgb|rtabmap_util/point_cloud_xyzrgb]] nodelet creates a point cloud from the RGB and depth images, with an optional voxel size (0=disabled).

== Remote mapping ==

{{attachment:remoteMapping.png|Remote mapping|width="800"}}

'''EDIT''' (February 4 2016): There is now a simple tutorial about remote mapping with a Kinect here: http://wiki.ros.org/rtabmap_ros/Tutorials/RemoteMapping

== Switching between Mapping and Localization ==

After mapping an area, it can be convenient to put rtabmap in localization mode to avoid increasing the map size in already mapped areas. If you are using [[rtabmap_viz#rtabmap_viz|rtabmap_viz]], there are already buttons in the interface:

{{attachment:mapping_localization_buttons.png|mapping_localization_buttons}}

Otherwise, you can call the [[rtabmap_slam#rtabmap|set_mode_mapping]] and [[rtabmap_slam#rtabmap|set_mode_localization]] services:

{{{
$ rosservice call rtabmap/set_mode_localization
$ rosservice call rtabmap/set_mode_mapping
}}}
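If you prefer starting directly in localization mode instead of calling the service, the same effect can be obtained by disabling RTAB-Map's incremental memory at launch. A minimal sketch, assuming the node setup from the examples above (and keeping the database from the mapping session, i.e., starting the node without the delete_db_on_start argument):

{{{
<!-- Localization mode: do not add new nodes to the map -->
<param name="Mem/IncrementalMemory" type="string" value="false"/>
<!-- Load all previous nodes in Working Memory on start, to help global localization -->
<param name="Mem/InitWMWithAllNodes" type="string" value="true"/>
}}}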