Overview

The ZED ROS wrapper provides access to all camera sensors and parameters through ROS topics, parameters and services. The zed-ros-wrapper is available for all ZED stereo cameras: ZED 2, ZED Mini and ZED.

[Image: ZED 2 camera - https://cdn.stereolabs.com/assets/images/zed-2/zed-2-front.jpg]

Getting Started

Follow the official Stereolabs "Getting Started" guide to install the ZED ROS wrapper with all the latest features.

Github repositories

  • zed-ros-wrapper: this is the main repository. It contains the source code of the ZED Wrapper node and the instructions on how to compile and run it.

  • zed-ros-examples: this repository is a collection of examples and tutorials to illustrate how to better use the ZED cameras in the ROS framework.

Start the node

The ZED Wrapper node with the default parameters can be started using one of the three launch files, according to the camera model:

  • ZED: $ roslaunch zed_wrapper zed.launch

  • ZED Mini: $ roslaunch zed_wrapper zedm.launch

  • ZED 2: $ roslaunch zed_wrapper zed2.launch

Published Topics

Left Camera

rgb/image_rect_color (sensor_msgs/Image)

  • Color rectified image (left RGB image by default).
rgb/image_rect_gray (sensor_msgs/Image)
  • Grayscale rectified image (left camera by default).
rgb/camera_info (sensor_msgs/CameraInfo)
  • Color camera calibration data.
rgb_raw/image_raw_color (sensor_msgs/Image)
  • Color unrectified image (left RGB image by default).
rgb_raw/image_raw_gray (sensor_msgs/Image)
  • Grayscale unrectified image (left camera by default).
rgb_raw/camera_info (sensor_msgs/CameraInfo)
  • Color unrectified camera calibration data.
left/image_rect_color (sensor_msgs/Image)
  • Left camera color rectified image.
left/image_rect_gray (sensor_msgs/Image)
  • Left camera grayscale rectified image.
left/camera_info (sensor_msgs/CameraInfo)
  • Left camera calibration data.
left_raw/image_raw_color (sensor_msgs/Image)
  • Left camera color unrectified image.
left_raw/image_raw_gray (sensor_msgs/Image)
  • Left camera grayscale unrectified image.
left_raw/camera_info (sensor_msgs/CameraInfo)
  • Left unrectified camera calibration data.
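
The image topics are standard sensor_msgs/Image streams, so any ROS node can consume them. The following is a minimal rospy sketch that subscribes to the left rectified color image and converts it to an OpenCV matrix with cv_bridge; the /zed/zed_node topic prefix is an assumption based on the default camera_name ("zed") and node name used by the provided launch files, so adapt it to your configuration.

    #!/usr/bin/env python
    # Minimal sketch: subscribe to the left rectified color image.
    # The "/zed/zed_node/" prefix is an assumption based on the default launch files.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()

    def image_callback(msg):
        # Convert the ROS image (BGRA8 by default) to an OpenCV BGR matrix
        cv_image = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        rospy.loginfo("Received a %dx%d image", msg.width, msg.height)

    rospy.init_node("zed_left_image_listener")
    rospy.Subscriber("/zed/zed_node/rgb/image_rect_color", Image, image_callback)
    rospy.spin()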

Right Camera

right/image_rect_color (sensor_msgs/Image)

  • Color rectified right image.
right/image_rect_gray (sensor_msgs/Image)
  • Grayscale rectified right image.
right/camera_info (sensor_msgs/CameraInfo)
  • Right camera calibration data.
right_raw/image_raw_color (sensor_msgs/Image)
  • Color unrectified right image.
right_raw/image_raw_gray (sensor_msgs/Image)
  • Grayscale unrectified right image.
right_raw/camera_info (sensor_msgs/CameraInfo)
  • Right unrectified camera calibration data.

Sensors

imu/data (sensor_msgs/Imu)

  • Accelerometer, gyroscope, and orientation data in Earth frame [only ZED-M and ZED 2].
imu/data_raw (sensor_msgs/Imu)
  • Accelerometer and gyroscope data in Earth frame [only ZED-M and ZED 2].
imu/mag (sensor_msgs/MagneticField)
  • Calibrated magnetometer data [only ZED 2].
atm_press (sensor_msgs/FluidPressure)
  • Atmospheric pressure data [only ZED 2].
temperature/imu (sensor_msgs/Temperature)
  • Temperature of the IMU sensor [only ZED 2].
temperature/left (sensor_msgs/Temperature)
  • Temperature of the left camera sensor [only ZED 2].
temperature/right (sensor_msgs/Temperature)
  • Temperature of the right camera sensor [only ZED 2].
left_cam_imu_transform (geometry_msgs/Transform)
  • Transform from left camera to IMU sensor position.
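
As an example, a minimal rospy sketch to read the fused orientation from the imu/data topic could look like the following; the /zed2/zed_node prefix is an assumption based on the default zed2.launch names.

    #!/usr/bin/env python
    # Minimal sketch: read the fused IMU orientation of a ZED-M / ZED 2.
    # The "/zed2/zed_node/" prefix is an assumption; match it to your camera_name.
    import rospy
    from sensor_msgs.msg import Imu

    def imu_callback(msg):
        q = msg.orientation
        rospy.loginfo("orientation (x,y,z,w) = %.3f %.3f %.3f %.3f",
                      q.x, q.y, q.z, q.w)

    rospy.init_node("zed_imu_listener")
    rospy.Subscriber("/zed2/zed_node/imu/data", Imu, imu_callback)
    rospy.spin()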

Stereo Pair

stereo/image_rect_color (sensor_msgs/Image)

  • Stereo rectified pair images side-by-side.
stereo_raw/image_raw_color (sensor_msgs/Image)
  • Stereo unrectified pair images side-by-side.

Note: to retrieve the camera calibration parameters you can subscribe to the topics left/camera_info, right/camera_info, left_raw/camera_info and right_raw/camera_info.
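
As a sketch of how to use these calibration topics, the snippet below reads the rectified left camera intrinsics once; the topic prefix is again an assumption based on the default launch files.

    #!/usr/bin/env python
    # Minimal sketch: read the rectified left camera intrinsics once.
    import rospy
    from sensor_msgs.msg import CameraInfo

    rospy.init_node("zed_camera_info_reader")
    info = rospy.wait_for_message("/zed/zed_node/left/camera_info", CameraInfo)
    fx, fy = info.K[0], info.K[4]   # focal lengths in pixels
    cx, cy = info.K[2], info.K[5]   # principal point
    rospy.loginfo("fx=%.2f fy=%.2f cx=%.2f cy=%.2f", fx, fy, cx, cy)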

Depth and Point Cloud

depth/depth_registered (sensor_msgs/Image)

  • Depth map image registered on left image (32-bit float in meters by default).
depth/camera_info (sensor_msgs/CameraInfo)
  • Depth camera calibration data.
point_cloud/cloud_registered (sensor_msgs/PointCloud2)
  • Registered color point cloud.
confidence/confidence_map (sensor_msgs/Image)
  • Confidence map (floating point values to be used in your own algorithms).
disparity/disparity_image (stereo_msgs/DisparityImage)
  • Disparity map.
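
A minimal rospy sketch that reads the distance at the image center from the registered depth map is shown below; it assumes the default 32-bit float (meters) encoding (depth/openni_depth_mode = 0), little-endian data and the default /zed/zed_node prefix.

    #!/usr/bin/env python
    # Minimal sketch: distance at the image center from the registered depth map.
    # Assumes 32-bit float depth in meters (openni_depth_mode = 0), little-endian.
    import struct
    import rospy
    from sensor_msgs.msg import Image

    def depth_callback(msg):
        # Byte offset of the center pixel in the float32 depth buffer
        offset = (msg.height // 2) * msg.step + (msg.width // 2) * 4
        depth_m = struct.unpack_from("<f", msg.data, offset)[0]
        rospy.loginfo("Distance at image center: %.2f m", depth_m)

    rospy.init_node("zed_depth_listener")
    rospy.Subscriber("/zed/zed_node/depth/depth_registered", Image, depth_callback)
    rospy.spin()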

Positional Tracking

odom (nav_msgs/Odometry)

  • Camera position and orientation in free space relative to the odometry frame (pure visual odometry for the ZED, visual-inertial for the ZED 2 and ZED-M).
pose (geometry_msgs/PoseStamped)
  • Camera position and orientation in free space relative to the map frame (given by sensor fusion + SLAM + loop closure algorithm).
pose_with_covariance (geometry_msgs/PoseWithCovarianceStamped)
  • Camera pose relative to Map frame with covariance.
path_odom (nav_msgs/Path)
  • Trajectory of the camera in Map frame without loop closures (continuous).
path_map (nav_msgs/Path)
  • Trajectory of the camera in Map frame with loop closures (may contain jumps).
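
The following rospy sketch subscribes to both the odometry and the map pose; the /zed/zed_node prefix is an assumption based on the default launch files.

    #!/usr/bin/env python
    # Minimal sketch: track the camera odometry and map pose.
    import rospy
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import PoseStamped

    def odom_callback(msg):
        p = msg.pose.pose.position
        rospy.loginfo("odom position: %.2f %.2f %.2f", p.x, p.y, p.z)

    def pose_callback(msg):
        p = msg.pose.position
        rospy.loginfo("map position: %.2f %.2f %.2f", p.x, p.y, p.z)

    rospy.init_node("zed_tracking_listener")
    rospy.Subscriber("/zed/zed_node/odom", Odometry, odom_callback)
    rospy.Subscriber("/zed/zed_node/pose", PoseStamped, pose_callback)
    rospy.spin()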

Mapping

mapping/fused_cloud (sensor_msgs/PointCloud2)

  • 3D point cloud map generated from the fusion of point clouds over the camera trajectory.

  • Published only if mapping is enabled, see mapping/mapping_enabled parameter and start_3d_mapping service.
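
When mapping is enabled, the fused cloud is a standard sensor_msgs/PointCloud2 message; as a sketch, the snippet below counts its valid points with the standard point_cloud2 helper (the topic prefix is an assumption based on the default launch files).

    #!/usr/bin/env python
    # Minimal sketch: count the valid points of the fused map cloud.
    import rospy
    from sensor_msgs.msg import PointCloud2
    import sensor_msgs.point_cloud2 as pc2

    def map_callback(msg):
        n = sum(1 for _ in pc2.read_points(msg, field_names=("x", "y", "z"),
                                           skip_nans=True))
        rospy.loginfo("Fused cloud: %d valid points", n)

    rospy.init_node("zed_map_listener")
    rospy.Subscriber("/zed/zed_node/mapping/fused_cloud", PointCloud2, map_callback)
    rospy.spin()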

Object Detection [only ZED 2]

objects (zed_interfaces/object_stamped)

  • Array of the detected/tracked objects for each camera frame [only ZED 2].
object_markers (visualization_msgs/MarkerArray)
  • Array of markers of the detected/tracked objects to be rendered in Rviz [only ZED 2].

  • Published only if Object Detection is enabled, see object_detection/od_enabled parameter and start_object_detection service.

Diagnostic

diagnostics (diagnostic_msgs/DiagnosticStatus)

  • ROS diagnostic message for ZED cameras.

Node Parameters

You can specify the parameters to be used by the ZED node modifying the values in the files

  • params/common.yaml: common parameters to all camera models

  • params/zed.yaml: parameters for the ZED camera

  • params/zedm.yaml: parameters for the ZED Mini camera

  • params/zed2.yaml: parameters for the ZED 2 camera

General Parameters

general/camera_name (string, default: "zed")

  • A custom name for the ZED camera. Used as namespace and prefix for camera TF frames
general/camera_model (string, default: "zed")
  • Type of Stereolabs camera. Used to load the correct model in the RViz examples and to start the correct SDK modules.
general/zed_id (int, default: 0)
  • Select a ZED camera by its ID. IDs are assigned by Ubuntu. Useful when multiple cameras are connected. ID is ignored if an SVO path is specified
general/serial_number (int, default: 0)
  • Select a ZED camera by its Serial Number
general/resolution (int, default: 2)
  • Set ZED camera resolution [0: HD2K, 1: HD1080, 2: HD720, 3: VGA]
general/grab_frame_rate (int, default: 30)
  • Set ZED camera video framerate
general/gpu_id (int, default: -1)
  • Select a GPU device for depth computation
general/base_frame (string, default: "base_link")
  • Frame_id of the frame that indicates the reference base of the robot
general/verbose (bool, default: false)
  • Enable/disable the verbosity of the SDK
general/svo_compression (int, default: 2)
  • Set SVO compression mode for saving [0: LOSSLESS (PNG/ZSTD), 1: H264 (AVCHD) ,2: H265 (HEVC)]
general/self_calib (bool, default: true)
  • Enable/disable self calibration at starting
general/camera_flip (bool, default: false)
  • Flip the camera data if it is mounted upside down
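
Since all the values are loaded into the ROS parameter server under the node namespace, they can also be queried at runtime; the sketch below assumes the default /zed/zed_node namespace and the default camera_name.

    #!/usr/bin/env python
    # Minimal sketch: read a few node parameters from the parameter server.
    # The "/zed/zed_node" namespace is an assumption based on the default launch files.
    import rospy

    rospy.init_node("zed_param_reader")
    name = rospy.get_param("/zed/zed_node/general/camera_name", "zed")
    resolution = rospy.get_param("/zed/zed_node/general/resolution", 2)
    rospy.loginfo("camera_name=%s resolution=%d", name, resolution)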

Video Parameters

video/img_downsample_factor (double, default: 1.0)

  • Resample factor for images [0.01,1.0]. The SDK works with native image sizes, but publishes rescaled images.
video/extrinsic_in_camera_frame (bool, default: true)
  • If false, the extrinsic parameters in camera_info use the ROS native frame (X FORWARD, Z UP) instead of the camera frame (Z FORWARD, Y DOWN) [true keeps the old behavior of versions < v3.1]

Sensor Parameters [only ZED-M and ZED 2]

sensors/sensors_timestamp_sync (bool, default: false)

  • Synchronize Sensors message timestamp with latest received frame

Depth Parameters

depth/quality (int, default: 1)

  • Select depth map quality [0: NONE, 1: PERFORMANCE, 2: MEDIUM, 3: QUALITY, 4: ULTRA]
depth/sensing_mode (int, default: 0)
  • Select depth sensing mode (change only for VR/AR applications) [0: STANDARD, 1: FILL]
depth/depth_stabilization (bool, default: true)
  • Enable depth stabilization. Stabilizing the depth requires an additional computation load as it enables tracking
depth/openni_depth_mode (int, default: 0)
  • Convert 32bit depth in meters to 16bit in millimeters [0: 32bit float meters, 1: 16bit uchar millimeters]
depth/min_depth (float, default: 0.7)
  • Minimum value allowed for depth measures. Min: 0.3 (ZED), 0.1 (ZED-M), 0.2 (ZED 2), Max: 3.0 - Note: reducing this value requires more computational power and RAM. In cases of limited computational power, increasing this value can provide better performance.
depth/max_depth (float, default: 10.0)
  • Maximum value allowed for depth measures [Min: 1.0, Max: 30.0 - Values beyond this limit will be reported as TOO_FAR]
depth/depth_downsample_factor (double, default: 1.0)
  • Resample factor for depth data matrices [0.01,1.0]. The SDK works with native data sizes, but publishes rescaled matrices (depth map, point cloud, ...)

Position Parameters

pos_tracking/publish_tf (bool, default: true)

  • Enable/disable publishing of the transforms between odom → base_link and imu_link → base_link. Note: The IMU frame can be published at a higher frequency than odom and map.
pos_tracking/publish_map_tf (bool, default: true)
  • Enable/disable publishing of the map TF frame relative to the position of the base frame (i.e. base_link) in the map frame. Note: the value is ignored if publish_tf is false.
pos_tracking/map_frame (string, default: "map")
  • Frame_id of the pose message
pos_tracking/odometry_frame (string, default: "odom")
  • Frame_id of the odom message
pos_tracking/area_memory (bool, default: true)
  • Enable area learning to correct odometry drift with loop closures.
pos_tracking/pose_smoothing (bool, default: true)
  • Enable smoothing of pose correction when drift is detected with loop closures. Only available if area_memory is enabled.
pos_tracking/area_memory_db_path (string, default: "")
  • Path of the database file for loop closure and relocalization that contains learnt visual information about the environment.
pos_tracking/floor_alignment (bool, default: false)
  • If true, it detects the floor plane during position tracking initialization. The initial pose of zed_camera_center in map frame is estimated relative to the floor plane and the initial pose of base_link is recomputed according to the TF tree.
pos_tracking/initial_base_pose (array, default: [0.0,0.0,0.0, 0.0,0.0,0.0])
  • Set the initial pose of base_link in the map frame -> [X, Y, Z, R, P, Y]
pos_tracking/init_odom_with_first_valid_pose (bool, default: true)
  • Initialize the odometry with the first valid pose received from positional tracking.
pos_tracking/path_pub_rate (float, default: 2.0)
  • Frequency (Hz) of publishing of the path messages
pos_tracking/path_max_count (int, default: -1)
  • Maximum number of poses kept in the pose arrays (-1 for infinite)
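
When publish_tf and publish_map_tf are enabled, the pose of the robot base can also be obtained directly from TF; the sketch below looks up the map → base_link transform with the standard tf2_ros API, using the default frame names listed above.

    #!/usr/bin/env python
    # Minimal sketch: look up the base_link pose in the map frame through TF.
    # Requires pos_tracking/publish_tf and pos_tracking/publish_map_tf to be true.
    import rospy
    import tf2_ros

    rospy.init_node("zed_tf_listener")
    tf_buffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tf_buffer)
    rate = rospy.Rate(1.0)
    while not rospy.is_shutdown():
        try:
            t = tf_buffer.lookup_transform("map", "base_link", rospy.Time(0))
            tr = t.transform.translation
            rospy.loginfo("base_link in map: %.2f %.2f %.2f", tr.x, tr.y, tr.z)
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException):
            pass  # transform not available yet
        rate.sleep()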

Mapping Parameters

mapping/mapping_enabled (bool, default: false)

  • Enable/disable the mapping module
mapping/resolution (float, default: 0.1)
  • Resolution of the fused point cloud [0.01, 0.2]
mapping/max_mapping_range (float, default: -1.0)
  • Maximum depth range used for mapping, in meters (-1 to use recommended range depending on the selected resolution) [2.0, 20.0]
mapping/fused_pointcloud_freq (float, default: 1.0)
  • Publishing frequency (Hz) of the 3D map as fused point cloud

Object Detection Parameters [only ZED 2]

object_detection/od_enabled (bool, default: false)

  • Enable/disable the Object Detection module
object_detection/confidence_threshold (int, default: 50)
  • Minimum value of the detection confidence of an object [0,100]
object_detection/object_tracking_enabled (bool, default: true)
  • Enable/disable the tracking of the detected objects
object_detection/people_detection (bool, default: true)
  • Enable/disable the detection of persons
object_detection/vehicle_detection (bool, default: true)
  • Enable/disable the detection of vehicles

Dynamic Parameters

Dynamic parameters cannot have a namespace, so they cannot be placed within their correct module. A note is added in the description of each parameter to indicate the module it belongs to.

pub_frame_rate (float, default: 15.0)

  • [General] Frequency of the publishing of Video and Depth images (less than or equal to the grab_frame_rate value) [0.1,60.0]
depth_confidence (int, default: 100)
  • [Depth] Threshold to reject depth values based on their confidence. Each depth pixel has a corresponding confidence. A lower threshold means more confidence and precision (but less density); a higher value reduces filtering (more density, less certainty). A value of 100 allows values from 0 to 100 (no filtering), a value of 90 allows values from 10 to 100 (filtering the lowest confidence values), and a value of 30 allows values from 70 to 100 (keeping only the highest confidence values and lowering the density of the depth map). The value should be in [1,100]. By default, the confidence threshold is set to 100, meaning that no depth pixel is rejected.
depth_texture_conf (int, default: 100)
  • [Depth] Threshold to reject depth values based on their textureness confidence. A lower threshold means more confidence and precision (but less density); a higher value reduces filtering (more density, less certainty). The value should be in [1,100]. By default, the confidence threshold is set to 100, meaning that no depth pixel is rejected.
point_cloud_freq (float, default: 15.0)
  • [Depth] Frequency of the point cloud publishing (less than or equal to the grab_frame_rate value) [0.1,60.0]
brightness (int, default: 4)
  • [Camera] Defines the brightness control [0,8]
contrast (int, default: 4)
  • [Camera] Defines the contrast control [0,8]
hue (int, default: 0)
  • [Camera] Defines the hue control [0,11]
saturation (int, default: 4)
  • [Camera] Defines the saturation control [0,8]
sharpness (int, default: 3)
  • [Camera] Defines the sharpness control [0,8]
gamma (int, default: 8)
  • [Camera] Defines the gamma control [1,9]. Note: available with ZED SDK >= v3.1
auto_exposure_gain (bool, default: true)
  • [Camera] Defines if the Gain and Exposure are in automatic mode or not
gain (int, default: 100)
  • [Camera] Defines the gain control [only if auto_exposure_gain is false] [0,100]
exposure (int, default: 100)
  • [Camera] Defines the exposure control [only if auto_exposure_gain is false] [0,100]
auto_whitebalance (bool, default: true)
  • [Camera] Defines if the White balance is in automatic mode or not
whitebalance_temperature (int, default: 42)
  • [Camera] Defines the color temperature value (x100) [42,65]
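
All the parameters listed above can be changed at runtime with the standard dynamic_reconfigure tools (e.g. rqt_reconfigure) or programmatically; the sketch below uses the dynamic_reconfigure Python client and assumes the reconfigure server is available under the default /zed/zed_node name.

    #!/usr/bin/env python
    # Minimal sketch: change dynamic parameters at runtime.
    # The server name "/zed/zed_node" is an assumption; check it with rqt_reconfigure.
    import rospy
    import dynamic_reconfigure.client

    rospy.init_node("zed_dynrec_example")
    client = dynamic_reconfigure.client.Client("/zed/zed_node", timeout=10)
    client.update_configuration({"brightness": 6,
                                 "auto_exposure_gain": False,
                                 "exposure": 80})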

Services

start_svo_recording (zed_interfaces/start_svo_recording)

  • Starts recording an SVO file. If no filename is provided, the default zed.svo is used. If no path is provided with the filename, the default recording folder is ~/.ros/
stop_svo_recording (zed_interfaces/stop_svo_recording)
  • Stops an active SVO recording
start_remote_stream (zed_interfaces/start_remote_stream)
  • Starts streaming over network to allow processing of ZED data on a remote machine. See Remote streaming
stop_remote_stream (zed_interfaces/stop_remote_stream)
  • Stops streaming over network
set_pose (zed_interfaces/set_pose)
  • Sets the current camera pose in the map frame to the value passed as parameter. For example, it can be used to apply GPS position corrections.
reset_tracking (zed_interfaces/reset_tracking)
  • Resets position in /map frame to the initial pose value available in the param server.
reset_odometry (zed_interfaces/reset_odometry)
  • Resets the odometry to the last pose in map frame received from positional tracking. It can be used to reset to the current "map position" when the odometry drift is large.
set_led_status (zed_interfaces/set_led_status)
  • Sets the status of the blue led -> True: LED ON, False: LED OFF
toggle_led (zed_interfaces/toggle_led)
  • Toggles the status of the blue led, returning the new status
start_3d_mapping (zed_interfaces/start_3d_mapping)
  • Starts the Spatial Mapping processing
stop_3d_mapping (zed_interfaces/stop_3d_mapping)
  • Stops the Spatial Mapping processing (works also with automatic start from configuration file)
start_object_detection (zed_interfaces/start_object_detection)
  • Starts the Object Detection processing. See Object Detection Note: returns error if not using a ZED 2 camera
stop_object_detection (zed_interfaces/stop_object_detection)
  • Stops the Object Detection processing (works also with automatic start from configuration file)
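
Services can be called from the command line with rosservice or programmatically; the rospy sketch below calls toggle_led, assuming the default /zed/zed_node namespace and an empty service request. The service class name and fields are assumptions here and should be checked against the .srv definitions in the zed_interfaces package.

    #!/usr/bin/env python
    # Minimal sketch: call the toggle_led service.
    # Service name, class name and request fields are assumptions; check zed_interfaces.
    import rospy
    from zed_interfaces.srv import toggle_led

    rospy.init_node("zed_led_toggler")
    rospy.wait_for_service("/zed/zed_node/toggle_led")
    toggle = rospy.ServiceProxy("/zed/zed_node/toggle_led", toggle_led)
    resp = toggle()
    rospy.loginfo("toggle_led response: %s", resp)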

Nodelets

The zed_wrapper package contains the standalone ZED Wrapper node that can be started as is using the provided launch files, as described in the ZED Wrapper Node documentation.

However, the ZED Wrapper has been designed to take maximum advantage of the nodelet package, in order to run multiple algorithms in the same process with zero-copy transport between them.

The core of the ZED Wrapper is indeed the ZEDWrapperNodelet nodelet, available in the zed_nodelets package that provides the interface between the ZED SDK and the ROS environment.

zed_nodelets/ZEDWrapperNodelet

Same usage as the zed_wrapper_node.

Available as: zed_nodelets/ZEDWrapperNodelet

zed_nodelets/RgbdSensorsSyncNodelet

Sometimes it is useful to synchronize and multiplex several topics into a single topic using their timestamps, for example to save a rosbag or to guarantee that all the topics published at the same instant are received together on the other side of a network. To perform this operation we provide the zed_nodelets/RgbdSensorsSyncNodelet nodelet, which synchronizes RGB, Depth, IMU and Magnetometer information into a single custom RGBDSensors topic.

Subscribed Topics

rgb/image_rect_color (sensor_msgs/Image)
  • RGB image stream.
depth/depth_registered (sensor_msgs/Image)
  • Registered depth image stream.
rgb/camera_info (sensor_msgs/CameraInfo)
  • RGB camera metadata.
imu/data (sensor_msgs/Imu)
  • Inertial data
imu/mag (sensor_msgs/MagneticField)
  • Magnetometer data

Published Topics

rgbd_sync (zed_interfaces/RGBDSensors)
  • The sync topic with RGB, Depth, Imu and Magnetometer data

Parameters

~zed_nodelet_name (string, default: zed_nodelet)
  • The name of the ZED nodelet publishing the topics to be synchronized.
~approx_sync (bool, default: true)
  • Use approximate synchronization for the input topics. If false, all the messages must have exactly the same timestamp; this is almost impossible to achieve when also subscribing to the IMU and Magnetometer topics while the sensors_timestamp_sync parameter of the ZED nodelet is false.
~queue_size (int, default: 600)
  • Size of message queue for each synchronized topic.
~sub_imu (bool, default: true)
  • Synchronize IMU messages.
~sub_mag (bool, default: true)
  • Synchronize Magnetometer messages.

zed_nodelets/RgbdSensorsDemuxNodelet

An RGBDSensors topic message cannot be used by standard ROS nodes. You must create your own node that subscribes to it and extracts all the available data. To make it easier to access the information stored in the RGBDSensors topic, we provide a demux nodelet: RgbdSensorsDemuxNodelet.

Subscribed Topics

rgbd_sync (zed_interfaces/RGBDSensors)
  • The sync topic with RGB, Depth, Imu and Magnetometer data to be demuxed

Published Topics

rgb/image_rect_color (sensor_msgs/Image)
  • RGB image stream.
depth/depth_registered (sensor_msgs/Image)
  • Registered depth image stream.
rgb/camera_info (sensor_msgs/CameraInfo)
  • RGB camera metadata.
imu/data (sensor_msgs/Imu)
  • Inertial data (only if published by zed_nodelets/RgbdSensorsSyncNodelet)
imu/mag (sensor_msgs/MagneticField)
  • Magnetometer data (only if published by zed_nodelets/RgbdSensorsSyncNodelet)

Examples and Tutorials

With the release v3.0 of the ZED Wrapper, the examples and the tutorials have been removed from the main repository and moved to a dedicated repository.

Examples

  • zed_nodelet_example: shows how to use nodelet intra-process communication.

  • zed_rtabmap_example: shows how to use the ZED with the RTABMap package to generate a 3D map with the rtabmap_ros package.

  • zed_ar_track_alvar_example: shows how to use the ZED with the ar_track_alvar package to detect and track the position of AR tags.

Tutorials

  • zed_video_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type sensor_msgs/Image to retrieve the Left and Right rectified images published by the ZED node.

  • zed_depth_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type sensor_msgs/Image to retrieve the depth images published by the ZED node and to get the measured distance at the center of the image.

  • zed_tracking_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type geometry_msgs/PoseStamped and nav_msgs/Odometry to retrieve the position and the orientation of the ZED camera in the map and in the odometry frames.

  • zed_obj_det_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type zed_interfaces/objects to retrieve the list of objects detected by the Object Detection module of a ZED 2 camera.

  • zed_sensors_sub_tutorial: in this tutorial you will learn how to write a simple node that subscribes to messages of type sensor_msgs/Imu, sensor_msgs/MagneticField, sensor_msgs/Temperature and sensor_msgs/FluidPressure published by ZED Mini and ZED 2.

Report an Issue

If you encounter an issue with the zed-ros-wrapper, please refer to the dedicated issue list on Github.
