ZED Node

Published Topics

The ZED node publishes data to the following topics:

  • Left camera

    • /zed/rgb/image_rect_color : Color rectified image (left RGB image by default).
    • /zed/rgb/image_raw_color : Color unrectified image (left RGB image by default).
    • /zed/rgb/camera_info : Left camera calibration data.
  • Right camera

    • /zed/right/image_rect_color : Color rectified right image.
    • /zed/right/image_raw_color : Color unrectified right image.
    • /zed/right/camera_info : Right camera calibration data.
  • Depth and point cloud

    • /zed/depth/depth_registered : Depth map image registered on left image (by default 32 bits float, in meters).
    • /zed/point_cloud/cloud_registered : Registered color point cloud.
    • /zed/confidence/confidence_image : Confidence image.
    • /zed/confidence/confidence_map : Confidence map (floating point values).
    • /zed/disparity/disparity_image : Disparity image.
  • Tracking

    • /zed/odom : Absolute 3D position and orientation relative to the odometry frame (pure visual odometry).
    • /zed/map : Absolute 3D position and orientation relative to the map frame (sensor fusion algorithm).
  • Inertial Data

    • /zed/imu/data : Accelerometer, Gyroscope and Orientation data in Earth frame.
    • /zed/imu/data_raw : Accelerometer and Gyroscope data in Earth frame.
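
For example, the topics above can be consumed with a minimal rospy subscriber. The sketch below assumes the default /zed namespace and the default 32-bit float depth encoding; adapt the topic names if you changed them in the launch file.

    import rospy
    from sensor_msgs.msg import Image
    from nav_msgs.msg import Odometry

    def on_depth(msg):
        # With the default openni_depth_mode=0 the encoding is 32-bit float, in meters.
        rospy.loginfo('Depth image %dx%d, encoding %s', msg.width, msg.height, msg.encoding)

    def on_odom(msg):
        p = msg.pose.pose.position
        rospy.loginfo('Odometry position: [%.3f, %.3f, %.3f]', p.x, p.y, p.z)

    rospy.init_node('zed_listener_example')
    rospy.Subscriber('/zed/depth/depth_registered', Image, on_depth)
    rospy.Subscriber('/zed/odom', Odometry, on_odom)
    rospy.spin()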

Launch file parameters

Specify your launch parameters in the zed_camera.launch file.

Parameter | Description | Value
svo_file | Specify SVO filename | Path to an SVO file
resolution | Select ZED camera resolution | ‘0’: HD2K, ‘1’: HD1080, ‘2’: HD720, ‘3’: VGA
frame_rate | Set ZED camera video framerate | int
sensing_mode | Select depth sensing mode | ‘0’: STANDARD, ‘1’: FILL
quality | Select depth map quality | ‘0’: NONE, ‘1’: PERFORMANCE, ‘2’: MEDIUM, ‘3’: QUALITY, ‘4’: ULTRA
openni_depth_mode | Convert the 32-bit depth in meters to 16-bit depth in millimeters | ‘0’: 32-bit float, meters; ‘1’: 16-bit unsigned int, millimeters
zed_id | Select a ZED camera by its ID. IDs are assigned by Ubuntu. Useful when multiple cameras are connected. The ID is ignored if an SVO path is specified. | int, default ‘0’
gpu_id | Select a GPU device for depth computation | int, default ‘-1’ (best device found)
publish_tf | Enable/disable publishing of the odometry TF | true, false
camera_flip | Flip the camera data if it is mounted upside down | true, false
pose_frame | Map frame name | string, default ‘map’
odometry_frame | Odometry frame name | string, default ‘odom’
base_frame | Base link frame name | string, default ‘base_frame’
camera_frame | Camera frame name | string, default ‘camera_frame’
depth_frame | Depth frame name | string, default ‘depth_frame’
imu_frame | IMU frame name | string, default ‘imu_frame’
initial_pose | Initial reference pose | vector, default ‘[0.0,0.0,0.0, 0.0,0.0,0.0]’ -> [X, Y, Z, R, P, Y]
verbose | Enable/disable the verbosity of the SDK | true, false

Topic names can also be customized in the launch file.

Dynamic parameters

The ZED node lets you dynamically reconfigure five parameters:

  • confidence : Confidence threshold; the lower, the better.
  • auto_exposure : Enable/disable automatic control of exposure and gain.
  • exposure : Exposure value when manually controlled (auto_exposure=0).
  • gain : Gain value when manually controlled (auto_exposure=0).
  • mat_resize_factor : Image/measures resize factor.
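
These parameters can be changed at runtime with the dynamic_reconfigure tools. The sketch below uses the Python client; the node name /zed/zed_wrapper_node is an assumption and may differ in your setup (check with rosrun dynamic_reconfigure dynparam list).

    import rospy
    import dynamic_reconfigure.client

    rospy.init_node('zed_dynparam_example')
    # The node name below is an assumption; use `rosrun dynamic_reconfigure dynparam list`
    # to find the actual reconfigurable node name.
    client = dynamic_reconfigure.client.Client('/zed/zed_wrapper_node', timeout=10)
    # Lower the confidence threshold and switch to manual exposure and gain control.
    client.update_configuration({'confidence': 80, 'auto_exposure': False,
                                 'exposure': 50, 'gain': 50})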

Transform frame

The ZED ROS wrapper broadcasts multiple coordinate frames that each provide information about the camera position and orientation. If needed, the reference frames can be changed in the launch file.

  • zed_camera_center is the current position and orientation of the ZED, determined by the visual odometry and tracking algorithms.
  • zed_right_camera is the position and orientation of ZED right camera.
  • zed_right_camera_optical is the position and orientation of ZED right camera optical frame.
  • zed_left_camera is the position and orientation of ZED left camera.
  • zed_left_camera_optical is the position and orientation of ZED left camera optical frame.
  • imu_link is the origin of the inertial data frame (ZED Mini only).

For RVIZ compatibility, the root frame pose_frame is called map. The TF tree generated by the zed_wrapper follows the standard described in REP 105. The odometry frame is updated using only the "visual odometry" information. The map frame is updated using the Tracking algorithm provided by the Stereolabs SDK, which fuses the inertial information from the IMU sensor when a ZED Mini camera is used.

map (pose_frame)
└─odom
   ├─zed_camera_center
   │  ├─zed_left_camera_frame
   │  │  └─zed_left_camera_optical_frame
   │  └─zed_right_camera_frame
   │     └─zed_right_camera_optical_frame
   └─imu_link (*only ZED Mini*)
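
For example, the camera pose in the map frame can be read from the TF tree with tf2. This is only a sketch; the frame names follow the tree above and may differ if you renamed them in the launch file.

    import rospy
    import tf2_ros

    rospy.init_node('zed_tf_example')
    tf_buffer = tf2_ros.Buffer()
    tf_listener = tf2_ros.TransformListener(tf_buffer)

    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        try:
            # Pose of the camera center in the map frame, as broadcast by the wrapper.
            t = tf_buffer.lookup_transform('map', 'zed_camera_center', rospy.Time(0))
            p = t.transform.translation
            rospy.loginfo('Camera position in map: [%.3f, %.3f, %.3f]', p.x, p.y, p.z)
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException):
            pass
        rate.sleep()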

ZED Mini

The ZED Mini provides the same information as the ZED, plus the Inertial data from the IMU sensor. The IMU data are used internally to generate the pose in the Map frame with the Tracking sensor fusion algorithm.

Note: The initial pose in the odometry frame can be initialized to the first pose received from the Tracking algorithm by setting the parameter init_odom_with_imu to true.
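
As an illustration, the orientation published on /zed/imu/data can be read with a standard sensor_msgs/Imu subscriber (a minimal sketch, assuming the default /zed namespace):

    import rospy
    from sensor_msgs.msg import Imu

    def on_imu(msg):
        # Orientation as a quaternion in the Earth frame; angular velocity and linear
        # acceleration are available on the same message.
        q = msg.orientation
        rospy.loginfo('IMU orientation (x, y, z, w): [%.3f, %.3f, %.3f, %.3f]',
                      q.x, q.y, q.z, q.w)

    rospy.init_node('zed_imu_example')
    rospy.Subscriber('/zed/imu/data', Imu, on_imu)
    rospy.spin()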

Services

The ZED node provides the following services:

  • set_initial_pose : restarts the Tracking algorithm, setting the initial pose of the camera to the value passed as a vector parameter -> [X, Y, Z, R, P, Y].
  • reset_tracking : restarts the Tracking algorithm, setting the initial pose to the value available on the param server.
  • reset_odometry : resets the odometry values, eliminating the drift due to the Visual Odometry algorithm and setting the new odometry frame to the current map frame.
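
As a rough sketch, the services can be called from Python with rospy. The service definition names imported from zed_wrapper.srv below are assumptions; verify the actual names and request fields with rosservice list and rosservice info before use.

    import rospy
    # Assumed service definitions; check the actual names in the zed_wrapper package.
    from zed_wrapper.srv import set_initial_pose, reset_odometry

    rospy.init_node('zed_services_example')

    rospy.wait_for_service('/zed/set_initial_pose')
    set_pose = rospy.ServiceProxy('/zed/set_initial_pose', set_initial_pose)
    # Restart the Tracking algorithm with the camera at the origin: [X, Y, Z, R, P, Y].
    set_pose(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)

    rospy.wait_for_service('/zed/reset_odometry')
    reset_odom = rospy.ServiceProxy('/zed/reset_odometry', reset_odometry)
    reset_odom()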

Using multiple ZED

It is possible to use multiple ZED cameras with ROS. Simply launch the node with the zed_multi_cam.launch file:

roslaunch zed_wrapper zed_multi_cam.launch

Assigning a GPU to a camera

To improve performance, you can specify in the launch file the gpu_id of the graphics card that will be used for the depth computation. By default (-1), the GPU with the highest number of CUDA cores is selected. When using multiple ZED cameras, you can assign each camera to a different GPU to improve performance.

Limitations

Performance

This wrapper lets you quickly prototype applications and interface the ZED with other sensors and packages available in ROS. However, the ROS layer introduces significant latency and a performance hit. If performance is a major concern for your application, please consider using the ZED SDK library.

Using multiple ZED

The ZED camera uses the full USB 3.0 bandwidth to output video. When using multiple ZED cameras, you may need to reduce the camera framerate and resolution to avoid corrupted frames (green or purple frames). You can also use multiple GPUs to load-balance computations and improve performance.