Note: This tutorial assumes that you have completed the previous tutorials: Camera Frames, Calibration Patterns.
(!) Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.

Performing a Hand-Eye Calibration

Description: This tutorial shows how a hand-eye calibration can be performed with the Ensenso camera node.

Tutorial Level: ADVANCED

With a hand-eye calibration you can determine the position of the camera relative to a robot. The NxLib supports two different types of setup: the camera can either be fixed with respect to the robot base, or it can be mounted on the robot (e.g. on the wrist) and move with it. For more information and supporting illustrations of the hand-eye calibration, see the Ensenso SDK Manual.

When the camera is fixed, the calibration can find out the position of the camera relative to the robot base. In order to perform that kind of calibration, you need to grab a calibration pattern with the robot hand and move it to different positions where the camera can see it.

When the camera is mounted on e.g. the robot's wrist, the calibration can find out the exact position of the camera relative to the wrist. In order to perform that kind of calibration, you need to place a calibration pattern in the robot's workspace and capture images of it from different robot poses.

You can specify the type of calibration with the fixed parameter of the camera node.

General Hand-Eye Calibration Workflow

The steps of the calibration are all performed with the calibrate_hand_eye action (ensenso_camera_msgs/CalibrateHandEye) and correspond to the following goal commands:

  • RESET: Reset the pattern buffer by deleting any previous observations of calibration patterns.

  • CAPTURE_PATTERN: Repeatedly move the robot to a new position and capture the pattern. The current robot pose is automatically fetched from tf (based on the robot base frame and the wrist frame that are specified in the node's parameters). It is important that the robot does not move until the action is done; otherwise the captured pattern and the corresponding robot pose might be inconsistent. A small check that verifies this transform is available in tf is sketched after this list.

  • START_CALIBRATION: Start the calibration after at least 5 patterns have been collected.

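Before starting the calibration it can be useful to verify that tf actually provides the transform that CAPTURE_PATTERN relies on. The following is a minimal sketch, not part of the driver; the frame names base_link and tool0 are placeholders for the robot_frame and wrist_frame you configure on the camera node.

#!/usr/bin/env python
import rospy
import tf2_ros

# Placeholder frame names: use the robot_frame and wrist_frame that you
# configure on the Ensenso camera node.
ROBOT_FRAME = "base_link"
WRIST_FRAME = "tool0"


def wrist_pose_available(tf_buffer, timeout=2.0):
    """Return True if tf currently provides robot_frame -> wrist_frame."""
    try:
        tf_buffer.lookup_transform(ROBOT_FRAME, WRIST_FRAME, rospy.Time(0),
                                   rospy.Duration(timeout))
        return True
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException):
        return False


if __name__ == "__main__":
    rospy.init_node("check_hand_eye_tf")
    tf_buffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tf_buffer)  # keeps the buffer filled
    if wrist_pose_available(tf_buffer):
        rospy.loginfo("Robot pose is available in tf, CAPTURE_PATTERN can be used.")
    else:
        rospy.logwarn("No transform from %s to %s found in tf.", ROBOT_FRAME, WRIST_FRAME)
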
After the calibration is done, the camera's internal link is updated so that the transformation between the camera and the robot wrist (or the robot base, respectively) is taken into account for all future data.

We provide the following calibrate_handeye script that can be used for both the moving and the fixed hand-eye calibration:

   1 #!/usr/bin/env python
   2 import rospy
   3 import sys
   4 import actionlib
   5 from actionlib_msgs.msg import GoalStatus
   6 from ensenso_camera_msgs.msg import CalibrateHandEyeAction, CalibrateHandEyeGoal
   7 
   8 
   9 def reset_patterns(hand_eye_client):
  10     rospy.loginfo("Resetting pattern buffer...")
  11     goal = CalibrateHandEyeGoal()
  12     goal.command = CalibrateHandEyeGoal.RESET
  13     hand_eye_client.send_goal(goal)
  14     wait_for_result(hand_eye_client)
  15 
  16 
  17 def wait_for_result(hand_eye_client):
  18     hand_eye_client.wait_for_result()
  19     if hand_eye_client.get_state() != GoalStatus.SUCCEEDED:
  20         rospy.logerr("Action was not successful.")
  21     rospy.loginfo("Action finished.")
  22 
  23 
  24 def capture_pattern(hand_eye_client):
  25     rospy.loginfo("Capturing a pattern.")
  26     goal = CalibrateHandEyeGoal()
  27     goal.command = CalibrateHandEyeGoal.CAPTURE_PATTERN
  28     hand_eye_client.send_goal(goal)
  29     wait_for_result(hand_eye_client)
  30     try:
  31         # Check if a stereo pattern has been found.
  32         result = hand_eye_client.get_result()
  33         if not result.found_pattern:
  34             rospy.logerr("Did not find any patterns.")
  35     except Exception:
  36         rospy.logwarn("Could not read found_pattern from the action result.")
  37 
  38 
  39 def calibrate_hand_eye(hand_eye_client):
  40     rospy.loginfo("Running calibration and storing results in EEPROM.")
  41     goal = CalibrateHandEyeGoal()
  42     goal.command = CalibrateHandEyeGoal.START_CALIBRATION
  43     goal.write_calibration_to_eeprom = True
  44     hand_eye_client.send_goal(goal)
  45     wait_for_result(hand_eye_client)
  46 
  47 
  48 def main():
  49     capture_wait = rospy.get_param("~capture_wait", 10.0)
  50     count_poses = max(rospy.get_param("~count_poses", 5), 6)
  51     timeout = rospy.get_param("~timeout", 60)
  52 
  53     hand_eye_calib_client = actionlib.SimpleActionClient("calibrate_hand_eye",
  54                                                          CalibrateHandEyeAction)
  55     if not hand_eye_calib_client.wait_for_server(rospy.Duration(timeout)):
  56         rospy.logerr("The camera node is not running!")
  57         sys.exit()
  58 
  59     reset_patterns(hand_eye_calib_client)
  60 
  61     for i in range(count_poses):
  62         # ---- Let the Robot stand still here.
  63         rospy.loginfo("Pattern number: {}".format(i + 1))
  64         now = rospy.get_time()
  65         capture_pattern(hand_eye_calib_client)
  66 
  67         if i == count_poses - 1:
  68             rospy.loginfo("Finished capturing patterns")
  69             break
  70 
  71         # ---- Move the robot to the next position.
  72         time_until_capture = capture_wait - (rospy.get_time() - now)
  73         while time_until_capture > 0.8:
  74             start_loop = rospy.get_time()
  75             rospy.loginfo("Capturing next pattern in {}s.".format(time_until_capture))
  76             rospy.sleep(0.75)
  77             end_loop = rospy.get_time()
  78             time_until_capture -= end_loop - start_loop
  79         rospy.sleep(time_until_capture)
  80 
  81     calibrate_hand_eye(hand_eye_calib_client)
  82 
  83 
  84 if __name__ == "__main__":
  85     try:
  86         rospy.init_node("ensenso_camera_hand_eye_calibration")
  87         main()
  88     except rospy.ROSInterruptException:
  89         pass

First of all, the script creates a hand-eye calibration action client and waits for it to connect to the hand-eye calibration server of the camera node. Once the connection has been established, all previously captured patterns are deleted from the buffer (line 59) by sending an action goal with the command RESET, which tells the camera to reset its pattern buffer.

The following loop captures count_poses images of the pattern. count_poses is a private parameter that can be passed as a rosrun argument (see the commands below) and is clamped to a minimum of 6 (line 50).

At the top of the loop the robot has to stand still so that an image of the pattern (either placed in the robot's workspace or held by the robot) can be captured with the action goal command CAPTURE_PATTERN (line 65); the captured pattern is stored in the camera's pattern buffer. After the capture_pattern() call there is time for the robot to move to the next position. It is very important that the robot does not move while a pattern is being captured, because a moving robot leads to inconsistent pattern/pose pairs and therefore a bad calibration. The comments in the code indicate where the robot should stand still and where it may move. If necessary, increase the wait duration (the ~capture_wait parameter, line 49) to make sure your robot has come to rest before the next pattern capture is triggered.

When all poses have been captured, the loop is exited and the calibration is started (line 81). Sending an action goal with the command START_CALIBRATION computes the calibration from the patterns stored in the camera's buffer. With the parameter write_calibration_to_eeprom = True (line 43), the calibration is also stored in the camera's EEPROM as an internal link holding the transformation from the camera_frame to the wrist_frame of the robot. Since the EEPROM is non-volatile memory, the stored transformation survives a camera shutdown or restart. Note that from now on the camera internally treats the transform camera_frame -> wrist_frame as camera_frame -> link_frame.

Pre-Calibration

Sometimes it is useful to know whether the pattern will be visible from a certain robot position before actually moving there. You can do this with a pre-calibration.

To get an initial guess of the camera position relative to the robot wrist, you have to capture at least five patterns. Then you can start the hand-eye calibration. After it is done, the collected patterns will still be memorized, so you can continue to collect more patterns and start the calibration again when you have enough of them.

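This incremental workflow can be sketched with the goal commands introduced above. The sketch below omits the robot motion between the captures, which has to be added for a real run, and keeps the error handling of the full script out for brevity.

#!/usr/bin/env python
# Sketch of the incremental (pre-)calibration workflow described above.
import rospy
import actionlib
from ensenso_camera_msgs.msg import CalibrateHandEyeAction, CalibrateHandEyeGoal


def send_command(client, command):
    """Send a single goal command and wait until it has finished."""
    goal = CalibrateHandEyeGoal()
    goal.command = command
    client.send_goal(goal)
    client.wait_for_result()


if __name__ == "__main__":
    rospy.init_node("hand_eye_pre_calibration")
    client = actionlib.SimpleActionClient("calibrate_hand_eye", CalibrateHandEyeAction)
    client.wait_for_server()

    send_command(client, CalibrateHandEyeGoal.RESET)

    for _ in range(5):
        # ---- robot stands still here
        send_command(client, CalibrateHandEyeGoal.CAPTURE_PATTERN)
        # ---- move the robot to the next pose

    # Initial calibration; the captured patterns stay in the buffer.
    send_command(client, CalibrateHandEyeGoal.START_CALIBRATION)

    for _ in range(5):
        # ---- collect additional patterns from new robot poses
        send_command(client, CalibrateHandEyeGoal.CAPTURE_PATTERN)

    # Recalibrate with all collected patterns.
    send_command(client, CalibrateHandEyeGoal.START_CALIBRATION)
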
This time, though, you can use the project_pattern action (ensenso_camera_msgs/ProjectPattern) to estimate whether patterns would be visible from a hypothetical robot position. You already know the pattern pose in the camera's target frame from the initial calibration. Additionally, you can give the project_pattern action an arbitrary transformation between the camera and the target frame. It will then project the pattern into the camera as if the camera was at the given position and indicate whether the pattern would be fully visible or not via the pattern_is_visible flag contained in the action result.

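A call to the project_pattern action could look roughly like the sketch below. Only the action name and the pattern_is_visible result flag are taken from the description above; the goal field names (pattern_pose, target_frame_transformation) and the use of geometry_msgs/Pose are assumptions made for illustration, so check the ensenso_camera_msgs/ProjectPattern definition of your driver version for the actual fields.

#!/usr/bin/env python
import rospy
import actionlib
from geometry_msgs.msg import Pose
from ensenso_camera_msgs.msg import ProjectPatternAction, ProjectPatternGoal


def pattern_visible(client, pattern_pose, camera_pose):
    """Project the known pattern pose for a hypothetical camera position and
    return the pattern_is_visible flag from the action result."""
    goal = ProjectPatternGoal()
    goal.pattern_pose = pattern_pose                # assumed field name
    goal.target_frame_transformation = camera_pose  # assumed field name
    client.send_goal(goal)
    client.wait_for_result()
    return client.get_result().pattern_is_visible


if __name__ == "__main__":
    rospy.init_node("check_pattern_visibility")
    client = actionlib.SimpleActionClient("project_pattern", ProjectPatternAction)
    client.wait_for_server()
    # Fill in the pattern pose from the pre-calibration and the camera pose
    # of the robot position you want to test instead of the empty poses.
    rospy.loginfo("Pattern visible: %s", pattern_visible(client, Pose(), Pose()))
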
Moving Hand-Eye Calibration

In a moving hand-eye calibration setup the stereo camera is usually mounted on the robot wrist or on another moving part of the robot. The pattern is usually placed somewhere static in the robot's workspace.

Below is a launch file that sets up a stereo camera for a hand-eye calibration.

   1 <!-- Open an Ensenso stereo camera to perform a hand-eye calibration with. -->
   2 
   3 <launch>
   4   <arg name="serial" default=""
   5        doc="Serial of the stereo camera(default: first found camera)" />
   6   <arg name="settings" default=""
   7        doc="JSON file with camera parameters" />
   8   <arg name="camera_frame" default=""
   9        doc="The camera's tf frame (default: optical_frame_SERIAL)" />
  10   <arg name="link_frame" default=""
  11        doc="The camera's link tf frame (default: camera_frame)" />
  12   <arg name="target_frame" default=""
  13        doc="The tf frame the data will be returned in (default: camera_frame)" />
  14   <arg name="tcp_port" default="-1"
  15        doc="The TCP port to open on the NxLib (-1=off, 0=autoSelect, >0=portNumber)" />
  16   <arg name="threads" default="-1"
  17        doc="The number of threads used by the NxLib for this node (-1=autoDetect)" />
  18   <arg name="wait_for_camera" default="false"
  19        doc="Whether this node should wait for the camera to become available" />
  20   <!-- Required to be set by user. -->
  21   <arg name="robot_frame" doc="The robot's base tf frame" />
  22   <arg name="wrist_frame" doc="The robot's wrist tf frame" />
  23   <arg name="is_fixed" doc="The hand-eye calibration type: fixed or moving" />
  24 
  25   <node pkg="nodelet" type="nodelet" name="manager_" args="manager" output="screen" />
  26 
  27   <node pkg="nodelet" type="nodelet" name="Ensenso_$(arg serial)"
  28         args="load ensenso_camera/stereo_camera_nodelet /manager_" output="screen">
  29     <param name="serial" type="string" value="$(arg serial)" />
  30     <param name="settings" type="string" value="$(arg settings)" />
  31     <param name="camera_frame" type="string" value="$(arg camera_frame)" />
  32     <param name="link_frame" type="string" value="$(arg link_frame)" />
  33     <param name="target_frame" type="string" value="$(arg target_frame)" />
  34     <param name="tcp_port" type="int" value="$(arg tcp_port)" />
  35     <param name="threads" type="int" value="$(arg threads)" />
  36     <param name="wait_for_camera" type="bool" value="$(arg wait_for_camera)" />
  37     <param name="robot_frame" type="string" value="$(arg robot_frame)" />
  38     <param name="wrist_frame" type="string" value="$(arg wrist_frame)" />
  39     <param name="fixed" type="bool" value="$(arg is_fixed)" />
  40   </node>
  41 </launch>

To open a stereo camera with the above launch file for a moving hand-eye calibration, you have to provide the following parameters:

  • wrist_frame: should be the tf frame of the camera's mounting position.

  • robot_frame: should be the tf frame of the robot's base.

  • is_fixed: set it to false for a moving hand-eye calibration.

In general the call looks like this:

roslaunch ensenso_camera \
    calibrate_hand_eye.launch \
    serial:=<serial of camera> \
    robot_frame:=<tf frame of robot base> \
    wrist_frame:=<tf frame of camera on robot> \
    is_fixed:=false

The camera should now be set up correctly. You can run the calibrate_handeye script described above as shown below. In parallel, you have to control your robot so that it moves as required by the script.

rosrun ensenso_camera calibrate_handeye _count_poses:=10

With our provided nodelet launch file you can now open the freshly calibrated stereo camera. The calculated transformation from camera_frame to wrist_frame will be published by the camera via tf (as camera_frame to link_frame).

If you want to receive the 3D data relative to the robot base used during the calibration (robot_frame), set the target_frame to the robot base frame. You can, however, use any frame you want. The camera will fetch the transformation between the link_frame and the target_frame from tf and transform the 3D data accordingly.

   1 <!-- Open an Ensenso stereo camera in a nodelet. -->
   2 
   3 <launch>
   4   <arg name="serial" doc="Serial of the stereo camera." default="" />
   5   <arg name="camera_frame" default="optical_frame_$(arg serial)" />
   6   <arg name="link_frame" default="$(arg camera_frame)" />
   7   <arg name="target_frame" default="$(arg camera_frame)" />
   8 
   9   <node pkg="nodelet" type="nodelet" name="manager_"  args="manager" output="screen" />
  10   <node pkg="nodelet" type="nodelet" name="Ensenso_$(arg camera)"
  11         args="load ensenso_camera/stereo_camera_nodelet /manager_" output="screen">
  12     <param name="serial" type="string" value="$(arg camera)" />
  13     <param name="target_frame" type="string" value="$(arg target_frame)" />
  14   </node>
  15 </launch>

You can run this launch file like this:

roslaunch ensenso_camera \
    nodelet.launch \
    serial:=<serial of camera> \
    target_frame:=<tf frame you want to receive the data in>

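To verify that the calibration and the target_frame setting have taken effect, you can check in which tf frame the camera publishes its data. The sketch below assumes the camera's point cloud topic is called point_cloud inside the camera's namespace; adjust the topic name to your setup.

#!/usr/bin/env python
import rospy
from sensor_msgs.msg import PointCloud2


def callback(cloud):
    # After a successful calibration this should print the configured target
    # frame (or the camera/link frame if no target frame was set).
    rospy.loginfo("Point cloud is published in frame: %s", cloud.header.frame_id)


if __name__ == "__main__":
    rospy.init_node("check_target_frame")
    # The topic name is an assumption; remap it to match your camera namespace.
    rospy.Subscriber("point_cloud", PointCloud2, callback)
    rospy.spin()
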
Fixed Hand-Eye Calibration

In a fixed hand-eye calibration setup the stereo camera is mounted somewhere static in the scene, whereas the pattern is either mounted on the robot wrist or held by the robot in its gripper.

You can use the same launch file to set up a stereo camera for a fixed hand-eye calibration as used above for the moving case. For the sake of readability, a shortened version with only the required arguments is shown again below.

   1 <!-- Open an Ensenso stereo camera to perform a hand-eye calibration with. -->
   2 
   3 <launch>
   4   <arg name="serial" doc="Serial of the stereo camera to calibrate." />
   5   <arg name="robot_frame" doc="The robot's base frame." />
   6   <arg name="wrist_frame" doc="The robot's wrist frame." />
   7   <arg name="is_fixed" doc="The hand-eye calibration type: fixed or moving." />
   8 
   9   <node pkg="nodelet" type="nodelet" name="manager_"  args="manager" output="screen" />
  10   <node pkg="nodelet" type="nodelet" name="Ensenso_$(arg serial)"
  11         args="load ensenso_camera/stereo_camera_nodelet /manager_" output="screen">
  12     <param name="serial" type="string" value="$(arg serial)" />
  13     <param name="robot_frame" type="string" value="$(arg robot_frame)" />
  14     <param name="wrist_frame" type="string" value="$(arg wrist_frame)" />
  15     <param name="fixed" type="bool" value="$(arg is_fixed)" />
  16   </node>
  17 </launch>

To open a stereo camera with the above launch file for a fixed hand-eye calibration, you have to provide the following parameters:

  • wrist_frame: should be the tf frame of the pattern's mounting/gripping position.

  • robot_frame: should be the tf frame of the robot's base.

  • is_fixed: set it to true for a fixed hand-eye calibration.

In general the call looks like this:

roslaunch ensenso_camera \
    calibrate_hand_eye.launch \
    serial:=<serial of camera> \
    robot_frame:=<tf frame of robot base> \
    wrist_frame:=<tf frame of pattern on the robot> \
    is_fixed:=true

The camera should now be set up correctly. You can run the calibrate_handeye script described above as shown below. In parallel, you have to control your robot so that it moves as required by the script.

rosrun ensenso_camera calibrate_handeye _count_poses:=10

With our provided nodelet launch file you can now open the freshly calibrated stereo camera. The calculated transformation from camera_frame to robot_frame will be published by the camera via tf (as camera_frame to link_frame).

   1 <!-- Open an Ensenso stereo camera in a nodelet. -->
   2 
   3 <launch>
   4   <arg name="serial" doc="Serial of the stereo camera." default="" />
   5   <arg name="camera_frame" default="optical_frame_$(arg serial)" />
   6   <arg name="link_frame" default="$(arg camera_frame)" />
   7   <arg name="target_frame" default="$(arg camera_frame)" />
   8 
   9   <node pkg="nodelet" type="nodelet" name="manager_"  args="manager" output="screen" />
  10   <node pkg="nodelet" type="nodelet" name="Ensenso_$(arg camera)"
  11         args="load ensenso_camera/stereo_camera_nodelet /manager_" output="screen">
  12     <param name="serial" type="string" value="$(arg camera)" />
  13     <param name="target_frame" type="string" value="$(arg target_frame)" />
  14   </node>
  15 </launch>

You can run this launch file like this:

roslaunch ensenso_camera \
    nodelet.launch \
    serial:=<serial of camera>
