Note: The calibration process for the Care-O-bot 4 is not yet complete. This tutorial only describes the current, unfinished state of the calibration procedure.
Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question a link to this page, the versions of your OS & ROS, and appropriate tags.
Automatic camera and kinematic calibration of Care-O-bot 4
Description: This tutorial explains how to run the calibration for the Care-O-bot 4. Note that the whole calibration process is not working yet; this tutorial only covers the current unfinished state of cob_calibration.
Tutorial Level: INTERMEDIATE
Overview
This tutorial explains how to run cob_calibration on the Care-O-bot 4. It currently covers configuration, position generation, visualisation of the generated positions, data acquisition, and intrinsic monocular camera calibration.
As automatic DH-parameter generation remains to be implemented, the actual extrinsic calibration algorithm cannot yet be run for the cob4.
The current state of the implementation of cob_calibration for the Care-O-bot 4 is as follows:
We can automatically generate a list of proposed calibration positions for the defined arm(s). After generating the positions, we move the arm(s) through each generated position, checking whether the checkerboard is visible to any of the defined cameras; if so, the image data and the joint values are saved into a bag file. Then we run the intrinsic calibration, which, given a sufficient number of good image samples in the bag file, calculates the intrinsic calibration parameters for the defined camera(s) and uploads them to the camera driver.
Running the calibration
Generate positions
Launch the calibration position generator by calling:
roslaunch cob_calibration_executive generate_calibration_positions.launch
The position generation tries to find an inverse kinematic solution for each point obtained by discretising an imaginary cuboid in the robot's workspace. The cuboid can be visualised in rviz during the position generation by adding a MarkerArray display and setting the Marker Topic to /marker_array_limits.
For each solution found, the whole arm trajectory is saved into a yaml file, which is used later by the data collection. The position generation may take several minutes; the run time grows with the level of discretisation.
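To make the discretisation step concrete, here is a minimal, hypothetical sketch of turning a cuboid into a grid of candidate end-effector positions. The corner coordinates, step counts, and function name are illustrative assumptions, not the actual cob_calibration code; the real implementation additionally runs an inverse kinematics check on each point.

```python
# Hypothetical sketch (not the actual cob_calibration code): discretise a
# cuboid, given by two opposite corners and a step count per axis, into a
# grid of candidate end-effector positions.
import itertools

def discretise_cuboid(corner_min, corner_max, steps):
    """Return a list of (x, y, z) candidate points covering the cuboid."""
    axes = []
    for lo, hi, n in zip(corner_min, corner_max, steps):
        if n == 1:
            axes.append([lo])
        else:
            # n evenly spaced values from lo to hi inclusive
            axes.append([lo + i * (hi - lo) / (n - 1) for i in range(n)])
    return list(itertools.product(*axes))

# Example cuboid in front of the robot (made-up numbers), 3 steps per axis.
points = discretise_cuboid((0.4, -0.3, 0.8), (0.8, 0.3, 1.4), (3, 3, 3))
print(len(points))  # 27 candidate positions
```

Each of these points would then be fed to the IK solver; only points with a valid solution become calibration positions.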
Visualise generated positions
Launch the generated positions visualisation by calling:
roslaunch cob_calibration_executive visualize_cb_positions.launch
This visualises the generated positions in rviz for diagnostic purposes.
In rviz, add a MarkerArray display and select /marker_array_cb_positions as the Marker Topic.
Note that this visualises the end-effector position (not the full pose) at each calibration position; the actual checkerboard will sit lower or higher depending on the desired checkerboard pose defined in the configuration file.
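The offset between the visualised end-effector position and the actual board can be pictured with a small sketch. The fixed vertical offset below is a made-up example value, not taken from the configuration file; the real relation between end effector and checkerboard is a full rigid-body transform.

```python
# Illustrative sketch (not cob_calibration code): estimate where the
# checkerboard sits relative to the visualised end-effector position by
# applying a fixed vertical offset. The -0.1 m offset is a made-up example.
def checkerboard_position(ee_position, board_offset_z=-0.1):
    x, y, z = ee_position
    return (x, y, z + board_offset_z)

print(checkerboard_position((0.6, 0.0, 1.1)))
```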
Collect calibration data
Launch the data collection by calling:
roslaunch cob_calibration_executive collect_data.launch
The robot arm will now move through each calibration position, iterating over the list of generated trajectories one arm at a time.
At each calibration position the robot checks whether the checkerboard is entirely visible to any of the cameras defined in cameras.yaml; if so, it saves the raw image data and the arm joint values to a bag file.
The calibration data will be saved in /tmp/cal/cal_measurement.bag.
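The "checkerboard entirely visible" test can be sketched as follows. This is a hedged illustration only: the function name, margin value, and corner list are assumptions, and the real data collection first has to detect the board corners in the camera image.

```python
# Hedged sketch (not the actual cob_calibration code): given the pixel
# coordinates of the detected checkerboard corners, accept a sample only if
# every corner lies safely inside the image bounds.
def board_fully_visible(corners, width, height, margin=5):
    """Return True if all (u, v) corners lie within the image, with a margin."""
    return all(margin <= u < width - margin and margin <= v < height - margin
               for u, v in corners)

# Example: four outer corners of a detected board in a 640x480 image.
corners = [(100, 80), (540, 80), (100, 400), (540, 400)]
print(board_fully_visible(corners, 640, 480))  # True
```

Only samples passing such a check end up in /tmp/cal/cal_measurement.bag together with the corresponding joint values.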
Intrinsic camera calibration
Launch the intrinsic camera calibration by calling:
roslaunch cob_camera_calibration calibrate_mono.launch
This runs the intrinsic calibration for every camera for which the generated bag file contains enough data. The algorithm needs a certain number of good calibration samples to produce a reliable calibration; if there are not enough samples, it will tell you so and will not upload the calibration to the camera driver.
If only some of the cameras have sufficient samples, the calibration is uploaded to the drivers of those cameras only.
Note that not only the number of samples plays a part here but also their diversity (distance and pose of the checkerboard relative to the camera).
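A sample-sufficiency check of this kind could be sketched as below. Everything here is an illustrative assumption (function name, thresholds, and the idea of summarising each sample by its camera-to-board distance); a real check would also look at variation in board orientation and position in the image.

```python
# Illustrative sketch (not cob_calibration code): accept a sample set only if
# it is both large enough and spans a wide enough range of camera-to-board
# distances. Thresholds are made-up example values.
def diverse_enough(distances, min_samples=20, min_spread=0.3):
    """distances: estimated camera-to-checkerboard distance per sample, in metres."""
    if len(distances) < min_samples:
        return False  # too few samples for a stable calibration
    return max(distances) - min(distances) >= min_spread

# 25 samples spanning roughly 0.5 m to 0.98 m from the camera.
samples = [0.5 + 0.02 * i for i in range(25)]
print(diverse_enough(samples))  # True
```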