It is appreciated that problems/questions regarding this tutorial are asked on answers.ros.org. Don't forget to include in your question the link to this page, versions of your OS & ROS, and also add appropriate tags.
PR2 Full System Calibration
Description: Calibrating the PR2's cameras and kinematic parameters
Tutorial Level: INTERMEDIATE
- Restart the robot with the newest URDF
- Calibrate Camera Intrinsics
- Capture Data
- Estimate the URDF
- Update the system
- Calibration Sanity Check
This tutorial assumes that you are running at least pr2_calibration 1.0.0. Make sure you have an up-to-date installation on the robot.
Restart the robot with the newest URDF
Before you start calibrating the robot, the robot should be started from the newest URDF file available. In the '/etc/ros/electric/urdf' folder (there is a specific folder for each distro), you will find a number of URDF files named 'robot_uncalibrated_x.x.x.xml'. Modify the symbolic link 'robot.xml' to point to the newest uncalibrated URDF file (as root):
cd /etc/ros/electric/urdf
ln -sf robot_uncalibrated_x.x.x.xml robot.xml
You can verify the current symlink using the following command:
ls -la /etc/ros/electric/urdf/robot.xml
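Since the uncalibrated URDFs are version-numbered, the newest one can be picked with a version sort. The sketch below runs against a temporary directory with made-up version numbers so it is safe to try anywhere; on the robot, the directory is /etc/ros/electric/urdf and the commands need root.

```shell
# Safe sketch of the symlink update: pick the highest-versioned
# robot_uncalibrated_*.xml and point robot.xml at it.
# The 1.5.0 / 1.6.2 version numbers are hypothetical stand-ins.
dir=$(mktemp -d)
touch "$dir/robot_uncalibrated_1.5.0.xml" "$dir/robot_uncalibrated_1.6.2.xml"
newest=$(ls "$dir"/robot_uncalibrated_*.xml | sort -V | tail -n 1)
ln -sf "$newest" "$dir/robot.xml"
readlink "$dir/robot.xml"   # prints the path of the 1.6.2 file
```

`sort -V` (GNU coreutils version sort) orders dotted version strings numerically, so it picks 1.6.2 over 1.5.0 even when plain lexicographic sorting would not.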
Power cycle the MCBs using pr2_dashboard, and then restart the robot:
<<Use Dashboard to put LRB Breakers into Disable, and then into Enable>>
sudo robot stop
sudo robot start
Calibrate Camera Intrinsics
The stereo and forearm cameras must be calibrated before running the full robot calibration. See the Calibrating the PR2's Cameras tutorial.
Log into the robot with X-Forwarding, and start the data capture application
ssh prx -X
robot start
roslaunch pr2_calibration_launch capture_data.launch
If your robot has a Kinect/Xtion mounted on its head, you also need to launch the corresponding node.
Note: The texture projector must be off for data capture to work!
This will start all the processing nodes, along with the data capture executive. It will also open 8 image_view windows, each showing the state of the checkerboard detector for one sensor stream. Each currently active sensor must consistently detect the checkerboard in order for the executive to capture a sample.
Follow the command line prompts to go through the calibration.
Understanding the Sensor Streams
Once you start capture_data.launch, 8 image_view windows will open and will look like the following (although not as neatly arranged):
Notice that both forearm camera streams have been grayed out. Once the executive activates these streams later in the capture process, you will begin to see data from these cameras.
There is an openCV checkerboard detector running on each sensor stream. The result of the detector is shown as circles in each image frame (see image below). A successful detection is signified by a green circle at every corner, along with a red circle at one corner. Partial failures show red circles, while complete failures show no circles at all.
Capturing Large Checkerboards
During this phase, the application will prompt you to place a large checkerboard in view of all the head cameras and the tilting laser. After positioning the board in a good location, press <enter> to begin capture. The app will either detect the checkerboard or time out after 40 seconds.
It is suggested that you collect 4 large checkerboards, each about 2 to 3 meters away from the robot. If you are feeling ambitious, you can collect 3 more checkerboards (left, centered and right) that are 1.5 to 2 meters from the robot.
Note: The checkerboard on the floor can often be very tricky to capture, due to glare from overhead lighting. If this sample is causing trouble, feel free to skip it.
Both stereo pairs (along with the tilting laser) must detect the large checkerboard in order to successfully capture a sample. Thus, you will definitely have to move the head around in order to see the checkerboard.
The teleop joystick is brought up automatically with the capture_data app, so you can use it to move the head (and the base) around as needed.
Understanding the Tilting Laser Intensity Images
For each scan, the tilting laser provides both range and intensity data along a horizontal line. We then stack intensity data from successive scans together, creating an image. Note that the Hokuyo UTM-30LX scans from right to left, resulting in the intensity image being flipped horizontally.
Placing the large checkerboard nearly orthogonal to the laser rays will produce very bright specular reflections. Solve this by turning the checkerboard until the bright spot disappears from the intensity image.
Finish collecting Large Checkerboards
Once you've collected enough large checkerboards, press "N" at the prompt to continue. Don't forget to remove the large checkerboard from the robot's view, as it might mess up the next step in the calibration.
Capturing Left Hand Checkerboard
You should now see the left arm move into view of the head cameras. Place the 5x4 checkerboard in the left gripper. To ensure that the checkerboard doesn't move during the calibration procedure, the gripper pads should completely contact the non-slip tape on the checkerboard plate (see image below). You can use the joystick to close the gripper: the left D-pad button closes the gripper, while the right D-pad button opens it.
Also, make sure to hide the large checkerboard from the view of the robot, since it can confuse the checkerboard detector.
Press <enter> once the checkerboard is securely held in the left gripper.
WARNING: Once the checkerboard is firmly grasped by the gripper and the robot has started collecting data, DO NOT DISTURB THE GRASP. The calibration procedure estimates the offset from the gripper to the checkerboard, and assumes a constant grasp throughout the calibration procedure. If the checkerboard hits anything, the grasp can change, and the calibration data capture must be restarted.
Capturing Right Hand Checkerboard
Move the checkerboard from the left hand to the right. Press <enter> once it is securely held in the right gripper.
You will see a message at the console once the executive has completed. Remove the checkerboard from the gripper.
You now have a bagfile at /tmp/pr2_calibration/cal_measurements.bag with calibration data. You can always use "rosbag info" to see how many calibration samples you collected.
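As a quick sanity check, the sketch below (assuming a standard ROS install with the rosbag tool on the path) inspects the captured bag if it is present:

```shell
# Inspect the captured calibration data; "rosbag info" reports topic and
# message counts, which show how many samples were captured.
BAG=/tmp/pr2_calibration/cal_measurements.bag
if [ -f "$BAG" ]; then
    rosbag info "$BAG"
else
    echo "bag not found: $BAG"
fi
```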
Estimate the URDF
This step will run the nonlinear optimization that estimates the system parameters, and then generate a new URDF in the current directory.
This step runs twice as fast when the robot isn't running. It is suggested that you bring down the robot and then start your own core (if you don't have a core running, some of the scripts will unexpectedly hang).
robot stop
roscore
Now, in another terminal (make sure you're in a directory where you have write-permissions), start the estimation process. This will take a while to run.
rosrun pr2_calibration_launch estimate_pr2_beta_urdf.sh
After each calibration step completes, you'll see what the RMS error is for each sensor. Each step will take 5-10 minutes.
It will take approximately 25 minutes for the optimization to complete. This will put the generated URDF (robot_calibrated.xml) in the current working directory.
If you want to view the results, you can look at an experimental scatter plot of various errors.
rosrun pr2_calibration_launch view_head_laser_scatter.sh
rosrun pr2_calibration_launch view_head_arm_scatter.sh
rosrun pr2_calibration_launch view_forearm_scatter.sh
When viewing the scatter plots, make sure the errors on the first plot (head_laser) are within 20px of the (0,0) point, and the errors on the remaining plots are within 10px of the (0,0) point. If the plots show large errors, or bimodal distributions, your calibration will not be very good.
Update the system
Update the URDF
For people to start using this new URDF, /etc/ros/[DISTRO]/urdf/robot.xml should be symlinked to your newly generated URDF (robot_calibrated.xml) by your PR2 Administrator.
cp robot_calibrated.xml /etc/ros/[DISTRO]/urdf/robot_calibrated_[DATE]_[VERSION].xml
ln -fs /etc/ros/[DISTRO]/urdf/robot_calibrated_[DATE]_[VERSION].xml /etc/ros/[DISTRO]/urdf/robot.xml
Update Stereo Baselines
The stereo baselines are stored on the wge100 cameras themselves, so each camera's EEPROM needs to be updated.
While the robot is running, call the following:
rosrun pr2_calibration_launch write_pr2_cam_intrinsics.sh
Power Cycle the MCBs
The motor controller boards (MCBs) store the joint calibration data. Thus, the joints won't recalibrate unless you power cycle the MCBs. Power cycle the MCBs using pr2_dashboard.
Now restart the robot:
robot stop
robot start
You will now see the robot start up using its new calibrated reference positions.
Some configuration files depend on the PR2 calibration and become invalid after re-calibrating the PR2. To make sure nobody uses the invalid files, it is best to delete them:
This will remove the capability of your PR2 to plug itself into an outlet autonomously. If you care about this capability, you can create a new calibration for plugging in, following instructions that will appear in a pr2_plugs tutorial.
Calibration Sanity Check
Before shipping a robot, Willow Garage does a calibration sanity check using rviz. The sanity check is outlined at pr2_bringup_tests/calibration_bringup.
See the pr2_calibration/Troubleshooting page for help in resolving any issues.