Only released in EOL distros: fuerte, groovy
Package Summary
This package uses the ar_kinect module to calibrate a fixed joint between the base link and a Kinect camera which monitors a Katana arm equipped with an AR code.
- Author: Michael Goerner
- License: GPL
- Source: git https://github.com/uos/katana_driver.git (branches: fuerte, groovy)
Contents
This package uses the ar_kinect module to calibrate a fixed joint between the base link and a Kinect camera which monitors a Katana arm equipped with an AR code. Make sure to merge the latest uos-public-fuerte.rosinstall to get the correct ar_kinect and ccny_vision stacks.
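For example, assuming a rosws-managed workspace (the exact path to the rosinstall file will differ in your setup):
rosws merge /path/to/uos-public-fuerte.rosinstall
rosws update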
Tutorial
First of all, you need to attach an AR code to the arm. The calibration trajectory was chosen to best expose the front side of katana_motor4_lift_joint.
If you want to use the standard setup, print the AR code from data/4x4_9.ps at a side length of 42 mm and stick it to the proper place.
Here comes the tricky part: you need to provide a transform between /base_link and the center of the pattern, which is expected to be called /katana_pattern. This could look something like:
rosrun tf static_transform_publisher .075 -.035 0.000 .151 1.571 -1.571 /katana_motor4_lift_link /katana_pattern 100
You can also just include it with the rest of your robot's URDF description, as sketched below.
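A minimal sketch of such a fixed joint, using the values from the static_transform_publisher call above (the joint name is illustrative). Note that static_transform_publisher takes its angles as yaw pitch roll, while the URDF origin expects rpy as roll pitch yaw, so the same rotation appears reordered here:
<joint name="katana_pattern_joint" type="fixed">
  <parent link="katana_motor4_lift_link" />
  <child link="katana_pattern" />
  <!-- same transform as above; rpy is roll pitch yaw -->
  <origin xyz="0.075 -0.035 0.0" rpy="-1.571 1.571 0.151" />
</joint>
<link name="katana_pattern" />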
Now, fire up your usual setup including the kinect node and the katana driver. Make sure all reachable positions in front of the katana arm can be seen by the Kinect and move all possible obstacles out of the way! The arm will not perform any kind of obstacle avoidance and will move through most of the reachable space during calibration.
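For example (a sketch; the exact launch files depend on your setup, and katana.launch is an assumption here):
roslaunch openni_launch openni.launch
roslaunch katana katana.launch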
Now, you can simply run the following to calibrate:
roslaunch katana_kinect_calibration calibrate.launch
If you want to monitor the calibration procedure in rviz, make sure your kinect_link frame is not connected to the rest of your robot's tf tree and add publish_tf:=true to the above. This will show you the averaged position of the Kinect w.r.t. your robot for each pose independently.
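For example:
roslaunch katana_kinect_calibration calibrate.launch publish_tf:=true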
The calibration will move the arm through 6 poses, and the module will try to detect the AR marker a sufficient number of times. If the calibration does not find the pattern often enough (or not at all), make sure the illumination is not too bright and, if possible, indirect! (Shading the pattern with your hand might even help.)
When finished, the arm will move back to its previous pose and a config file is written to $HOME/.ros/kinect_pose.urdf.xacro. This can be used in a robot description to glue the Kinect to the base link, e.g.
<joint name="kurtana_kinect_joint" type="fixed">
  <parent link="base_link" />
  <child link="kinect_link" />
  <include filename="$(env HOME)/.ros/kinect_pose.urdf.xacro" />
  <origin xyz="${kinect_xyz}" rpy="${kinect_rpy}"/>
</joint>
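Since the snippet relies on xacro property substitution (${kinect_xyz}, ${kinect_rpy}), the robot description has to be processed by xacro. A minimal sketch, assuming your description lives in a hypothetical file robot.urdf.xacro:
rosrun xacro xacro.py robot.urdf.xacro > robot.urdf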
The launch file also provides all other parameters of the calibrate.py node as args, so you can influence the whole procedure quite a bit.
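For instance, to collect fewer samples per pose (assuming the launch arg mirrors the ~samples_required parameter listed below):
roslaunch katana_kinect_calibration calibrate.launch samples_required:=150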
ROS API
This package provides a single node, calibrate.py, which performs the actual calibration procedure, writes the calibrated parameters and terminates.
calibrate.py
Performs calibration of the Kinect pose using the Katana arm.
Subscribed Topics
/katana_joint_states (sensor_msgs/JointState)
- needed to restore the old pose after the calibration
- the recognized AR codes, as published by ar_kinect
Actions Called
katana_arm_controller/joint_movement_action (katana_msgs/JointMovementAction)
- move the arm to a specific JointState
Parameters
~samples_required (int, default: 300)
- How many samples should be collected in each pose
- How many seconds should we wait for the samples
- How many times should all poses be measured
- Where to write the final configuration to
- Should we (over?)write the config file
- Should we publish the transform we currently averaged
Required tf Transforms
/katana_pattern → /base_link
- (most likely transitive) transform that fixes the AR pattern
Provided tf Transforms
/kinect_link → /base_link
- optionally monitor calibration using publish_tf:=true
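While calibrating with publish_tf:=true, you can inspect the currently averaged transform, e.g.:
rosrun tf tf_echo /base_link /kinect_link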