Only released in EOL distros:  

chair_grasping: chair_recognition | estimate_grasp_positions | grasp_motion

Package Summary

The aim of this stack is to grasp an office chair safely. To reach this aim we must recognize the chair and generate a suitable grasp motion. For easier use we split the task into three subparts, each covering a significant section of the whole task: one part for the recognition, one for the estimation of the grasp positions, and one for the generation of the motion.

Project Statement

Today, mobile robotics faces a number of problems. Things that are trivial for us as humans are very difficult for a robot. These include localisation in the environment via the recognition of objects, and the actual manipulation of those objects.
We want the robotic system to be a kind of assistant for humans. For example, in a one-person household the robot could assist handicapped persons. It is important that this works without the need to adjust the robot to its surroundings; on the contrary, the robot must be able to adapt itself to its environment.
With the help of cameras and sensors, the robot is able to perceive its environment. It learns where the obstacles are, but not what kind of obstacles they are. With suitable algorithms we try to extract semantic information: which points belong to which obstacle, and what kind of object lies behind them.
However, due to sensor noise and occlusions in the scene, extracting semantic information from sensor data is difficult. Faulty or missing information must be detected and corrected by suitable algorithms so that the robot can structure its environment.

Our Approach

In our project we realised a chair grasping motion.
To do so, we used an RGB-D sensor to get information about the environment. Based on these data we tried to find a previously recorded model of the chair. The recognition of the object is performed using NARF features.
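
As a rough illustration, NARF keypoints can be extracted from a range image with PCL along these lines (a minimal sketch following the standard PCL NARF tutorial; the sensor pose, angular resolution, and support size are assumptions, not the actual parameters of this stack):

    #include <pcl/common/angles.h>
    #include <pcl/range_image/range_image.h>
    #include <pcl/features/range_image_border_extractor.h>
    #include <pcl/keypoints/narf_keypoint.h>

    // Input cloud from the Kinect (assumed type).
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);

    // Build a range image from the point cloud (sensor pose assumed identity).
    pcl::RangeImage range_image;
    range_image.createFromPointCloud(*cloud, pcl::deg2rad(0.5f),
                                     pcl::deg2rad(360.0f), pcl::deg2rad(180.0f),
                                     Eigen::Affine3f::Identity(),
                                     pcl::RangeImage::CAMERA_FRAME, 0.0f, 0.0f, 1);

    // Detect NARF keypoints on the range image.
    pcl::RangeImageBorderExtractor border_extractor;
    pcl::NarfKeypoint detector(&border_extractor);
    detector.setRangeImage(&range_image);
    detector.getParameters().support_size = 0.2f;  // assumed support size in metres

    pcl::PointCloud<int> keypoint_indices;
    detector.compute(keypoint_indices);
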
After the robot has found the model within the environment data, it is mapped into the coordinates of the environment. Now we are able to say at which position the object (chair) is. We made the assumption that the robot shall only grasp the chair at the top of the chairback. For grasping we are only interested in a few special points of the chair, so we have to remove all irrelevant information. For this reduction we used a passthrough filter, which filters the data along a specified dimension: it cuts off all values that are either inside or outside a given user-defined range. As a result we get only the top of the backrest, which contains the points we are interested in.
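
In PCL this looks roughly as follows (a minimal sketch; the field name and filter limits are illustrative assumptions for cutting out the top of the backrest):

    #include <pcl/point_types.h>
    #include <pcl/filters/passthrough.h>

    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);     // input from the Kinect
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_filtered(new pcl::PointCloud<pcl::PointXYZ>);

    // Keep only points whose z value lies inside a user-defined band around
    // the top of the backrest (the limits below are illustrative).
    pcl::PassThrough<pcl::PointXYZ> pass;
    pass.setInputCloud(cloud);
    pass.setFilterFieldName("z");
    pass.setFilterLimits(0.60f, 0.80f);  // assumed height band in metres
    pass.setNegative(false);             // keep points inside the range, drop the rest
    pass.filter(*cloud_filtered);
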
Next we had to estimate the grasp positions. Here we assumed that the robot shall only grasp from the side, so we reduced our dataset to just two points: the leftmost and the rightmost point of the chairback. For a safe grasp we have to ensure that these points lie at least a robot-hand-width below the highest point of the chair.
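
A minimal sketch of this selection (the hand width, point type, and variable names are assumptions):

    #include <algorithm>
    #include <limits>
    #include <pcl/point_types.h>

    // Filtered backrest cloud from the passthrough step (assumed input).
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);

    const float hand_width = 0.06f;  // assumed robot hand width in metres

    // Highest point of the backrest cloud.
    float z_top = -std::numeric_limits<float>::max();
    for (const auto& p : cloud->points)
      z_top = std::max(z_top, p.z);

    // Leftmost and rightmost points at least one hand width below the top.
    pcl::PointXYZ left, right;
    left.y  =  std::numeric_limits<float>::max();
    right.y = -std::numeric_limits<float>::max();
    for (const auto& p : cloud->points) {
      if (p.z > z_top - hand_width) continue;  // too close to the top edge
      if (p.y < left.y)  left  = p;
      if (p.y > right.y) right = p;
    }
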
We used these two grasp points as parameters for generating the motion. For this we made use of some basic helpers of the bosch manipulation utils to control the movement of the arms.
Now we are able to grasp the chair.

Documentation

Prerequisites

  • a PR2 robot
  • a Kinect on the head of the PR2
  • a well-lit room
  • an office chair
  • a model of the chair
  • enough room around the PR2

Overview

In this stack we generate a grasp motion for grasping an object (a chair).
The main idea is to bring the robot into a pregrasp position; from this position it is possible to grasp the object safely. In the following we describe the individual parts in more detail.

As a reminder, these are our individual parts:

  1. chair_recognition

  2. estimate_grasp_positions

  3. grasp_motion

Cases

In order to guarantee a safe motion we must distinguish the orientation of the chair. For simplification we only consider two different cases of the chair's orientation.

  • figure: the two different orientation cases

chair_recognition

In the chair_recognition package we detect whether an object matching our model is in front of the robot. If one is found, it is published.
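
A hypothetical sketch of publishing the detected object as a point cloud (topic name, frame, and variable names are assumptions, not the package's actual interface):

    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>
    #include <pcl_conversions/pcl_conversions.h>

    ros::NodeHandle nh;

    // Advertise the detected object cloud (topic name is an assumption).
    ros::Publisher pub = nh.advertise<sensor_msgs::PointCloud2>("detected_chair", 1);

    // Cloud of the matched model (assumed to come from the recognition step).
    pcl::PointCloud<pcl::PointXYZ>::Ptr detected_cloud(new pcl::PointCloud<pcl::PointXYZ>);

    // Convert the PCL cloud to a ROS message and publish it.
    sensor_msgs::PointCloud2 msg;
    pcl::toROSMsg(*detected_cloud, msg);
    msg.header.frame_id = "torso_lift_link";  // assumed frame
    msg.header.stamp = ros::Time::now();
    pub.publish(msg);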

estimate_grasp_positions

Here we receive the published object, reduce it, and consider only a strip of the backrest of the chair. To reduce the chair model to this strip we use the PassThrough filter.
Considering only this strip, we estimate the positions for the gripper.

grasp_motion

From the grasp positions we generate a trajectory for the arms of the robot. The difference between the pregrasp position and the grasp position is only a variation of the y-coordinate.
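
For illustration, the two poses might be built like this (a minimal sketch; the frame, the approach offset, and the orientation are assumptions):

    #include <geometry_msgs/PoseStamped.h>
    #include <pcl/point_types.h>

    // Hypothetical grasp point computed by estimate_grasp_positions.
    pcl::PointXYZ grasp_point;

    // Pregrasp pose: beside the grasp point, offset along y only.
    geometry_msgs::PoseStamped pregrasp;
    pregrasp.header.frame_id = "torso_lift_link";
    pregrasp.pose.position.x = grasp_point.x;
    pregrasp.pose.position.y = grasp_point.y + 0.10;  // assumed 10 cm approach offset
    pregrasp.pose.position.z = grasp_point.z;
    pregrasp.pose.orientation.w = 1.0;                // identity orientation (assumed)

    // Grasp pose: identical except for the y-coordinate.
    geometry_msgs::PoseStamped grasp = pregrasp;
    grasp.pose.position.y = grasp_point.y;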


How does it work

The model we used

Installation

The following steps need to be done on the robot.
To compile this stack from source you need to check out the code of the stack from the SVN link stated above, add the source directory to your ROS_PACKAGE_PATH (e.g. in ~/.bashrc; see the example below), source that file, and run:

  • rosmake chair_grasping
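
The ~/.bashrc entry might look like this (the checkout path is hypothetical):

  •  export ROS_PACKAGE_PATH=~/ros_workspace/chair_grasping:$ROS_PACKAGE_PATH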

Start Setting

First of all we have to place an office chair, or a chair equivalent to the model, in front of the robot in such a way that the back of the chair is parallel to the y-axis of the robot with respect to /torso_lift_link (the simplest case).
Alternatively, we could replace the model so that we can recognize another object. To replace the model please refer to section 3.3.3.

Launching some basic controller and set basic settings

Basic Controller

Execute the following commands on the PR2:

  •  roslaunch pr2_teleop_general pr2_teleop_general_joystick_bodyhead_only.launch
     roslaunch openni_camera openni_node.launch

To execute the following command you have to install the simple_robot_control package.

  • roslaunch simple_robot_control simple_robot_control_without_collision_checking.launch

If you are interested in a visualization, you can also start rviz on the base station. See below for which clouds should be visualized.

Basic Settings

With the PR2 controller we raise the robot's torso so that its shoulders are higher than the chair. This is not strictly necessary, but if you do so, the simple_robot_control action server will get a better kinematic result.
The head of the robot should be tilted until the Kinect captures the chair. The best way to supervise this is to check in rviz.

Object Recognition

Execute the following command in the chair_recognition directory on the PR2 to launch the object recognition:

  •  roslaunch chair_recognition.launch

You have to make sure that the model of the chair exists. To do so, look at the launch file and check the following line:

  •  <param name="object_name" value="$(find chair_recognition)/office_chair_model.pcd" />

For the value, specify the absolute file path of your model.
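
For example, if your model is stored elsewhere (the path below is hypothetical):

  •  <param name="object_name" value="/home/user/models/office_chair_model.pcd" />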

Estimation of the Gripper Points

To estimate the gripper points, execute the following command on the PR2 in the estimate_grasp_positions directory.

  •  rosrun estimate_grasp_positions subscriber_test

On the command line you get some output data. If the chair isn't correctly recognized, there will be no output.
The recognition can also be seen in rviz; compare the screenshot below. The colored surface is the part of the chair we consider for specifying the grasp points.

Generate the motion

With the following command, executed in the movement directory on the PR2, you generate the motion of the robot.

  •  roslaunch movement.launch

Calling the rosservice

To execute the motion we call the following rosservices:

  •  rosservice call /grasp_chair   -> grasping the chair
     rosservice call /release_chair  -> going back to the start-pose

The service /grasp_chair runs the program only once. If it is necessary to correct the orientation of the chair, call the service again until the robot grasps the chair correctly.

  •  rosservice call /grasp_chair || rosservice call /grasp_chair


Which clouds should be visualized in rviz

For best monitoring you should visualize the following clouds in rviz:

If you visualize all the clouds above, you should get something similar to the screenshot.

  • screenshot from rviz

The colored surface of the chair is the cloud generated by the passthrough filter.


Video

Watch a video of the PR2 grasping a chair on YouTube.

Limitation

Above we talked about the simplest case, in which the back of the chair is parallel to the robot.
But what happens if this is not the case?
If the chair is rotated a little, clockwise or counter-clockwise...
Up to a rotation of approximately 30 to 40 degrees, the robot is able to attempt a correction of the situation. Here we make a strong assumption: we say the robot can reach the gripper point that is closer to it.
So it grasps that point and pulls it to a new position.
Then we have to call the service again and the robot tries to grasp correctly once more.

Outlook

The problem with the correction is that it is a rather inflexible attempt, because nothing is known about the pivot point of the chair.
If this gap could be closed, we could do without this fallback. With enough information about the center of gravity and the pivot point of the chair, we would be in a position to compute an accurate correction of the orientation along the resulting arc.

Report a Bug

<<TracLink(REPO COMPONENT)>>
