Only released in EOL distros.
Package Summary
Contains a working demo of the hrl_clickable_world interface.
- Author: Kelsey Hawkins, Advisor: Prof. Charlie Kemp (Healthcare Robotics Lab at Georgia Tech)
- License: BSD
- Source: git https://code.google.com/p/gt-ros-pkg.hrl/ (branch: master)
EXPERIMENTAL
Overview
Behaviors currently available:
- move_floor - moves the base to a location on the floor
- table_approach - approaches a table for tabletop manipulation
- grasp_object - grasps an object using the pr2_grasping_behaviors overhead grasp
- place_object - places an object using the pr2_grasping_behaviors overhead place
Running the Demo
To get this demo working on your robot, you must remap most of the image and point cloud topics so that they reference your preferred perception sources. The interface assumes that your point cloud and image generally overlap. The following commands run the demo on Georgia Tech's PR2:
roslaunch pixel_2_3d run_srv.launch
roslaunch gt_pr2 gt_map_nav.launch  # launches both a global map (you must provide your own) and a navigation stack
roslaunch hrl_pr2_lib openni_kinect_polled.launch  # launches our Kinect nodes
roslaunch hrl_clickable_behaviors clickable_world_demo.launch  # the majority of the demo code is launched from here
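As a rough sketch of the kind of remapping involved, a small wrapper launch file like the one below could redirect the demo's camera inputs to your own sensor. The topic names and the launch-file path are placeholders, not the names this demo actually uses; check the launch files shipped with hrl_clickable_behaviors for the real topic names your robot needs to provide.

<!-- wrapper.launch: a minimal sketch only; all topic names below are placeholders -->
<launch>
  <!-- Remaps declared here apply to everything included afterward in this scope.
       Replace the 'from' names with the topics the demo subscribes to and the
       'to' names with the topics published by your camera. -->
  <remap from="/camera/rgb/image_color" to="/my_camera/rgb/image_color" />
  <remap from="/camera/depth_registered/points" to="/my_camera/depth_registered/points" />

  <!-- Pull in the demo launch file (path assumed to live under launch/). -->
  <include file="$(find hrl_clickable_behaviors)/launch/clickable_world_demo.launch" />
</launch>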