Only released in EOL distros:
- Author: sturm
- License: BSD
- Repository: alufr-ros-pkg
- Source: svn https://alufr-ros-pkg.googlecode.com/svn/tags/stacks/articulation/articulation-0.1.3
- Development (trunk) source: svn http://alufr-ros-pkg.googlecode.com/svn/trunk/articulation
Tutorials on Articulated Objects
- Learning Kinematic Models for Articulated Objects using a Webcam
This tutorial is a step-by-step guide that shows how to learn kinematic models of articulated objects using only a webcam and a laptop.
- Getting started with Articulation Models
This tutorial guides you step-by-step through the available tools for fitting, selecting and displaying kinematic trajectories of articulated objects. We will start with preparing a text file containing a kinematic trajectory. We will use an existing script to publish this trajectory, and use the existing model fitting and selection node to estimate a suitable model. We will then visualize this model in RVIZ.
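As a rough illustration only (the tutorial itself defines the exact file format, so the columns shown here are an assumption), such a trajectory file could list one observed pose of the tracked object part per line, e.g. position followed by a quaternion orientation:

```
# x y z qx qy qz qw   (hypothetical layout; see the tutorial for the real format)
0.50 0.10 0.80  0.0 0.0 0.0 1.0
0.50 0.12 0.80  0.0 0.0 0.0 1.0
0.50 0.14 0.80  0.0 0.0 0.0 1.0
```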
- Using the Articulation Models (Python)
In this tutorial, you will create a simple python script that calls ROS services for model fitting and selection. The script will output the estimated model class (like rotational, prismatic, etc.) and the estimated model parameters (like radius of rotation, etc.).
- Using the Articulation Model Library (C++)
This tutorial demonstrates how to use the articulation model library directly in your programs, which is more efficient than sending ROS messages or calling ROS services. A short program is presented that creates an artificial trajectory of an object rotating around a hinge, and then uses the model fitting library to recover the rotational center and radius. Finally, the sampled trajectory and the fitted model are published as ROS messages for visualization in RVIZ.
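The geometry behind this tutorial can be sketched in a few lines of plain Python (this is an illustration of the idea, not the articulation C++ API; all function names here are made up). For a noise-free hinge trajectory, the rotational center is where the perpendicular bisectors of two chords intersect, and the radius follows directly:

```python
import math

def simulate_hinge(cx, cy, radius, n=20):
    """Sample n positions of a part rotating around a hinge at (cx, cy)."""
    return [(cx + radius * math.cos(0.05 * i),
             cy + radius * math.sin(0.05 * i)) for i in range(n)]

def fit_rotational(traj):
    """Recover center and radius by intersecting perpendicular bisectors
    of two chords of the trajectory (assumes noise-free, non-collinear points)."""
    (x1, y1), (x2, y2), (x3, y3) = traj[0], traj[len(traj) // 2], traj[-1]
    # Each bisector is a line a*cx + b*cy = c of points equidistant to a chord's ends.
    a1, b1 = x2 - x1, y2 - y1
    c1 = (x2**2 - x1**2 + y2**2 - y1**2) / 2.0
    a2, b2 = x3 - x2, y3 - y2
    c2 = (x3**2 - x2**2 + y3**2 - y2**2) / 2.0
    # Solve the 2x2 linear system by Cramer's rule.
    det = a1 * b2 - a2 * b1
    cx = (c1 * b2 - c2 * b1) / det
    cy = (a1 * c2 - a2 * c1) / det
    return (cx, cy), math.hypot(x1 - cx, y1 - cy)

traj = simulate_hinge(1.0, 2.0, 0.5)
center, radius = fit_rotational(traj)
```

The library's actual fitter additionally handles noisy 6-DOF poses and degenerate cases; this sketch only shows why three well-separated samples already determine a planar rotational model.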
- Learning Kinematic Models from End-Effector Trajectories
This tutorial demonstrates the process of model fitting and model selection to real data recorded by a mobile manipulation robot operating various doors and drawers.
structure_learner
This node receives trajectories of all parts of a kinematic object in the form of articulation_msgs/ArticulatedObjectMsg messages. For every pair of object parts, it fits the motion between the two parts to all known model classes and selects the kinematic structure that maximizes the BIC (Bayesian Information Criterion). The node publishes the result again as an articulation_msgs/ArticulatedObjectMsg message, now additionally containing the individual link models.
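The selection step can be illustrated with a minimal sketch (not the structure_learner code itself). Using the common convention BIC = -2 ln L + k ln n, where the best model has the lowest score (equivalently, the highest negated score), we compare a rigid model (fixed offset) against a prismatic model (motion along a line) on a hypothetical 1-D relative displacement:

```python
import math

def gaussian_log_likelihood(residuals, sigma):
    """Log-likelihood of residuals under zero-mean Gaussian noise."""
    n = len(residuals)
    return (-n * math.log(sigma * math.sqrt(2 * math.pi))
            - sum(r * r for r in residuals) / (2 * sigma ** 2))

def bic(residuals, k, sigma=0.01):
    """BIC = -2 ln L + k ln n; lower is better, k = number of parameters."""
    n = len(residuals)
    return -2 * gaussian_log_likelihood(residuals, sigma) + k * math.log(n)

# Hypothetical observations: one part sliding out relative to another
# (a drawer), so the prismatic model should win the selection.
obs = [0.02 * i for i in range(25)]

mean = sum(obs) / len(obs)
rigid_residuals = [d - mean for d in obs]   # rigid model: 1 parameter (offset)
prismatic_residuals = [0.0] * len(obs)      # linear motion fits exactly: 2 parameters

models = {"rigid": bic(rigid_residuals, k=1),
          "prismatic": bic(prismatic_residuals, k=2)}
best = min(models, key=models.get)
```

The `k ln n` term penalizes the extra parameter of the prismatic model, but its far better fit dominates, so the richer model is still selected; for a door that never moves, the rigid model would win instead.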
- Request: pose trajectories of all parts of an articulated object, as observed by the robot. Response: the resulting kinematic model, after model fitting and structure selection.
Parameters
~sigma_position (float, default: 0.01)
- This parameter specifies the standard deviation of the assumed Gaussian noise in the Cartesian position of the observed trajectories. Unit: meters.
~sigma_orientation (float)
- This parameter specifies the standard deviation of the assumed Gaussian noise in the orientation of the observed trajectories. Unit: radians.
- This parameter specifies which model classes are considered during model fitting and model selection. See the section on model classes for a complete list of available models. See the tutorial for how to add custom model classes.
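These parameters are typically set from a launch file. A minimal sketch is shown below; the package and node names and the value of the orientation noise are assumptions for illustration:

```xml
<launch>
  <node pkg="articulation_structure" type="structure_learner" name="structure_learner">
    <!-- position noise, in meters (documented default: 0.01) -->
    <param name="sigma_position" value="0.01"/>
    <!-- orientation noise, in radians (illustrative value) -->
    <param name="sigma_orientation" value="0.17"/>
  </node>
</launch>
```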
Jürgen Sturm, Advait Jain, Cyrill Stachniss, Charlie Kemp, Wolfram Burgard. Operating Articulated Objects Based on Experience. In Proc. of the International Conference on Intelligent Robots and Systems (IROS), Anchorage, USA, 2010.
Jürgen Sturm, Kurt Konolige, Cyrill Stachniss, Wolfram Burgard. Vision-based Detection for Learning Articulation Models of Cabinet Doors and Drawers in Household Environments. In Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Anchorage, USA, 2010.
Jürgen Sturm, Cyrill Stachniss, Vijay Pradeep, Christian Plagemann, Kurt Konolige, Wolfram Burgard. Learning Kinematic Models for Articulated Objects. In Proc. of the International Joint Conference on Artificial Intelligence (IJCAI), Pasadena, USA, 2009.
More information (including videos, papers, presentations) can be found on the homepage of Jürgen Sturm.
Report a Bug
If you run into any problems, please feel free to contact Juergen Sturm <email@example.com>.