Overview

humanoid_localization provides 6D Monte Carlo localization (x, y, z, roll, pitch, yaw) specifically for humanoid robots. A distance sensor (e.g. a 2D laser or a depth camera) is used for either endpoint- or raycasting-based integration into an existing 3D OctoMap world model from octomap_server. Additionally, IMU and odometry data are fused in the particle filter. While designed for humanoid robots (tested on a Nao with a Hokuyo URG-04LX laser), the code may also be useful e.g. for UAVs.

Details are available in the corresponding publication Humanoid Robot Localization in Complex Indoor Environments by A. Hornung, K. M. Wurm, and M. Bennewitz (IROS 2010).

@INPROCEEDINGS{hornung10iros,
  author = {Armin Hornung and Kai M. Wurm and Maren Bennewitz},
  title = {Humanoid Robot Localization in Complex Indoor Environments},
  booktitle = {Proc.~of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year = 2010,
  address = {Taipei, Taiwan},
  month= {October}
}

humanoid_localization will request a 3D occupancy map from octomap_server, so octomap_server needs to be running with the map you want to localize in already loaded. For the sensor model, OctoMap is used for either raycasting or endpoint distance lookup (the latter currently requires libdynamicEDT3D from OctoMap 1.5 or later). Details about the required sensor data and tf transforms can be found below. Once the localization is running, it can initialize the particles from a given pose via the initial_pose parameter, or by using the "2D Pose Estimate" tool in RViz (z, roll, and pitch will be determined automatically). For visualization, set the fixed frame to "/map" in RViz.
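
The initialization parameters (~initial_pose and ~initial_std, documented in the Parameters section below) are usually set in a launch or YAML file. As a minimal sketch, they can also be set programmatically with rospy before the node is started; this assumes the node runs under its default name localization_node and that the example values are replaced with a pose valid in your map.

import rospy

# Sketch only: seed the particle filter with a known start pose by setting
# the documented ~initial_pose / ~initial_std parameters before launching
# the node. The node name "localization_node" and all numeric values are
# assumptions; normally these live in a launch/YAML file.
ns = '/localization_node/'

# Initial pose mean (6D)
rospy.set_param(ns + 'initial_pose/x', 1.0)
rospy.set_param(ns + 'initial_pose/y', 2.0)
rospy.set_param(ns + 'initial_pose/z', 0.3)
rospy.set_param(ns + 'initial_pose/roll', 0.0)
rospy.set_param(ns + 'initial_pose/pitch', 0.0)
rospy.set_param(ns + 'initial_pose/yaw', 1.57)

# Standard deviations of the initial Gaussian distribution
rospy.set_param(ns + 'initial_std/x', 0.1)
rospy.set_param(ns + 'initial_std/y', 0.1)
rospy.set_param(ns + 'initial_std/z', 0.02)
rospy.set_param(ns + 'initial_std/roll', 0.05)
rospy.set_param(ns + 'initial_std/pitch', 0.05)
rospy.set_param(ns + 'initial_std/yaw', 0.2)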

ROS Node API

localization_node

The main 6D Monte Carlo localization node. Only the most important topics and parameters are documented here; check the source code and YAML files for more details. The code is partly parallelized with OpenMP and will use as many threads as there are processor cores available. You can explicitly set the number of threads with the environment variable OMP_NUM_THREADS.
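
OMP_NUM_THREADS is typically exported in the shell or set via an <env> tag in a launch file. The Python sketch below shows the same effect when starting the node from a wrapper script; it assumes the executable is named like the node (localization_node).

import os
import subprocess

# Sketch only: limit the OpenMP thread count for the localization node.
# Normally this is done via "export OMP_NUM_THREADS=2" in the shell or an
# <env> tag in the launch file; the executable name is an assumption.
env = dict(os.environ, OMP_NUM_THREADS='2')
subprocess.call(['rosrun', 'humanoid_localization', 'localization_node'], env=env)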

Subscribed Topics

scan (sensor_msgs/LaserScan)
  • 2D laser scan to be integrated
point_cloud (sensor_msgs/PointCloud2)
  • 3D point cloud to be integrated
imu (sensor_msgs/Imu)
  • IMU sensor message (only roll and pitch are used, and only if the parameter ~use_imu is enabled)
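
When ~use_imu is enabled, only the orientation field of the Imu message matters (and of it, only roll and pitch). The rospy sketch below publishes such a message on the imu topic; the frame name, publishing rate, and attitude values are placeholders for a real attitude estimator, and the topic may need remapping to match your setup.

import rospy
from sensor_msgs.msg import Imu
from tf.transformations import quaternion_from_euler

# Sketch only: publish a sensor_msgs/Imu with just the orientation filled
# in, as consumed by the filter when ~use_imu is true (yaw is ignored).
rospy.init_node('attitude_publisher')
pub = rospy.Publisher('imu', Imu, queue_size=10)

rate = rospy.Rate(50)
while not rospy.is_shutdown():
    msg = Imu()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'torso'  # assumed IMU frame
    # Roll and pitch would come from your attitude estimator
    q = quaternion_from_euler(0.01, -0.02, 0.0)
    msg.orientation.x, msg.orientation.y, msg.orientation.z, msg.orientation.w = q
    pub.publish(msg)
    rate.sleep()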

Published Topics

pose (geometry_msgs/PoseWithCovarianceStamped)
  • Localized particle pose, published after each laser / point cloud callback (see the subscriber sketch after this list)
pose_eval (geometry_msgs/PoseWithCovarianceStamped)
  • Localized particle pose, published only when a sensor integration happened in the laser / point cloud callback
~particlecloud (geometry_msgs/PoseArray)
  • The complete particle distribution
~best_particle (geometry_msgs/PoseArray)
  • The best particle (highest weight or averaged, depending on the parameter ~best_particle_as_mean)
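
As a small usage sketch, any node can consume the localization result by subscribing to the pose topic documented above; the callback below simply prints the mean position. Topic name and message type are as listed; the node name is arbitrary.

import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped

# Sketch only: print the mean position of the localized pose published on
# the "pose" topic (geometry_msgs/PoseWithCovarianceStamped).
def pose_cb(msg):
    p = msg.pose.pose.position
    rospy.loginfo("Localized at x=%.2f y=%.2f z=%.2f (frame %s)",
                  p.x, p.y, p.z, msg.header.frame_id)

rospy.init_node('pose_listener')
rospy.Subscriber('pose', PoseWithCovarianceStamped, pose_cb)
rospy.spin()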

Parameters

~odom_frame_id (string, default: odom)
  • Odometry frame
~base_frame_id (string, default: torso)
  • Robot base frame
~global_frame_id (string, default: /map)
  • Static global map frame for the provided tf transform
~init_global (bool, default: false)
  • Whether to perform global localization
~num_particles (int, default: 500)
  • Number of particles to use
~neff_factor (double, default: 1.0)
  • Factor to determine resampling based on the number of effective particles, between 0 (never resample) and 1 (always resample).
~num_sensor_beams (int, default: 48)
  • Number of sensor beams to use; the laser scan or point cloud will be subsampled to this number.
~use_raycasting (bool, default: true)
  • Whether to use the "raycasting" or the "endpoint" sensor model. The former is more time-consuming (albeit parallelized), while the latter is only an approximation.
~use_imu (bool, default: false)
  • Whether to use the IMU angles (only roll and pitch) in the particle filter.
~update_min_trans (double, default: 0.2)
  • Translational movement required before performing a filter update (in m)
~update_min_rot (double, default: π/6.0)
  • Rotational movement required before performing a filter update (in rad)
~transform_tolerance (double, default: 0.1)
  • Time in sec. with which to post-date the transform that is published, to indicate that this transform is valid into the future.
~initial_pose/{x/y/z/roll/pitch/yaw} (double)
  • Mean of the initial 6D pose, used to initialize the filter with a Gaussian distribution.
~initial_std/{x/y/z/roll/pitch/yaw} (double)
  • Standard deviation of each dimension of the initial 6D pose, used to initialize the filter with a Gaussian distribution.

Required tf Transforms

odom → torso
  • Required transform of the robot's base link ("torso" for humanoids, changeable with parameter base_frame_id) in the odometry frame (changeable with parameter odom_frame_id).
torso → sensor data frame
  • Required transform from the robot's base link ("torso" for humanoids, changeable with parameter base_frame_id) to the sensor frame.
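
In a typical setup the transform from the base frame to the sensor frame is provided by robot_state_publisher (from the robot model) or by a static_transform_publisher. The rospy sketch below is a minimal equivalent; the child frame name "laser" and the mounting offset are placeholders.

import rospy
import tf
from tf.transformations import quaternion_from_euler

# Sketch only: broadcast a fixed torso -> sensor transform. The frame name
# "laser" and the offset/orientation are assumptions; normally this comes
# from robot_state_publisher or static_transform_publisher.
rospy.init_node('torso_to_laser_broadcaster')
br = tf.TransformBroadcaster()
rate = rospy.Rate(50)
while not rospy.is_shutdown():
    br.sendTransform((0.0, 0.0, 0.15),                     # translation torso -> laser (m)
                     quaternion_from_euler(0.0, 0.0, 0.0),  # rotation as quaternion
                     rospy.Time.now(),
                     'laser',   # child frame (sensor data frame)
                     'torso')   # parent frame (base frame)
    rate.sleep()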

Provided tf Transforms

/map → odom
  • Localization publishes the transform from map (global frame, changeable with parameter global_frame_id) to odom (odometry frame, changeable with parameter odom_frame_id).
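
Because only map → odom is published, the full localized pose of the robot is obtained by chaining this transform with odom → torso through tf. A minimal listener sketch using the default frame names:

import rospy
import tf

# Sketch only: look up the localized base pose by chaining /map -> odom
# (provided by the localization) with odom -> torso (provided by odometry).
rospy.init_node('localized_pose_lookup')
listener = tf.TransformListener()
rate = rospy.Rate(10)
while not rospy.is_shutdown():
    try:
        (trans, rot) = listener.lookupTransform('/map', 'torso', rospy.Time(0))
        rospy.loginfo("Base in map frame: x=%.2f y=%.2f z=%.2f", *trans)
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        pass  # transform not available yet
    rate.sleep()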
