This package contains a single node, viodom_node, which estimates robot motion from incoming raw images and IMU measurements from the Visual-Inertial (VI-) Sensor. Before it can estimate motion, the node waits a few seconds to initialize an IMU filter.

Tf tree

The transforms tree (following REP 105) is as follows:

  • odom → base_link → camera

Visual odometry algorithms generally estimate camera motion. To derive robot motion from camera motion, the transformation from the camera frame to the robot frame must be known. This implementation therefore needs the tf base_link → camera in order to publish odom → base_link. The node currently uses default values taken from the sensor setup on the AscTec Neo Research platform.
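The frame composition above can be sketched with homogeneous transforms: given the camera pose in the odom frame (from visual odometry) and the known base_link → camera extrinsic, the robot pose follows by matrix multiplication. A minimal numpy sketch, with made-up placeholder translations rather than the node's actual default values:

```python
import numpy as np

def make_tf(translation, rotation=np.eye(3)):
    """Build a 4x4 homogeneous transform from rotation and translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Known extrinsic: camera pose expressed in the base_link frame
# (placeholder values, identity rotation for simplicity).
T_base_camera = make_tf([0.1, 0.0, 0.2])

# Visual odometry output: camera pose expressed in the odom frame.
T_odom_camera = make_tf([1.1, 0.0, 0.2])

# Compose: odom → base_link = (odom → camera) * (camera → base_link)
T_odom_base = T_odom_camera @ np.linalg.inv(T_base_camera)

print(T_odom_base[:3, 3])  # robot translation in the odom frame
```

In the running node the same composition is handled by the tf library; a real camera extrinsic would also include a rotation (camera optical frames are conventionally z-forward), which the identity-rotation placeholder above omits.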



If you use viodom in an academic context, please cite the following publication: http://ieeexplore.ieee.org/document/7502653/

  @inproceedings{perezgrau2016viodom,
    author={F. J. Perez-Grau and F. R. Fabresse and F. Caballero and A. Viguria and A. Ollero},
    booktitle={2016 International Conference on Unmanned Aircraft Systems (ICUAS)},
    title={Long-term aerial robot localization based on visual odometry and radio-based ranging},
    year={2016},
  }

Wiki: viodom (last edited 2016-10-31 00:11:33 by FranciscoJPerezGrau)