Note: We recommend having a look at the sFly Project, this PhD Thesis, and the ROS stacks ethzasl_ptam and asctec_mav_framework. This tutorial assumes that you have completed the previous tutorials: custom_sensor_design, remote_ptam, asctec_hl_interface.
(!) Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.

Tutorial on how to set up the sFly framework

Description: This tutorial shows how to set up the sFly framework in order to have an MAV navigate purely vision-based (a single camera and an IMU as the only sensors). Even though you can implement this framework on any MAV, it is easiest to use on an AscTec platform.

Tutorial Level: ADVANCED

Overview and Preparation

This tutorial is specifically for users of Ascending Technologies platforms. However, you may also find inspiration here for getting your own platform to fly purely vision-based using the stacks ethzasl_ptam and ethzasl_sensor_fusion.

In the European project sFly, we used the setup described in this tutorial to make an AscTec Firefly hexacopter fly autonomously and purely vision-based (i.e. using only one camera and an IMU as sensors). The background of the algorithms used here is best described in this PhD Thesis: Chapter 2.1 for the real-time visual pose computation (ethzasl_ptam), Chapter 3.1 for the EKF setup (ethzasl_sensor_fusion), and Chapters 4.2 and 4.3.1 for the implementation and real-world results.

For this tutorial, we assume that you are familiar with the asctec_mav_framework and its asctec_hl_interface described in this tutorial. Furthermore, we assume you have an AscTec platform flashed with the latest firmware. Also, you need to be familiar with ethzasl_ptam and ethzasl_sensor_fusion.

Since this tutorial needs the MAV (i.e. the hardware) to be in the loop, we cannot provide any datasets.

This stack is experimental - be extremely careful during all experiments with the real hardware. Your platform and infrastructure might get damaged, and people near the running platform might get harmed, in case of a system failure.

Setting up the Hardware

Platform: In principle, any AscTec platform can be used for this tutorial (Hummingbird, Pelican, Firefly).

Computation power: Even though ethzasl_ptam runs at 20 Hz on an ATOM 1.6 GHz single-core computer, we suggest using a computer board with more computation power for the first experiments. In general, the faster ethzasl_ptam runs, the more robust it is, because of the larger image overlap, the more accurate motion model, and the faster key-frame creation (the framerate on an ATOM 1.6 GHz single-core computer drops to 13 Hz if a key-frame is created on each new camera frame).
We suggest using a Mastermind Core2Duo 1.86 GHz dual-core computer. On this architecture, ethzasl_ptam uses about 60% of one core, leaving sufficient computation power for other algorithms.

Camera: We strongly suggest using a global-shutter monochrome camera. We used the mvBlueFOX-MLC200wG with the MV-O-SMOUNT 2.8 BM2820 lens, which has a focal length of 2.8 mm. You can find these parts at MatrixVision. In our setup, we mounted the camera facing downward. Other configurations are possible, bearing in mind that the scene the camera looks at needs to be feature-rich at all times. Calibrate your camera using the cameracalibrator (see the sketch below).
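
As an illustration, a minimal launch sketch for starting the calibration; the node type (a cameracalibrator in the ptam package of ethzasl_ptam) and the camera topic /camera/image_raw are assumptions you should adapt to your installation:

    <launch>
      <!-- sketch only: node type and camera topic are assumptions -->
      <node name="cameracalibrator" pkg="ptam" type="cameracalibrator" output="screen">
        <!-- point the calibrator at your camera driver's image topic -->
        <remap from="image" to="/camera/image_raw"/>
      </node>
    </launch>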

Monitor Station: Even though the framework presented in this tutorial runs completely onboard the MAV, we suggest having a separate ROS-ready PC where you can monitor the health of the system using the remote tools described in this tutorial and the plotting scripts of ethzasl_sensor_fusion.

Setting up the Software

Flash the AscTec platform with the latest firmware.

Install ethzasl_ptam, ethzasl_sensor_fusion and asctec_mav_framework on your onboard computer (and on the monitoring PC for visualization). In the sensor-fusion stack, you may use the provided pose_sensor module.

Parameters for the Vision Part

You may use the default parameters provided for ethzasl_ptam if you use the suggested hardware. Make sure that gui=False. If you use other hardware, make sure that:

  • the image resolution is set correctly,
  • the maximum number of retained key-frames (KFs) suits the computation power of the mounted computer. We suggest 5 KFs for the mentioned ATOM board and 15 KFs for the Core2Duo board.

Make sure you remap the image topic correctly to your camera node's output image, as in the sketch below.
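
As an illustration, a minimal launch sketch for ethzasl_ptam might look as follows; the camera topic name and the name of the default parameter file are assumptions to check against your installation:

    <launch>
      <node name="ptam" pkg="ptam" type="ptam" clear_params="true" output="screen">
        <!-- assumption: remap PTAM's image input to your camera driver's output topic -->
        <remap from="image" to="/camera/image_raw"/>
        <!-- default parameters shipped with ethzasl_ptam (assumed file name); set the
             image resolution and the maximum number of key-frames in there -->
        <rosparam file="$(find ptam)/PtamFixParams.yaml"/>
        <param name="gui" value="False"/>
      </node>
    </launch>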

Parameters for the Sensor-Fusion Part

For the pose_sensor module, make sure you remap all topics correctly (see the sample rxgraph below). Furthermore, when using ethzasl_ptam as the update sensor, you need to set the following parameters correctly (a launch sketch follows this list):

  • measurement_world_sensor = False // PTAM yields the world pose w.r.t. the camera pose

  • use_fixed_covariance = False // we use the covariance provided in the PTAM pose message

  • data_playback = False // this is only for replaying bag files

  • init/q_ci and init/p_ci must hold an initial value for the transformation of the camera w.r.t. the IMU (hand-measured values are sufficient)

  • the provided IMU noise values are for an AscTec Firefly platform. You may increase the noise values for a Pelican (or Hummingbird) platform. If the framework is malfunctioning, leave these values as they are and do not tune them - the error is most likely elsewhere.

  • you may need to adapt the delay parameter slightly for better performance (this is the time from capturing the image until its time-stamping on the computer board, i.e. before the image is processed in ethzasl_ptam)

  • do not change the other parameters
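
As an illustration, a launch sketch for the pose_sensor module could look as follows. The node type and the topic names are assumptions following the conventions of ethzasl_sensor_fusion and asctec_hl_interface; verify them against the sample rxgraph below:

    <launch>
      <node name="pose_sensor" pkg="ssf_updates" type="pose_sensor" clear_params="true" output="screen">
        <!-- assumed topic names; check them against the sample rxgraph -->
        <remap from="ssf_core/imu_state_input" to="fcu/imu_custom"/>
        <remap from="ssf_core/hl_state_input" to="fcu/ekf_state_out"/>
        <remap from="ssf_core/correction" to="fcu/ekf_state_in"/>
        <remap from="ssf_core/pose_measurement" to="vslam/pose"/>
        <param name="measurement_world_sensor" value="False"/> <!-- PTAM: world w.r.t. camera -->
        <param name="use_fixed_covariance" value="False"/>     <!-- use PTAM's covariance -->
        <param name="data_playback" value="False"/>
        <!-- hand-measured camera-to-IMU transformation as initial guess
             (placeholder values: identity rotation, zero translation) -->
        <rosparam param="init/q_ci">[0.0, 0.0, 0.0, 1.0]</rosparam>
        <rosparam param="init/p_ci">[0.0, 0.0, 0.0]</rosparam>
      </node>
    </launch>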

Parameters for the AscTec HighLevel (HL) Interface

As described in this paper, we perform the EKF prediction step on the HL controller of the MAV and do the position control based on this pose estimate. Thus, we have to set the asctec_hl_interface to the correct mode. You may download this sample file to use the suggested parameters (a sketch follows this list):

  • packet_rate_ekf_state determines how fast the predicted states are sent from the HL to the onboard computer; this rate should be as high as possible, and in all cases faster than the update sensor.

  • too many packets sent via the serial port from the HL to the onboard computer will jam the connection. Use low rates on all other packets in order to increase packet_rate_ekf_state. The values in the sample file are close to the maximum number of packets allowed without jamming the channel.

  • state_estimation = HighLevel_EKF and position_control = HighLevel are mandatory settings to use the state propagation built into the MAV's ARM7 high-level controller (i.e. external to the onboard computer).

  • battery_warning needs to be adjusted to your battery type. The value is in volts.
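
As an illustration, the suggested settings could be written as follows; the node name/type, the additional packet-rate parameters, and all numeric values are assumptions modeled on the sample file, not verified numbers:

    <launch>
      <node name="fcu" pkg="asctec_hl_interface" type="hl_node" output="screen">
        <!-- mandatory modes for prediction and control on the HL controller -->
        <param name="state_estimation" value="HighLevel_EKF"/>
        <param name="position_control" value="HighLevel"/>
        <!-- as high as possible, and always faster than the update sensor -->
        <param name="packet_rate_ekf_state" value="100"/>
        <!-- assumed additional rates: keep them low so the serial link does not jam -->
        <param name="packet_rate_imu" value="50"/>
        <param name="packet_rate_rc" value="2"/>
        <param name="packet_rate_gps" value="2"/>
        <param name="packet_rate_ssdk_debug" value="5"/>
        <!-- adjust to your battery type; value in volts (placeholder for a 3S LiPo) -->
        <param name="battery_warning" value="10.5"/>
      </node>
    </launch>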

Sample rxgraph of the fully connected sFly framework using external state prediction (i.e. prediction on the ARM7 high-level controller of the MAV) and the update-sensor module ethzasl_ptam.

Launching the System

Before flying, we strongly suggest that you first do hand-held experiments.

Pre-Flight Tests

Write all the settings and message remappings discussed above into parameter files and launch files, respectively. Launch the system such that an rxgraph as depicted above appears. You may do this via ssh remote login to the onboard computer, or by writing the launch files accordingly and launching them from the monitoring PC. Do not start the MAV motors yet.
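
As a sketch, the top-level launch file on the onboard computer could simply include the pieces discussed above; the three file names are hypothetical placeholders for the launch files you wrote:

    <launch>
      <!-- hypothetical file names standing for the launch files written above -->
      <include file="$(find asctec_hl_interface)/launch/fcu.launch"/>
      <include file="$(find ptam)/launch/ptam.launch"/>
      <include file="$(find ssf_updates)/launch/pose_sensor.launch"/>
    </launch>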

On the monitoring PC:

  • Launch a dynamic reconfigure GUI and check the parameters of ethzasl_ptam, ethzasl_sensor_fusion and asctec_mav_framework (a launch sketch for the monitoring PC follows this list).

  • Launch the remote_ptam node on the monitoring PC. Verify that you see the image. Initialize ethzasl_ptam and verify that the created map is robust under some excitation.

  • Launch some monitoring plots - at least rosrun ssf_core plot_relevant.

  • Switch to /pose_sensor in the reconfigure GUI. Check the height ethzasl_ptam is measuring and the metric height at which you are holding the MAV. Set the scale_init value to λ = height_ptam / height_metric (e.g. if PTAM measures a height of 2.0 units while you hold the MAV 1.0 m above the ground, λ = 2.0). λ needs to be initialized very precisely, within about +/- 5% of its true value; otherwise the filter might end up in a local minimum.

  • Hit init_filter and start exciting the MAV such that the states converge. It is a good idea to keep the MAV in a small area in order to prevent new key-frames, and to excite the MAV in accelerations and angular velocities to make the states converge.

  • Check the monitoring plots and the map grid while exciting the MAV.
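
A minimal launch sketch for the monitoring PC, assuming remote_ptam is provided as a node of the ptam package and that the reconfigure_gui of the dynamic_reconfigure package of that ROS generation is used; run it with ROS_MASTER_URI pointing at the onboard computer:

    <launch>
      <!-- both node types are assumptions; adapt to your installation -->
      <node name="remote_ptam" pkg="ptam" type="remote_ptam" output="screen"/>
      <node name="reconfigure_gui" pkg="dynamic_reconfigure" type="reconfigure_gui"/>
    </launch>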

Once the states have converged, you can move around freely and see how the framework behaves in different situations.

Flight Tests

Always be careful when working with a running/flying MAV.
Always have a well-trained safety pilot who can take over the MAV with the RC at any moment.
Never fly over or close to people or infrastructure that could be damaged.

The initialization procedure is the same as described above. Extremely well-trained pilots may take off manually while a second person performs the initialization on the monitoring PC. Once the states have converged, the safety pilot may switch from manual control to autonomous flight on the RC. In this mode, first check the hovering performance; then the RC can be used to set body velocities as described in this tutorial.

Failure Modes

When working with vision as the only sensor keeping the MAV airborne, many things can and will go wrong. The safety pilot must be prepared at all times to take over the MAV in any situation - no matter how robust the framework may look at a given moment.

In the following, we list some things that can happen suddenly and may break the framework. When ethzasl_ptam loses the map, the MAV will slowly drift because of the IMU integration. If ethzasl_ptam mismatches the map, anything can happen, since the framework receives seemingly "valid" visual pose updates.

Changing Lighting Conditions

Even though humans may not notice it, on a foggy or partly cloudy day the sun can suddenly "shine" stronger or weaker. This causes over- or under-saturation of the image. Or, if you use adaptive shutter speeds, the MAV might fly over a bright spot (e.g. reflecting metal) and the whole image turns black because of this one bright spot. For this reason, we always used fixed shutter speeds (or adaptive ones within very small limits) and closely observed the camera image on the monitoring PC.

Low Texture Areas

When the sun shines, our framework runs well over grass (because of the small shadows the grass casts on itself). If the sun fades, the grass becomes highly uniform and all features are suddenly lost.
The same happens when flying over a cluttered area and then suddenly crossing a broad paved street at high altitude.

Moving World

The inertial measurements are taken with respect to a world-fixed reference frame. The EKF uses these inertial readings to estimate - among other states - the visual scale factor (in very simple terms: integrating the accelerometers twice yields a metric position, which can be compared to the non-metric visual position). If the world frame moves (e.g. the map is based on features on a moving car roof), the inertial readings relative to a world-fixed frame contradict the car's movement and lead to wrong scale estimates. Hence, the map features need to be static.
