Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.
All of the deprecated tutorials in one place.
Description: All of the text from the deprecated tutorials.
Working with video
To play a single video, use stereo_playback.launch. This will run rosplay with the arguments in the environment variable PBARGS and display the topic with name defined by PBTOPIC. Example invocation:
setvar PBARGS /u/prdata/person_data/2009-08-13b/narrow_stereo__2009-08-13-13-25-26-topic.bag
setvar PBTOPIC /narrow_stereo/raw_stereo_muxed
roslaunch stereo_playback.launch
Here "setvar" is the following bash function, which is nice because it allows you to tab complete the filename, or to use quotes around the variables value to pass multiple arguments to rosplay:
function setvar {
    if [[ "$3" != "" ]]; then
        echo "setvar only takes 2 arguments"
    else
        export $1=""
        setvarprepend=""
        for arg in $2; do
            export $1+="${setvarprepend}"
            export $1+="$arg"
            setvarprepend=" "
        done
    fi
}
To play multiple videos in parallel, use scripts/stereo_multi_playback.py. Here is its usage statement:
Reads in command line arguments of the form [bag file] [optional arguments to rosplay] [topic to play from that bag], repeated for as many different channels of video as you want to play at once. (The optional arguments may appear before the bag file in the case of the first channel, and in all cases may also appear between the bag file and the topic to play. They are identified by beginning with a -; anything not beginning with a - is assumed to be a bag file or a topic.) This obviously assumes that all of your separate topics are in separate bags; that's how mine are recorded, since the two cameras are hooked to different machines. Note: all rosplay arguments will get aggregated into the same call to rosplay.
Example invocation:
scripts/stereo_multi_playback.py /u/prdata/person_data/2009-08-11b/wide_stereo__2009-08-11-11-28-27-topic.bag /wide_stereo/raw_stereo_muxed /u/prdata/person_data/2009-08-11b/narrow_stereo__2009-08-11-11-28-29-topic.bag -r 0.5 /narrow_stereo/raw_stereo_muxed
(Here the -r 0.5 tells rosplay to play the video at half speed, which makes it less likely that playback performance issues will artificially desynchronize the videos.)
Checking Bag Data
To view the images and laser scans, run person_data/launch/check_bag.launch
- Change the name of your bag file in the launch file.
- You should see:
  - Left and right narrow stereo images, rectified.
  - Left and right wide stereo images, rectified and in color.
  - Rviz with a tilting scan, a base laser, and a map (you may need to add these displays and use map as the fixed frame).
Also make sure that your bag contains the messages /laser_tilt_controller/laser_scanner_signal and /mechanism_state.
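As a quick sanity check: this tutorial predates the rosbag command-line tool, but on later ROS releases that ship it, something like the following works (the bag name here is only a placeholder):
rosbag info your_recording.bag | grep -E "laser_scanner_signal|mechanism_state"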
DEPRECATED
1. Use scripts/plot_bags.py to make sure that you aren't having trouble with one set of sensors dropping out.
2. Use stereo_playback.launch to watch each of the videos individually and make sure that the exposure settings, etc. are sane. You may need to branch prX.launch and make a prX-person.launch that uses more suitable parameters to dcam. (For instance, for a while narrow_stereo was set to a very low exposure value for use with the terminator light)
3. Use scripts/stereo_multi_playback.py to watch the videos simultaneously. You may need to use -r 0.5 to make sure your machine can keep up. Make sure that the videos are synchronized. High system load can cause the videos to become unsynchronized if the timestamps start to flow nonlinearly on one machine or the other. It is worth recording videos of people clapping or other scenarios where objects collide instantaneously so that you can verify that the collisions happen at the same time in each video sequence.
4. Use rviz to look at the tilt laser scans and the odom_combined location and make sure they look good.
Running Tests
1. If necessary, follow the instructions in http://www.ros.org/wiki/person_data/Tutorials/collect_data to set all of the config files, environment variables, etc., since the tests require the same configuration as the regular data collection.
2. Bring down the core if you have one up. Nothing ROS-related should be running. The tests will make sure they can bring up and bring down the core just as if they were a human user trying to do an end-to-end data collection run.
3. Run scripts/test.sh from machine 2. It will print out a list of failed tests if any, and its process return value will be 0 if and only if all tests pass.
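Because the return value is meaningful, test.sh can be wrapped in other scripts; a minimal sketch:
# run the test suite and act on its exit status
if scripts/test.sh; then
    echo "all tests passed"
else
    echo "some tests failed" >&2
    exit 1
fi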
Currently existing tests:
test_components/test_joystick.sh: exits with status 0 if and only if the tests pass. Brings up the core, brings up prX.launch, runs scripts/collect_data.sh. Checks that all the correct bag files were created and are growing. Simulates the "stop recording" button. Checks that the bags have stopped growing. Simulates the "start recording" button. (Hopefully before my last day I will add the following: checks that each of the bags has started growing again.) Brings down the entire setup. (Hopefully before my last day I will add the following: all .active bags turned into .bag files. All bag files contain the correct topics.)
Using collect_data.sh
New Version
This tutorial is on using collect_data.sh to gather data for the person detection dataset.
Currently, collect_data.sh assumes you are using the two-machine version of PR2 introduced in late summer '09.
collect_data.sh records narrow and wide stereo video, TF frames (including localization), mechanism state, odometry, base scan, tilting laser and the tilting laser scanner signal.
This tutorial assumes that you have set up your .bashrc on the robot to include:
export ROBOT=prX
export ROS_MASTER_URI=http://prX1:11311
Compile offboard
- stereo_image_proc
- rviz
- image_view
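For example, on the offboard PC (a sketch; it assumes these packages are already checked out on your ROS_PACKAGE_PATH):
rosmake stereo_image_proc rviz image_view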
Compile onboard
To record
- roslaunch pre.launch
- Set a unique folder name in person_data/config/machine2bagdir.sh, of the form <location>-<mm>-<dd>-<yyyy>-s<#>.
- person_data/scripts/collect_data.sh
- person_data/bin/joy_record
- Localize on the map
  - Get 30 seconds of stationary scene for future calibration.
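Gathered into one hedged sequence (run the launch file and each script in its own terminal on the robot; exact invocation details may differ on your setup):
roslaunch pre.launch                              # terminal 1: bring up the robot
roscd person_data && scripts/collect_data.sh      # terminal 2: tuck arms, start the tilt laser, set up recording
roscd person_data && bin/joy_record               # terminal 3: watch the joystick for start/stop recording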
Push and hold the bottom-most left shoulder button to enable recording controls. Then the circle button starts recording and the square button stops recording. You should see console output from the joy_record node indicating the recording state changes.
The resulting bag file will go in person_data and be called person_data.bag.
After recording
Copy data from the robot to /wg/osx/person_data
The map is an image whose location can be found in <your_bag_location>/map_server.xml. Copy the image to <your_bag_location> and edit <your_bag_location>/map_server.xml to include the new image location.
Topics
The following topics should be recorded:
- /base_scan
- /tf
- /narrow_stereo/left/image_raw
- /narrow_stereo/left/camera_info
- /narrow_stereo/right/image_raw
- /narrow_stereo/right/camera_info
- /wide_stereo/left/image_raw
- /wide_stereo/left/camera_info
- /wide_stereo/right/image_raw
- /wide_stereo/right/camera_info
- /tilt_scan
- /laser_tilt_controller/laser_scanner_signal
- /mechanism_state
The map is an image which can be played back using <your_bag_location>/map_server.xml.
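If map_server.xml is itself a complete roslaunch file (which its use here suggests, though I haven't verified it), serving the map during playback could look like:
roslaunch <your_bag_location>/map_server.xml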
TO DO
- Align stereo point cloud and the laser point cloud manually for now. Perhaps collect data so that they can be properly calibrated later.
- Calibration check for the stereo? Need to rewrite the camera_offsetter to work on point clouds and rewrite the TF frames for the cameras directly.
- Extract laser snapshots corresponding to images (last x seconds corresponding to an image time stamp). (To find x, look at pr2_mechanism_controllers/scripts/send_laser_traj_cmd_ms2.py.)
- Scripts for evaluating results.
Instructions for data collectors
The robot operator will do the following:
- Bring up the robot as described HERE (TODO)
- Use the joystick to drive the robot as follows
- If there are no people in the current room, the robot operator will drive to another room / hallway until people are visible (the operator should attempt to cover as much of the building as possible during the course of their recording session)
- If there are people in the location:
- The operator should attempt to turn on recording before they become visible to the robot
- The operator should still turn on recording to capture fast-moving people even if they are unable to turn on recording before the person is visible to the robot
- The operator will be instructed to do one of the following before beginning their session:
- Keep the robot stationary once a person is in the field of view
- Rotate the robot to track the person, but keep the base stationary
- Follow the person with the robot, rotating it to keep the person in view and translating the base to follow their path
- Continue immediately to the next room without reacting to the presence of people in any way
- The operator should photograph a specific person for no more than 30 seconds at a time (if the person is seated and stationary, as little as 10 seconds is acceptable)
- A person who is seated should only be photographed once in a specific location (i.e. do not photograph a person upon entering a room if they are seated at the same console as they were the last time you were in the room)
- Once everyone in a room has been photographed, the operator shall cease recording and drive the robot to the next location, unless they have reason to believe that more people will soon be entering the location, in which case they may remain for up to 3 minutes.
Using Mech Turk to Generate Labels
Information about the (internal) process of setting up a Mechanical Turk interface for labeling can be found here.
Converting to standard formats
Converting uint8 images to png: person_data/launch/save_images.launch
Runs: person_data/scripts/save_images.py
Will save all of the images as pngs in <bag_dir>/<topic_with_underscores>/png/<time>.png
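A hedged invocation, assuming the launch file is registered in the person_data package as the path above suggests:
roslaunch person_data save_images.launch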
Old Version
This tutorial is on using collect_data.sh to gather data.
Currently, collect_data.sh assumes you are using the two-machine version of PR2 introduced in late summer '09.
collect_data.sh records narrow and wide stereo video, TF frames (including localization), mechanism state, odometry, base scan, and tilting laser.
This tutorial assumes that you have set up your .bashrc on the robot to include:
export ROBOT=prX
export ROS_MASTER_URI=http://prX2:11311
where X=e,f, or g
1. On the robot, run "rosmake person_data"
DEPRECATED: 2. On the robot, run
roscd person_data; bash make_runtime_deps.sh
(After you have run this once, you will be able to run this as ./make_runtime_deps.sh instead)
3. On the PC, run "rosmake rviz" if it has not been built already. Be sure that the version of rviz on your PC is compatible with your version of the nav stack on the robot.
4. Possibly set up config/headpan.txt and config/headtilt.txt. They each contain a single number with no endline. They control the pan and tilt that the head will be locked to during data collection. Currently both are set to 0, and it is unlikely that you will need to change this. It is important not to have an endline because their contents are written out as command line arguments using cat.
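A safe way to write those files without a trailing newline (printf, unlike echo, adds nothing you don't ask for):
printf '0' > config/headpan.txt
printf '0' > config/headtilt.txt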
5. Possibly set up config/machine2bagdir.sh. It is a script that prints out the location where bag files should be stored. This file is a script because filenames using "~" were not working with rosrecord for unknown reasons. Currently it returns /pr/2/whoami/bags. This assumes that you are using the same username you will use on pre during data collection, and that all the home folders are in /pr/2/. It would be possible to use pushd ~; pwd; popd to get the home folder if we quit using the /pr/2 directory to store home folders, but this assumes that machine2bagdir.sh is run on the robot rather than on a PC. NOTE: eventually PRE will support two removable drives, and we will want to modify the data collector script to write to more than one drive. I'm not certain whether each machine will only have access to one of those drives. When these drives become available, the mapping of recorded topics to bags will need to change, either based on which machine can access which drive, or based on load balancing across the drives as well as possible. For now, though, Eric says we should just record to the home directory on machine 2.
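For reference, a hypothetical sketch of such a script (assuming the whoami in that path stands for the current username; the real config/machine2bagdir.sh may differ in detail):
#!/bin/bash
# Print the directory where bag files should be written on machine 2.
echo /pr/2/$(whoami)/bags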
6. Bring up a core on prX2.
7. Bring up prX.launch. Check the dashboard for error messages. In particular, NTP offset errors/warnings will ruin the tilt laser data.
a) Temporary: Run the cameras as in BlaiseGassend/WGE100StereoCamera
8. Run scripts/collect_data.sh. The arms should tuck and the tilt laser should start tilting. If the arms collide and oppose each other, use Ctrl-C to kill the script and then run it again.
9. The robot should now be recording (but don't drive off yet, you still need to localize it). The only way to monitor whether it is recording is to use du in the bags directory. It's a good idea to check periodically that it is still recording.
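For example, on machine 2 (the path is whatever config/machine2bagdir.sh prints; shown here with the default from step 5):
watch -n 5 du -sh /pr/2/$(whoami)/bags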
The controls are as follows:
Left upper trigger: hold this down to enable the movement controls
Left lower trigger: hold this down (without holding down the upper trigger) to enable the recording controls
Movement controls: Left joystick steers, right joystick translates. All other controls are disabled to ensure that head height/pose are consistent throughout the dataset.
Recording controls: Use the symbol pad on the right, above the right joystick. Square=stop recording, circle=start recording. Button presses may take up to 5 seconds to register. During this time subsequent button presses will not be detected.
If the robot doesn't respond to the joystick at all, you may need to press the button with the Playstation logo in between the two joysticks. I think this activates the wireless.
10. Localize the robot. Launch rviz using this file:
ros/ros-pkg/demos/2d_nav_pr2/rviz/rviz_move_base.launch
Use rviz to set the initial pose of the robot. It's probably a good idea to drive the robot around a little and make sure it's working. In the future nav_view may provide a lighter-weight alternative to rviz, but as of this writing it doesn't work.
Information for maintainers
Read the other tutorials before reading this.
First off, notes on things that are known to be unimplemented or wrong:
1. Tests
  - As mentioned in the tutorial on running the tests, a few features remain unimplemented.
  - There are still some issues with bringing down everything; in particular, collect_data.sh really doesn't want to be killed.
2. Bag file location
  - As mentioned in the tutorial on running collect_data.sh, the bag file location will need to change after the removable drives on pre/prg become available. Right now everything is sent to the home directory on machine 2.
3. Calibration
  - There is a separate setup for gathering calibration data (it doesn't run the nav stack, for instance) that hasn't been maintained nearly as much as the main data collection stuff. I have no idea if it works. Mostly it is a branch of the main data collection stuff with some settings changed. This is an instance of it being a pain that launch files aren't really an object-oriented programming system.
  - No one has ever calibrated the cameras to the tilt laser, so no one knows if this setup captures good enough data to do that. Vijay thinks it will be necessary to slow the laser down.
Brief description of the components of the data collection setup:
scripts/collect_data.sh: Creates the necessary bag files and launches data_collector_components/person_data.launch
data_collector_components/person_data.launch: This is where all the real functionality is brought up. Launches joylistener, headhack, tuckarm, data_collector_record.launch, truly_passive.launch, data_collector_components/2dnav_pr2.launch and data_collector_components/teleop_joystick.launch
joylistener: compiles to bin/joylistener, source is src/joylistener.cpp. Listens to topic /joy. When the start/stop recording buttons are pressed it runs
data_collector_components/startRecording.sh or stopRecording.sh
startRecording.sh/stopRecording.sh: all these do is run startRecording.launch and stopRecording.launch
data_collector_record.launch: sets up all of the rosrecord nodes. Rather than directly recording the relevant topics, they record muxed versions of them. The muxes can be used to hide a topic when it shouldn't be recorded.
startRecording/stopRecording.launch: Launches the necessary switch_muxes to change the mux nodes controlling all of the recorded topics. Rosrecord is actually always running; these launch files just control whether messages reach it or not.
data_collector_components/teleop_joystick.launch: this is a branch of pr2_alpha/teleop_ps3.launch. (Yet another place where it would be nice if launch files could actually interact somehow) It has been hacked in two ways: 1. the head controllers have been removed; if left in they will fight with headhack.py. 2. all buttons relating to the head or torso have been remapped to button 500, which doesn't exist and therefore can't be pressed. This is so that the user can't accidentally mess up the head height or pose, and the head height/pose will be consistent throughout the dataset.
data_collector_components/2dnav_pr2.launch: this is a branch of 2dnav_pr2/2dnav_pr2.launch. It has been hacked not to include teleop, since that would fight with my own teleop.
base_odom.xml: This is a sub-file of 2dnav_pr2.launch which has had the teleop removed.
Any files with "dummy" in their name are used for simulating the recording system offline (i.e., without a robot). They don't simulate the whole recording setup, but only simulate the joystick record/pause recording functionality.
Brief description of the components of the test system:
scripts/test.sh: This is the main entry point to the test system. It simply calls other test scripts and checks their exit status. If any of them have a nonzero exit status, it will too. It also prints the names of the tests that fail. Currently the only test is test_components/test_joystick.sh
test_components/test_joystick.sh: See the tutorial on running the tests to get an idea of its functionality. It works by calling roscore, roslaunch, and collect_data.sh, and storing the output of ls/du on the bags folder. Files created after the script started running have their sizes checked to make sure they're behaving properly. Uses sizecheck.py to do some of the checks and sequential_kill.sh to bring everything down. An important aspect of this test is that it will trap Ctrl-C and respond by killing everything it brought up. This is so you can stop the robot from mashing its arms together if tuckarm goes awry. Without the signal trapping, Ctrl-C-ing the script would leave up all the nodes it brought up, and at that point your best bet would be to run killall -u whoami on machine 1 and machine 2.
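The trap idiom it relies on looks roughly like this (a sketch, not the actual script; PIDS is a hypothetical array holding the processes the script spawned):
# on Ctrl-C, bring down everything we started instead of orphaning it
trap 'test_components/sequential_kill.sh "${PIDS[@]}"; exit 1' INT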
test_components/sequential_kill.sh: Given a list of pids, goes through and kills them in order, waiting for each one to die before proceeding to the next. Uses SIGINT to kill so that bash scripts can trap the signal and kill things that they spawned. The trapping and killing is necessary so that the kill signal actually propagates to the ROS nodes that were brought up. The waiting is so that you don't, for example, kill the core while the roslaunches are still up, because that results in roslaunch hanging instead of dying.
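The core of that behavior can be sketched in a few lines of bash (the real script may differ):
# send SIGINT to each pid in order and wait for it to exit before moving on,
# so shutdown propagates cleanly through roslaunch and the core
for pid in "$@"; do
    kill -INT "$pid" 2>/dev/null
    while kill -0 "$pid" 2>/dev/null; do
        sleep 1
    done
done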
Other functionality:
- stereo_playback.launch: works by reading in environment variables. If you launch it without defining them it will yell at you and tell you to define them. See the tutorial on video playback for usage details.
- scripts/stereo_multi_playback.py: works by synthesizing a launch file and then running it. See the tutorial on video playback for usage details.
- scripts/chop_bags.py and plot_bags.py: both fairly simple; they work by using a Python iterator that reads through the bag file. See the tutorial on working with bag files for details.
Working with Bag Files
One useful script is scripts/plot_bags.py. It will generate a plot that shows when each topic was recorded over time. The results should show all of the topics stopping and starting at the same time (when the joystick stop and start record buttons are pressed). If you see one that never turns off, it means that topic is not being affected by joystick control. If you see one that turns off when the others are still on, that probably means its node died.
Usage pattern:
./plot_bags.py <list of input bags> -o <output file (octave script)>
Example invocation:
scripts/plot_bags.py /u/prdata/person_data/2009-08-11a/*.bag -o myplot.m
octave
Then, within octave:
myplot
(Octave is free software that implements a slight modification of the Matlab programming language. To get it, run sudo apt-get install octave.)
Another useful script is chop_bags.py. This will divide the bag into a set of smaller bags. Within each smaller bag, all messages will be present at all times. (The definition of "present" is publishing at a certain rate, which is hard-coded to different values for different topics). Note that this means that if one of your topics goes down for 5 minutes, those 5 minutes will be excluded from the results.
Usage pattern:
./chop_bags.py <list of input bags> -o <output directory>
Example invocation:
./chop_bags.py /u/prdata/person_data/2009-08-11b/* -o /u/prdata/person_data/2009-08-11b/chopped
Here's what the filesystem looks like after that command is done:
goodfellow@bwi:/u/prdata/person_data/2009-08-11b$ ls chopped/
scene_0  scene_1  scene_10  scene_11  scene_12  scene_13  scene_14  scene_15  scene_16  scene_2  scene_3  scene_4  scene_5  scene_6  scene_7  scene_8  scene_9
goodfellow@bwi:/u/prdata/person_data/2009-08-11b$ ls chopped/scene_0
laser__2009-08-11-11-28-22-topic.bag
mech_state_2009-08-11-11-28-22-topic.bag
narrow_stereo__2009-08-11-11-28-29-topic.bag
state__2009-08-11-11-28-22-topic.bag
tf__2009-08-11-11-28-22-topic.bag
wide_stereo__2009-08-11-11-28-27-topic.bag