Only released in EOL distros:  

turtlebot_apps: pano_core | pano_py | pano_ros | turtlebot_actions | turtlebot_calibration | turtlebot_core_apps | turtlebot_follower | turtlebot_navigation | turtlebot_panorama | turtlebot_teleop

Package Summary

The ROS frontend to the pano subsystem.

Panorama capture with ROS

pano_ros exposes a set of tools and small ROS nodes that make it convenient to acquire data for the pano_core stitching library.

Through the use of actionlib, pano_ros can capture bag files that may then be stitched asynchronously or completely offline.

Pano quickstart (using a Kinect and openni_camera)

roslaunch openni_camera openni_kinect.launch 

Spin up the capture server.

rosrun pano_ros capture_server.py camera:=/camera/rgb

Monitor the stitch output with image_view

rosrun image_view image_view image:=/pano_capture/stitch

Try the time-interval-based capture client. Slowly pan around; a photo will be taken every second, and capture stops after 10 photos.

rosrun pano_ros capture_client.py `pwd`/pano01.bag 10 1

You should see the stitch appear in the image_view window. Click on it to save.

Email the stitch to ros@botview.appspotmail.com as an attachment. You will receive a link like the following: http://botview.appspot.com/viewer/pan0.swf?panoSrc=/media/3736001&FOV=52

capture_server

The capture server implements an action interface for writing camera images to a bag file. This bag file can then be passed to tools like stitch_client.py. Sending an action goal starts the capture process. You can then publish to the pano_capture/snap topic to capture another image to the bag, and to the pano_capture/stop topic to stop the capture. The action completes when stop is called and returns the name of the bag file.

NOTE: the capture process may capture fewer images than requested via snap. This generally means that snap is being called faster than synchronized images can be acquired.

rosrun pano_ros capture_server.py camera:=/kinect/rgb
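For reference, here is a minimal sketch of driving the server from a plain Python script, built on the same CaptureClientInterface that the timed capture client below uses. The node name, the bag path and the five-snap loop are arbitrary, error handling is omitted, and the capture server above must already be running:

import roslib; roslib.load_manifest('pano_ros')
import rospy
from pano_ros.capture_client_interface import CaptureClientInterface

rospy.init_node('capture_once')
capture = CaptureClientInterface()
#start a capture, giving an absolute path to the bag that will be written
capture.start('/tmp/pano_once.bag')
for i in range(5):
    capture.snap()       #ask the server to add one more image to the bag
    rospy.sleep(1.0)     #give yourself time to move the camera
result = capture.stop()  #finish up; returns a PanoCaptureResult
if result:
    print('captured %d images into %s' % (result.n_captures, result.bag_filename))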

Testing in rosh

You'll need two rosh interpreters.

Interpreter 1:

  • In [1]: actions.pano_capture('camera/rgb')

Interpreter 2 (run snap as many times as you like):

  • In [1]: topics.pano_capture.snap()
    
    In [2]: topics.pano_capture.snap()
    
    In [3]: topics.pano_capture.stop()

In the Interpreter 1 window, you should see:

  • In [1]: actions.pano_capture('camera/rgb')
    Out[1]: 
    pano_id: 1292382284
    n_captures: 6
    bag_filename: /tmp/pano_1292382284.bag
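If you prefer not to use rosh, the same snap/stop topics can be driven from an ordinary rospy script. The sketch below assumes the topics carry std_msgs/Empty messages (check with rostopic type pano_capture/snap if unsure), and, just like interpreter 2 above, it only works while a capture action is already running:

import roslib; roslib.load_manifest('pano_ros')
import rospy
from std_msgs.msg import Empty  #assumed message type for the snap/stop topics

rospy.init_node('pano_capture_poker')
snap = rospy.Publisher('pano_capture/snap', Empty)
stop = rospy.Publisher('pano_capture/stop', Empty)
rospy.sleep(1.0)       #give the publishers time to connect

snap.publish(Empty())  #capture an image
rospy.sleep(2.0)
snap.publish(Empty())  #capture another
rospy.sleep(2.0)
stop.publish(Empty())  #stop the capture and finalize the bag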

capture_client

The sample client takes the specified number of images, one per second.

rosrun pano_ros capture_client.py `pwd`/pano.bag 30

This will save all the data needed for stitching in `pwd`/pano.bag. Note the use of `pwd`/ so that the bag name is an absolute path.
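If you want to check what actually ended up in the bag before stitching, a short rosbag script will do. This sketch simply counts messages per topic (the topic names in the bag depend on your camera remapping, so none are assumed here):

import rosbag

counts = {}
bag = rosbag.Bag('pano.bag')  #path to the capture bag written above
for topic, msg, stamp in bag.read_messages():
    counts[topic] = counts.get(topic, 0) + 1
bag.close()

for topic in sorted(counts):
    print('%s: %d messages' % (topic, counts[topic]))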

The following implements the simple time-based capture client:

import roslib; roslib.load_manifest('pano_ros')
import rospy
import sys
from pano_ros.msg import PanoCaptureResult
from pano_ros.capture_client_interface import CaptureClientInterface


class TimedCapture(object):
    """
    Uses the CaptureClientInterface to take photos
    regularly at a time interval and will snap until
    the desired number of frames are taken.
    """
    def __init__(self):
        #CaptureClientInterface does all of the setup
        #for connecting to the pano_capture server
        self._capture = CaptureClientInterface()

    def capture(self, bag_filename, total_captures=20, time_interval=1):
        """
        This is a timed interval capture.
        total_captures  the number of images to take
        time_interval   attempt to take an image at this
                        time interval, specified in seconds
        """
        capture = self._capture
        #start up the capture client, giving the absolute path to a
        #bag where the data will be stored
        capture.start(bag_filename)
        result = PanoCaptureResult()
        try:
            #capture total_captures images,
            #one every time_interval seconds
            while (capture.n_captures < total_captures
                   and not rospy.is_shutdown()):
                #request a snap
                capture.snap()
                #sleep for time_interval seconds to allow the camera to be moved
                rospy.sleep(time_interval)
        finally:
            tresult = capture.stop()
            result = tresult if tresult else result
        #this will return the PanoCaptureResult message
        return result
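The snippet above only defines the class. Appended below is a sketch of the kind of __main__ block that could drive it, mirroring the bag/count/interval arguments used in the rosrun invocations earlier; the argument handling here is illustrative, not the shipped capture_client.py code:

if __name__ == '__main__':
    rospy.init_node('timed_capture_client')
    #first argument: absolute path of the bag to write
    #optional second and third arguments: number of captures and
    #the interval between them in seconds
    bag_filename = sys.argv[1]
    total_captures = int(sys.argv[2]) if len(sys.argv) > 2 else 20
    time_interval = float(sys.argv[3]) if len(sys.argv) > 3 else 1.0
    result = TimedCapture().capture(bag_filename, total_captures, time_interval)
    print('wrote %d captures to %s' % (result.n_captures, result.bag_filename))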

Bag Stitcher

bag_stitcher.py is an actionlib server that stitches the bags produced by the capture server. Just start it up:

rosrun pano_ros bag_stitcher.py

Bag Stitcher Client

Now, to stitch, run a client that uses actionlib to talk to the bag_stitcher. A very simple one is already implemented:

rosrun pano_ros stitch_client.py test_pano.bag stitch.jpg
eog stitch.jpg

This will take the bag generated by the capture client and turn it into a beautiful spherical panorama.

Try emailing the pano as an attachment to ros@botview.appspotmail.com if you would like to see it projected onto a 3D sphere in your web browser.

Not supported in fuerte.
