
Point Cloud Streaming from a Kinect

Description: This tutorial shows you how to stream and visualize a point cloud from a Kinect camera to the browser using ros3djs.

Keywords: ros3djs, web interface, javascript, Robot Web Tools, depthcloudjs, depthcloud

Tutorial Level: BEGINNER

DepthCloud Example

In this tutorial, we show how to stream and visualize a point cloud from a Kinect camera in the browser. To begin, check out the example code from ros3djs.

The HTML Code

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />

<script type="text/javascript" src="../include/threejs/three.js"></script>
<script type="text/javascript" src="../include/EventEmitter2/eventemitter2.js"></script>
<script type="text/javascript" src="../include/roslibjs/roslib.js"></script>
<script type="text/javascript" src="../build/ros3d.js"></script>

<script type="text/javascript" type="text/javascript">
  /**
   * Setup all visualization elements when the page is loaded.
   */
  function init() {
    // Connect to ROS.
    var ros = new ROSLIB.Ros({
      url : 'ws://localhost:9090'
    });

    // Create the main viewer.
    var viewer = new ROS3D.Viewer({
      divID : 'viewer',
      width : 800,
      height : 600,
      antialias : true
    });

    // Setup a client to listen to TFs.
    var tfClient = new ROSLIB.TFClient({
      ros : ros,
      angularThres : 0.01,
      transThres : 0.01,
      rate : 10.0,
      fixedFrame : '/camera_link'
    });

    // Setup Kinect DepthCloud stream
    var depthCloud = new ROS3D.DepthCloud({
      url : 'http://'+window.location.hostname + ':9999/streams/depthcloud_encoded.webm',
      f : 525.0
    });
    depthCloud.startStream();

    // Create Kinect scene node
    var kinectNode = new ROS3D.SceneNode({
      frameID : '/camera_rgb_optical_frame',
      tfClient : tfClient,
      object : depthCloud
    });
    viewer.scene.add(kinectNode);
  }
</script>
</head>

<body onload="init()">
  <h1>Simple DepthCloud Example</h1>
  <p>Run the following commands in the terminal, then go to http://localhost:9999/examples/depthcloud.html*.</p>
  <ol>
    <li><tt>roscore</tt></li>
    <li><tt>roslaunch rosbridge_server rosbridge_websocket.launch</tt></li>
    <li><tt>rosrun tf2_web_republisher tf2_web_republisher</tt></li>
    <li><tt>roslaunch openni_launch openni.launch depth_registration:=true</tt></li>
    <li><tt>rosrun ros_web_video ros_web_video _port:=9999 _framerate:=15 _bitrate:=250000 _profile:=best _www_file_server:=true _wwwroot:=<b>/path/to/ros3djs/</b></tt></li>
    <li><tt>rosrun depthcloud_encoder depthcloud_encoder_node _depth:=/camera/depth_registered/image_rect _rgb:=/camera/rgb/image_rect_color</tt></li>
  </ol>
  <small>*Due to a bug in current WebGL implementations, this file and the video stream cannot be served
  from different hosts or port numbers, so we need ros_web_video to serve the HTML file as well.
  If you use Apache, you can set it up to proxy port 9999 to a subdirectory.</small><br/>
  <div id="viewer"></div>
</body>
</html>

Code Explanation

Now that we have an example, let's look at each piece.

<script type="text/javascript" src="../include/threejs/three.js"></script>
<script type="text/javascript" src="../include/EventEmitter2/eventemitter2.js"></script>
<script type="text/javascript" src="../include/roslibjs/roslib.js"></script>
<script type="text/javascript" src="../build/ros3d.js"></script>

We first need to import all of the required JavaScript files. In this example they are loaded via relative paths from a ros3djs checkout; alternatively, they can be loaded from the Robot Web Tools CDN.

    var ros = new ROSLIB.Ros({
      url : 'ws://localhost:9090'
    });

Next, we need to create a Ros node object to communicate with a rosbridge v2.0 server. In this example, the script will connect to localhost on the default port of 9090.
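If you want feedback on the connection, roslibjs also lets you attach event listeners to the Ros object. The following snippet is an optional addition to init() and is not part of the original example:

    ros.on('connection', function() {
      console.log('Connected to the rosbridge websocket server.');
    });
    ros.on('error', function(error) {
      console.log('Error connecting to rosbridge: ', error);
    });
    ros.on('close', function() {
      console.log('Connection to rosbridge closed.');
    });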

    var viewer = new ROS3D.Viewer({
      divID : 'viewer',
      width : 800,
      height : 600,
      antialias : true
    });

We then need to create a 3D viewer. We provide the dimensions as well as the HTML div where the viewer will be placed.
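The viewer also exposes an addObject method for placing additional 3D content in the scene. As a small optional extra (not part of the original example), a grid can make it easier to judge the camera's orientation:

    // Optional: add a grid to the scene as a visual reference.
    viewer.addObject(new ROS3D.Grid());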

    var tfClient = new ROSLIB.TFClient({
      ros : ros,
      angularThres : 0.01,
      transThres : 0.01,
      rate : 10.0,
      fixedFrame : '/camera_link'
    });

We then create a TF client. This client will subscribe to changes in the TF tree and update the scene appropriately. We use this to rotate the optical frame of the Kinect camera, which has its Z axis pointing forward, into the standard ROS convention where X points forward.
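The scene node created below uses this client internally, but a frame can also be subscribed to directly. As an optional illustration (not part of the original example), the following logs the transform from the fixed frame to the Kinect's optical frame whenever it changes beyond the configured thresholds:

    tfClient.subscribe('/camera_rgb_optical_frame', function(tf) {
      console.log('Translation: ', tf.translation);
      console.log('Rotation: ', tf.rotation);
    });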

    var depthCloud = new ROS3D.DepthCloud({
      url : 'http://'+window.location.hostname + ':9999/streams/depthcloud_encoded.webm',
      f : 525.0
    });
    depthCloud.startStream();

This section of code creates the DepthCloud object. The url points to the encoded depth/color video stream produced by ros_web_video, and f is the focal length of the Kinect in pixels (approximately 525 at the default VGA resolution). Calling startStream creates a video buffer for the given URL and starts rendering a point cloud as soon as data arrives.
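If the stream is no longer needed, it can be stopped again; a minimal sketch, assuming your ros3djs build provides the stopStream counterpart to startStream:

    // Stop the underlying video stream, e.g. before tearing the page down.
    depthCloud.stopStream();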

    // Create Kinect scene node
    var kinectNode = new ROS3D.SceneNode({
      frameID : '/camera_rgb_optical_frame',
      tfClient : tfClient,
      object : depthCloud
    });

Finally, we need to add the DepthCloud to our scene and assign the right tf frame to it.
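The last line of init() in the full example above then attaches this node to the viewer's scene:

    viewer.scene.add(kinectNode);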

Running the Example

At this point we are ready to run the example. To do so, you will need to have rosbridge_server, tf2_web_republisher, openni_camera, ros_web_video, depth_image_proc and depthcloud_encoder installed. Refer to their respective Wiki pages for installation instructions, or simply install their latest builds from the Ubuntu repositories.

Simply launch the necessary nodes with the following:

    roscore
    roslaunch rosbridge_server rosbridge_websocket.launch
    rosrun tf2_web_republisher tf2_web_republisher
    roslaunch openni_launch openni.launch depth_registration:=true
    rosrun ros_web_video ros_web_video _port:=9999 _framerate:=15 _bitrate:=250000 _profile:=best _www_file_server:=true _wwwroot:=/path/to/ros3djs/
    rosrun nodelet nodelet standalone depth_image_proc/convert_metric image_raw:=/camera/depth_registered/image_raw image:=/camera/depth_registered/image_float
    rosrun depthcloud_encoder depthcloud_encoder_node _depth:=/camera/depth_registered/image_float _rgb:=/camera/rgb/image_rect_color

Finally, bring up the HTML page in a web browser to see the streamed point cloud.

Support

Please send bug reports to the GitHub Issue Tracker. Feel free to contact us at any point with questions and comments.
