LoCoBot

The LoCoBot is a mobile manipulator developed at Carnegie Mellon University and designed to run Facebook AI's PyRobot. PyRobot is an open-source, lightweight, high-level interface on top of the Robot Operating System (ROS). It provides a consistent set of hardware-independent mid-level APIs to control different robots. PyRobot abstracts away details about low-level controllers and inter-process communication so users can focus on building high-level AI robotics applications.
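
As an illustration of that abstraction, a minimal PyRobot session looks roughly like the sketch below. The method names follow the PyRobot documentation and are assumptions here, not commands taken from this page.

from pyrobot import Robot

# One object exposes the base, arm, gripper, and camera of the LoCoBot.
robot = Robot('locobot')

# Drive the base, home the arm, and grab a camera frame through the same interface.
robot.base.go_to_relative([0.5, 0.0, 0.0])  # x [m], y [m], yaw [rad]
robot.arm.go_home()
rgb = robot.camera.get_rgb()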

The LoCoBot is built on the Yujin Robot Kobuki Base (YMR-K01-W1) and powered by the Intel NUC NUC7i5BNH Mini PC. The platform also comes with the WidowX 200 Mobile Manipulator and an Intel® RealSense™ Depth Camera D435. The LoCoBot comes partially assembled and is designed so users can be up and running as quickly as possible.

Assembly Instructions: http://support.interbotix.com/html/assembly/locobot/index.html#locobot-assembly

Github: https://github.com/Interbotix/pyrobot

For technical questions, please contact trsupport@trossenrobotics.com.


Features

  • Intel RealSense Depth Camera D435
  • Accelerometer/gyro/compass
  • Edge-detection and bumper sensors
  • Kobuki mobile base
  • WidowX 200 mobile robot arm
  • Intel NUC NUC7i5BNH
  • Intel dual-core i5-7260U, 2.20 GHz (up to 3.4 GHz)
  • 8 GB RAM
  • 250 GB SSD
  • 802.11ac WiFi / Bluetooth 4.0

Arm Specifications

  • 5 degrees of freedom
  • 550 mm reach
  • ~1 mm accuracy
  • 200 g working payload

Examples:

Workspace Setup to Run Examples

  1. After ensuring that the Kobuki base, battery, and computer are all turned on, wait until the ‘locobot-hotspot’ network appears on your personal computer.

  2. Connect to that network (the password is ‘locobot-hotspot’) and SSH into the LoCoBot computer by typing ‘ssh -X locobot@locobot.local’. Once logged in, type ‘gnome-terminal &’ to bring up a terminal from the LoCoBot computer on your personal computer. This way, you can open as many terminals as you want without having to SSH each time.

  3. In one terminal, type ‘roslaunch locobot_control main.launch use_base:=true use_arm:=true use_camera:=true use_rviz:=false’. Some of the examples also require running the launch file with ‘use_vslam:=true’; in that case, Ctrl-C the launch file and rerun it with the additional argument.

  4. In a second terminal, type ‘source ~/pyenv_pyrobot/bin/activate’ and hit Enter. Then navigate to the low_cost_ws/src/pyrobot/examples/locobot directory and run any of the examples below. Additional documentation for the examples can be found at https://pyrobot.org/docs/overview under ‘Locobot Examples’.

Navigation:

base_position_control.py

Command the mobile base to smoothly travel to a desired 2D pose relative to its current position and orientation. For example, the command below will move the robot to a position 1 meter along the X and Y axes with an orientation of 1.57 radians relative to its current pose.

python navigation/base_position_control.py --base_planner none --base_controller ilqr --smooth --close_loop --relative_position 1.,1.,1.57 --botname locobot
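
For reference, the script drives the base through PyRobot's base interface; a minimal sketch of the equivalent call (keyword and config names follow the PyRobot docs and are assumed, not taken from this page) is:

from pyrobot import Robot

# Select the ILQR base controller, matching the command above.
robot = Robot('locobot', base_config={'base_controller': 'ilqr'})

# Move 1 m in x, 1 m in y, and rotate 1.57 rad relative to the current pose.
robot.base.go_to_relative([1.0, 1.0, 1.57], smooth=True, close_loop=True)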

base_position_control_with_map.py

Using SLAM, the robot moves to a desired 2D pose relative to its current position. SLAM creates a map as the robot moves and helps it avoid obstacles as it travels to its destination. Note that you must run the launch file with ‘use_vslam’ for this to work.

python navigation/base_position_control_with_map.py
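
In PyRobot terms this corresponds roughly to the same relative-pose call with the map enabled; the use_map flag below is assumed from the PyRobot docs:

from pyrobot import Robot

robot = Robot('locobot')

# With the launch file started with use_vslam:=true, the SLAM map is used
# to plan an obstacle-avoiding path to the relative goal pose.
robot.base.go_to_relative([1.0, 0.0, 0.0], use_map=True, close_loop=True, smooth=False)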

base_trajectory_tracking.py

Command the mobile base to smoothly follow one of two types of trajectories (circular or ‘S’ shaped). For example, the command below will move the robot along an ‘S’ shaped trajectory.

python navigation/base_trajectory_tracking.py --noclose_loop --type twocircles --botname locobot

base_velocity_control.py

Command the mobile base to travel at the specified linear [m/s] and angular [rad/s] velocities for a specified amount of time [s]. For example, the command below will move the robot with a linear speed of 0.2 m/s and an angular speed of 0.5 rad/s for 5 seconds.

python navigation/base_velocity_control.py --base_planner none --base_controller ilqr --botname locobot --linear_speed 0.2 --angular_speed 0.5 --duration 5.0
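
The underlying PyRobot call is a timed velocity command; a rough sketch (method and keyword names per the PyRobot docs, assumed here) is:

from pyrobot import Robot

robot = Robot('locobot')

# Drive at 0.2 m/s forward and 0.5 rad/s yaw for 5 seconds, then stop.
robot.base.set_vel(fwd_speed=0.2, turn_speed=0.5, exe_time=5.0)
robot.base.stop()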

camera_control.py

Command the camera's pan-and-tilt motors through a series of 5 poses, showing a color and depth image at each pose.

python navigation/camera_control.py
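
Pan-and-tilt control goes through PyRobot's camera interface; a short sketch (pan/tilt values are illustrative, method names assumed from the PyRobot docs) is:

from pyrobot import Robot

robot = Robot('locobot')

# Point the camera (pan and tilt in radians), then grab the color and depth frames.
robot.camera.set_pan_tilt(0.4, 0.6)
rgb = robot.camera.get_rgb()
depth = robot.camera.get_depth()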

camera_image.py

Capture a color and depth image with the RealSense camera at its current pose and display them to the user.

python navigation/camera_image.py
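
A minimal sketch of grabbing and displaying the pair (assuming the camera accessors in the PyRobot docs) is:

import matplotlib.pyplot as plt
from pyrobot import Robot

robot = Robot('locobot')

# Color image as an RGB array and depth image at the current camera pose.
rgb, depth = robot.camera.get_rgb_depth()
plt.subplot(1, 2, 1); plt.imshow(rgb)
plt.subplot(1, 2, 2); plt.imshow(depth)
plt.show()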

slam_get_pose.py

Using visual SLAM, the camera’s pose is calculated and displayed to the user. The robot then moves at 0.1 m/s for 1 second, at which point the camera’s pose is recalculated and displayed. Note that before executing this script, you must run the launch file with the ‘use_vslam’ argument set to True. Also note that this script works best if you are working directly on the LoCoBot computer rather than SSH’d into it.

python navigation/slam_get_pose.py

vis_3d_map.py

Using visual SLAM, a 3D reconstruction of the world seen by the camera is built and visualized with Open3D. Note that before executing this script, you must run the launch file with the ‘use_vslam’ argument set to True. Also note that this script works best if you are working directly on the LoCoBot computer rather than SSH’d into it.

python navigation/vis_3d_map.py

Manipulation:

ee_pose_control.py

Commands the end-effector of the robot arm to two desired poses before commanding the arm to go to its ‘Home’ pose (where all joints are set to 0 radians). Orientation is specified as either a rotation matrix or quaternion.

python manipulation/ee_pose_control.py
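
In PyRobot this is a single end-effector pose command; the sketch below uses illustrative position and quaternion values, with the method name assumed from the PyRobot docs:

import numpy as np
from pyrobot import Robot

robot = Robot('locobot')

# Illustrative target: position in meters, orientation as a quaternion (x, y, z, w).
position = np.array([0.28, 0.17, 0.22])
orientation = np.array([0.5, 0.5, 0.5, 0.5])
robot.arm.set_ee_pose(position=position, orientation=orientation, plan=True)

# Return to the Home pose (all joints at 0 radians).
robot.arm.go_home()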

ee_pose_pitch_control.py

Commands the end-effector of the robot arm to two desired poses before commanding the arm to go to its ‘Home’ pose (where all joints are set to 0 radians). Orientation is specified using Euler angles.

python manipulation/ee_pose_pitch_control.py

ee_xyz_control.py

Commands the end-effector of the robot arm to move to a desired xyz position (relative to its current position) while maintaining its current orientation. In the example below, the end-effector moves -0.15 meters along the Z axis.

python manipulation/ee_xyz_control.py
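
The relative Cartesian move maps onto a single PyRobot call; a sketch (method name assumed from the PyRobot docs) is:

import numpy as np
from pyrobot import Robot

robot = Robot('locobot')

# Displace the end-effector 0.15 m along negative Z while keeping its orientation.
displacement = np.array([0.0, 0.0, -0.15])
robot.arm.move_ee_xyz(displacement, plan=True)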

gripper_control.py

Opens, closes, then reopens the gripper.

python manipulation/gripper_control.py
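
A minimal sketch of the gripper sequence (assuming the open/close methods in the PyRobot docs) is:

import time
from pyrobot import Robot

robot = Robot('locobot')

# Open, close, and reopen the gripper, pausing briefly between commands.
robot.gripper.open()
time.sleep(1.0)
robot.gripper.close()
time.sleep(1.0)
robot.gripper.open()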

joint_position_control.py

Commands the robot arm joints to two sets of joint values [rad] before commanding the arm to go to its ‘Home’ pose (where all joints are set to 0 radians).

python manipulation/joint_position_control.py
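
A sketch of the equivalent PyRobot calls (joint values are illustrative; method names assumed from the PyRobot docs) is:

from pyrobot import Robot

robot = Robot('locobot')

# Two illustrative 5-DOF joint targets in radians, then back to the Home pose.
for joints in ([0.4, 0.7, -0.5, -1.4, 0.9], [-0.4, 0.2, 0.3, -1.0, -0.9]):
    robot.arm.set_joint_positions(joints, plan=False)
robot.arm.go_home()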

joint_torque_control.py

Commands the first four of the five joints on the robot arm to the specified torque values. The default values are very low, so make sure the arm is in its rest pose before executing this script. Also note that you must run the launch file with ‘torque_control’ set to True before running this script.

python manipulation/joint_torque_control.py

moveit_planning.py

Using a specified MoveIt planner, the script commands two sets of joint values to the robot arm before commanding it to its ‘Home’ pose (where all joints are set to 0 radians).

python manipulation/moveit_planning.py
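
In PyRobot, planning through MoveIt is typically selected per command; the sketch below assumes the plan=True convention described in the PyRobot docs, with illustrative joint values:

from pyrobot import Robot

robot = Robot('locobot')

# plan=True routes the joint target through MoveIt motion planning
# instead of a direct joint command.
robot.arm.set_joint_positions([0.4, 0.7, -0.5, -1.4, 0.9], plan=True)
robot.arm.go_home()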

pushing.py

With a small object placed within reach of the robot arm, the camera calculates the pose of the object and then uses the gripper part of the arm to push the object a few centimeters. An example command is seen below.

python manipulation/pushing.py --floor_height=0.03 --gripper_len=0.12

realtime_point_cloud.py

Before running the ‘pushing.py’ script, run this script to determine the best value to use for the ‘floor_height’ argument. This script creates a 3D point cloud via Open3D and filters out data based on that argument: ‘floor_height’ should be large enough that data points from the floor are filtered out, but small enough that the object to be pushed is not. An example command is seen below.

python manipulation/realtime_point_cloud.py --floor_height=0.03

Report a Bug

Use GitHub to report bugs or submit feature requests: https://github.com/Interbotix/pyrobot/issues
