http://www.luxai.org/assets/qtrobot-together.png

LuxAI QTrobot

QTrobot is a commercially available, toddler-like humanoid robot built by LuxAI S.A. It is a socially engaging, interactive robot with a wide range of applications. QTrobot is currently used for emotional training of children with autism, post-stroke rehabilitation, and elderly cognitive and physical rehabilitation.

QTrobot Interface

The QTrobot (LuxAI) interface provides access to the robot's basic functionality through a set of user-friendly ROS interfaces. Each functionality can be accessed in blocking or non-blocking mode using ROS publish/subscribe and service/client interfaces. For the time being, the following interfaces have been implemented:

  • Robot Emotion: implements the robot's facial emotions

  • Robot Speech: implements the robot's text-to-speech functionality

  • Robot Audio: implements a simple player for standard audio files

  • Robot Gesture: implements the robot's gesture control

  • Robot Behavior: implements more complex behaviors by combining the robot's basic functionalities

  • Robot Motor: implements the robot's motor controls using standard ros_control

  • Robot Setting: implements basic robot settings such as speaker volume control

For each of the above functionalities, there are two different sets of language-independent interfaces:

  1. A set of ROS subscribers that allow non-blocking calls to the interfaces
  2. A set of ROS services for blocking and more sophisticated calls

Naming convention

All QTrobot ROS interfaces carry the '/qt_robot/...' prefix. The word following the prefix is the name of the main service (e.g. '/qt_robot/speech/...'). Each interface may have further sub-services (methods), which come after the main service name. For example:

/qt_robot/speech/say    : implements say() method of QTrobot TTS
/qt_robot/speech/config : implements configure() method of QTrobot TTS

For the user's convenience, the service and the subscriber of each QTrobot ROS interface share the same name. That means you can access, for example, the speech functionality via a ROS service call or via publish/subscribe using the same interface name (e.g. '/qt_robot/speech/say'). Note that some of the more complex services (e.g. speech configuration for language, pitch, ...) are accessible only via ROS service call.

Accessing QTrobot interface from bash

You can access each robot functionality via its publish/subscribe or service/client interface. For example, to use the robot's 'Speech' functionality, try the following:

Using ROS Publisher

$ rostopic pub /qt_robot/speech/say std_msgs/String "data: 'I am QT'"

Using ROS Service

$ rosservice call /qt_robot/speech/say "message: 'I am QT.'"

Accessing QTrobot interface from a python script

[Non-blocking mode] The following example shows how to access the QTrobot Speech functionality using the ROS publish/subscribe method in Python:

import rospy
from std_msgs.msg import String

# create a publisher
speechSay_pub = rospy.Publisher('/qt_robot/speech/say', String, queue_size=10)
...
# publish a text message to TTS (non-blocking)
speechSay_pub.publish("Hello! I am QT!")

[Blocking mode] The following example re-implements the previous one using the ROS service/client method in Python:

import rospy
from qt_robot_interface.srv import *

# create a service client
speechSay = rospy.ServiceProxy('/qt_robot/speech/say', speech_say)
...
# call the service (blocking)
resp = speechSay("Hello! I am QT.")

/!\ Notice: All QTrobot service interfaces return the status of the call upon success or failure. For example, to check whether a service call was successful in Python, check the return value 'resp.status'.

List of available interfaces

Currently the following interfaces have been implemented:

| Functionality | Interface prefix       | Description                                                       |
|---------------|------------------------|-------------------------------------------------------------------|
| Speech        | /qt_robot/speech/...   | robot text-to-speech functionality                                |
| Audio         | /qt_robot/audio/...    | simple standard audio file player                                 |
| Emotion       | /qt_robot/emotion/...  | robot facial emotion                                              |
| Gesture       | /qt_robot/gesture/...  | robot gesture control                                             |
| Behavior      | /qt_robot/behavior/... | more complex behaviors combining the robot's basic functionality  |
| Motor         | /qt_robot/motor/...    | robot motor controls using standard ros_control                   |
| Setting       | /qt_robot/setting/...  | basic robot settings such as speaker volume control               |


Speech Interface

This interface implements the QTrobot text-to-speech functionality, which supports multiple languages. The supported languages can differ from robot to robot. The following are some of the standard supported languages:

  • en-US (American English)
  • en-GB (British English)
  • fr-FR (French)
  • de-DE (German)

Subscribers

| Interface Name       | Data Type                | Description                    |
|----------------------|--------------------------|--------------------------------|
| /qt_robot/speech/say | 'std_msgs/String' (text) | Read a text using built-in TTS |

Services

| Interface Name          | Service Name    | Parameters                   | Description                    |
|-------------------------|-----------------|------------------------------|--------------------------------|
| /qt_robot/speech/say    | 'speech_say'    | 'message'                    | Read a text using built-in TTS |
| /qt_robot/speech/config | 'speech_config' | 'language', 'pitch', 'speed' | Configure TTS                  |

(!) The default language is set in '/opt/ros/kinetic/share/qt_robot_interface/config/qtrobot-interface.yaml'. The default pitch is usually '140' and the default speed '80'.

(!) When calling 'speech_config', leave the pitch and speed parameters at '0' if you do not want to change them.


Audio Interface

Play a standard audio file (e.g. wav, mp3). The audio file can be given either by its complete name (with the file extension), such as happy.mp3, or simply without the file extension: happy. In the second case, the player first looks for happy.mp3 to play; if that is not found, it tries happy.wav, and so on.

(!) The default path for the audio files is '~/robot/data/audios/'
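The lookup rule described above can be sketched in plain Python (illustrative only; the exact extension order tried by the player is an assumption, and only '.mp3' and '.wav' are shown):

```python
import os

# assumed extension search order for names given without an extension
EXTENSIONS = ['.mp3', '.wav']

def resolve_audio(name, path='~/robot/data/audios/'):
    """Return the candidate file paths tried for a given audio name."""
    base, ext = os.path.splitext(name)
    if ext:
        # complete name given, e.g. 'happy.mp3': play it directly
        return [os.path.join(path, name)]
    # no extension given: try 'happy.mp3', then 'happy.wav', and so on
    return [os.path.join(path, base + e) for e in EXTENSIONS]
```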

Subscribers

| Interface Name       | Data Type                           | Description        |
|----------------------|-------------------------------------|--------------------|
| /qt_robot/audio/play | 'std_msgs/String' (audio file name) | Play an audio file |

Services

| Interface Name       | Service Name | Parameters             | Description                                            |
|----------------------|--------------|------------------------|--------------------------------------------------------|
| /qt_robot/audio/play | 'audio_play' | 'filename', 'filepath' | Play an audio file given by its filename and filepath  |

(!) To play the audio file from the default path, pass an empty string to filepath param.


Emotion Interface

Change the robot facial emotions such as 'ava_happy', 'ava_sad', etc.

Subscribers

| Interface Name         | Data Type                        | Description                             |
|------------------------|----------------------------------|-----------------------------------------|
| /qt_robot/emotion/show | 'std_msgs/String' (emotion name) | Show a facial emotion given by its name |

Services

| Interface Name         | Service Name   | Parameters | Description                             |
|------------------------|----------------|------------|-----------------------------------------|
| /qt_robot/emotion/show | 'emotion_show' | 'name'     | Show a facial emotion given by its name |

(!) The complete list of emotion files can be found in '~/robot/data/emotions/'.


Gesture Interface

Plays a recorded robot gesture (arms, head) such as 'happy', 'discust', etc. The complete list of gesture files can be found in '~/robot/data/gestures/'.

Subscribers

| Interface Name         | Data Type                        | Description                            |
|------------------------|----------------------------------|----------------------------------------|
| /qt_robot/gesture/play | 'std_msgs/String' (gesture name) | Play a robot gesture given by its name |

Services

| Interface Name           | Service Name     | Parameters           | Description |
|--------------------------|------------------|----------------------|-------------|
| /qt_robot/gesture/play   | 'gesture_play'   | 'name', 'speed'      | Play a robot gesture given by its name and speed (default 1.0) |
| /qt_robot/gesture/record | 'gesture_record' | 'parts', 'idleParts' | Start recording a new gesture. 'parts' is a string array of part names ('head', 'left_arm', 'right_arm') specifying which robot parts are used for recording the gesture. 'idleParts' must be set to 'true' to release the motor PWM and put the motors in idle mode; otherwise you must put them in idle mode yourself using the '/qt_robot/motors/setControlMode' interface |
| /qt_robot/gesture/save   | 'gesture_save'   | 'name', 'path'       | Stop the current recording and save the recorded gesture under the given 'name'. 'path' specifies where to save the gesture instead of the default path |
| /qt_robot/gesture/list   | 'gesture_list'   | <none>               | Return a list of the available gestures in the default gesture path |

(!) The default value for 'speed' is 1.0, which is the speed at which the gesture was recorded.

/!\ Notice: If the 'speed' parameter is 0, the default speed is used to play the gesture.

/!\ Notice: Default path to record/play the gesture is '~/robot/data/gestures/'.


Behavior Interface

This interface implements higher-level and more complex behaviors by combining the robot's basic functionalities.

Subscribers

| Interface Name               | Data Type                          | Description                                    |
|------------------------------|------------------------------------|------------------------------------------------|
| /qt_robot/behavior/talkText  | 'std_msgs/String' (message)        | Read a text using TTS and show talking emotion |
| /qt_robot/behavior/talkAudio | 'std_msgs/String' (audio filename) | Play an audio file and show talking emotion    |

Services

| Interface Name               | Service Name          | Parameters             | Description                                    |
|------------------------------|-----------------------|------------------------|------------------------------------------------|
| /qt_robot/behavior/talkText  | 'behavior_talk_text'  | 'message'              | Read a text using TTS and show talking emotion |
| /qt_robot/behavior/talkAudio | 'behavior_talk_audio' | 'filename', 'filepath' | Play an audio file and show talking emotion    |

/!\ Notice: The talkAudio and talkText services are mutually exclusive and cannot be used together with the Emotion interface at the same time.

(!) To play the audio file from the default path, pass an empty string to filepath param.


Motor Interface

The Motor interface provides access to the robot's actuators using the standard ros_control system. Currently the interface implements the ROS 'JointStateController' and 'JointGroupPositionController' controllers and a custom 'QTMotorsController' controller.

/!\ Notice: Before using the Motor interface, make sure that you fully understand the ros_control system and have a clear understanding of what you are doing at the motor joint level.

QTrobot parts

The robot joints are grouped into parts: 'head' (HeadYaw, HeadPitch), 'right_arm' (RightShoulderPitch, RightShoulderRoll, RightElbowRoll) and 'left_arm' (LeftShoulderPitch, LeftShoulderRoll, LeftElbowRoll).

Subscribers

| Interface Name                       | Data Type                  | Description |
|--------------------------------------|----------------------------|-------------|
| /qt_robot/joints/state               | sensor_msgs/JointState     | Publishes the joint states (currently only positions) |
| /qt_robot/head_position/command      | std_msgs/Float64MultiArray | Move the robot head to the desired position given by (HeadYaw, HeadPitch) |
| /qt_robot/right_arm_position/command | std_msgs/Float64MultiArray | Move the right arm to the desired position given by (RightShoulderPitch, RightShoulderRoll, RightElbowRoll) |
| /qt_robot/left_arm_position/command  | std_msgs/Float64MultiArray | Move the left arm to the desired position given by (LeftShoulderPitch, LeftShoulderRoll, LeftElbowRoll) |

Services

| Interface Name                  | Service Name       | Parameters           | Description |
|---------------------------------|--------------------|----------------------|-------------|
| /qt_robot/motors/home           | 'home'             | 'parts'              | Move the desired robot parts to the home position. 'parts' is an array of robot parts and/or single joint names (e.g. ['left_arm', 'right_arm', 'HeadPitch']) |
| /qt_robot/motors/setControlMode | 'set_control_mode' | 'parts'              | Set the control mode (M_ON=0, M_OFF=1, M_BRAKE=2) of the desired robot parts. 'parts' is an array of robot parts and/or single joint names (e.g. ['left_arm', 'right_arm', 'HeadPitch']). M_ON: motor is controlled; M_OFF: motor is idle; M_BRAKE: motor is in brake mode (not controlled) |
| /qt_robot/motors/setVelocity    | 'set_velocity'     | 'parts', 'velocity'  | Set the moving velocity of the desired robot parts. 'parts' is an array of robot parts and/or single joint names (e.g. ['left_arm', 'right_arm', 'HeadPitch']). 'velocity' is given as a percentage |

/!\ Notice: For safety purposes, every joint has a maximum velocity limit. For example, you cannot run the 'HeadPitch' joint at more than 20% of its maximum velocity.


Setting Interface

This interface provides some basic settings of the robot, such as speaker volume control.

Services

| Interface Name              | Service Name        | Parameters | Description                                                        |
|-----------------------------|---------------------|------------|--------------------------------------------------------------------|
| /qt_robot/setting/setVolume | 'setting_setVolume' | 'volume'   | Set the robot speaker volume to the desired level (in percentage)  |

Examples

Some examples of using the QTrobot interfaces.


Reading joint positions

import rospy
from sensor_msgs.msg import JointState

def joint_states_callback(msg):
    strmsg = ""
    for i, joint_name in enumerate(msg.name):
        strmsg += "%s: %.2f, " % (joint_name, msg.position[i])
    rospy.loginfo(strmsg)

if __name__ == '__main__':
    rospy.init_node('joints_example')
    rospy.Subscriber('/qt_robot/joints/state', JointState, joint_states_callback)

    rospy.spin()


Commanding motors

import rospy
from std_msgs.msg import Float64MultiArray

if __name__ == '__main__':
    rospy.init_node('joints_example')
    right_pub = rospy.Publisher('/qt_robot/right_arm_position/command', Float64MultiArray, queue_size=1)

    # wait until the publisher is connected, otherwise the first
    # message may be dropped
    while right_pub.get_num_connections() == 0:
        rospy.sleep(0.1)

    ref = Float64MultiArray()
    RightShoulderPitch = 0
    RightShoulderRoll = 0
    RightElbowRoll = -10
    ref.data = [RightShoulderPitch, RightShoulderRoll, RightElbowRoll]
    right_pub.publish(ref)

    rospy.spin()


Using the speech, audio, emotion, ... interfaces

The following Python example demonstrates how to use each interface using ROS publish/subscribe and service/client calls.

#!/usr/bin/env python
import sys
import rospy
from std_msgs.msg import String
from qt_robot_interface.srv import *
from qt_gesture_controller.srv import *

# the following activities run in parallel on the robot,
# with no execution order
def publishAllConcurrent():
    audioPlay_pub.publish("Qt2.wav")
    speechSay_pub.publish("Hello! This is QT talking using text to speech")
    gesturePlay_pub.publish("happy")
    emotionShow_pub.publish("ava_happy")
    behaviorTalkText_pub.publish("I am QT robot! ")
    behaviorTalkAudio_pub.publish("Qt3.wav")

# the following activities run in sequence on the robot,
# one after another
def callAllSequence():
    audioPlay("Qt2.wav", "")
    speechSay("Hello! This is QT talking using text to speech")
    gesturePlay("happy", 0)
    emotionShow("ava_happy")
    behaviorTalkText("I am QT robot!")
    behaviorTalkAudio("Qt3.wav", "")


if __name__ == '__main__':
    rospy.init_node('python_qt_example')

    # create the publishers
    speechSay_pub = rospy.Publisher('/qt_robot/speech/say', String, queue_size=10)
    audioPlay_pub = rospy.Publisher('/qt_robot/audio/play', String, queue_size=10)
    emotionShow_pub = rospy.Publisher('/qt_robot/emotion/show', String, queue_size=10)
    gesturePlay_pub = rospy.Publisher('/qt_robot/gesture/play', String, queue_size=10)
    behaviorTalkText_pub = rospy.Publisher('/qt_robot/behavior/talkText', String, queue_size=10)
    behaviorTalkAudio_pub = rospy.Publisher('/qt_robot/behavior/talkAudio', String, queue_size=10)

    # wait for publisher/subscriber connections
    wtime_begin = rospy.get_time()
    while (speechSay_pub.get_num_connections() == 0 or
           audioPlay_pub.get_num_connections() == 0 or
           emotionShow_pub.get_num_connections() == 0 or
           gesturePlay_pub.get_num_connections() == 0 or
           behaviorTalkText_pub.get_num_connections() == 0 or
           behaviorTalkAudio_pub.get_num_connections() == 0):
        rospy.loginfo("waiting for subscriber connections")
        if rospy.get_time() - wtime_begin > 5.0:
            rospy.logerr("Timeout while waiting for subscribers connection!")
            sys.exit()
        rospy.sleep(1)

    # create the service clients
    audioPlay = rospy.ServiceProxy('/qt_robot/audio/play', audio_play)
    speechSay = rospy.ServiceProxy('/qt_robot/speech/say', speech_say)
    gesturePlay = rospy.ServiceProxy('/qt_robot/gesture/play', gesture_play)
    emotionShow = rospy.ServiceProxy('/qt_robot/emotion/show', emotion_show)
    behaviorTalkText = rospy.ServiceProxy('/qt_robot/behavior/talkText', behavior_talk_text)
    behaviorTalkAudio = rospy.ServiceProxy('/qt_robot/behavior/talkAudio', behavior_talk_audio)

    # wait for the services to become available
    rospy.loginfo("waiting for services connections")
    rospy.wait_for_service('/qt_robot/gesture/play')
    rospy.wait_for_service('/qt_robot/emotion/show')
    rospy.wait_for_service('/qt_robot/audio/play')
    rospy.wait_for_service('/qt_robot/speech/say')
    rospy.wait_for_service('/qt_robot/behavior/talkText')
    rospy.wait_for_service('/qt_robot/behavior/talkAudio')

    rospy.loginfo("ready...")
    try:
        callAllSequence()
        publishAllConcurrent()
        # rospy.spin()
    except rospy.ROSInterruptException:
        pass

Wiki: Robots/qtrobot (last edited 2018-12-14 12:08:45 by apaikan)