Home
This page describes how to get started programming your Nao robot with the Human Robot Interaction API (hri_api).
Tutorials
1.1. Starting hri_api
1.2. Programming Nao to speak, gaze and gesture (Python)
1.3. Making Nao speak, gaze and gesture at people simultaneously (Python)
1.4. Querying the World for objects Nao can sense (Python)
1.5. Listening to what people say (Python)
These tutorials describe how to get started programming human-robot interaction for the Nao humanoid robot. Make sure that Nao is on, connected to the network and in a stable standing pose. All of the examples described in this tutorial are located in the folder nao_hri/scripts.
A number of background services need to be started before you can run your human-robot interaction programs on Nao. First, open a terminal window and start the ROS master:
user@pc:~$ roscore
... logging to /home/user/.ros/log/3a034616-42a2-11e4-969d-5
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
started roslaunch server http://pc:54570/
ros_comm version 1.11.9
SUMMARY
========
PARAMETERS
* /rosdistro: indigo
* /rosversion: 1.11.9
...
Then, in a new terminal tab, start the hri_api background services:
user@pc:~$ roslaunch nao_hri nao_interaction.launch
... logging to /home/user/.ros/log/d29dcda4-42ae-11e4-b129-b8ca3a81
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
SUMMARY
========
PARAMETERS
* /armature_name: Armature
* /blender_target_controllers: /home/user/catki...
* /launch_blender/blend_file: /home/user/catki...
* /launch_blender/python_script: /home/user/catki...
* /launch_blender/use_game_engine: False
* /nao_gaze_action_server/action_server_name: gaze
* /nao_gaze_action_server/axes: yz
...
Now that the background services have been started, let's learn how to make Nao speak.
This is an example script that makes Nao speak. You can also find it in the nao_hri package under: nao_hri/scripts/hri_say_examples.py
1 #!/usr/bin/env python
2 # license removed for brevity
3 from nao_hri import Nao
4
5 robot = Nao()
6 robot.say_and_wait("Hello")
7 robot.say_and_wait("I'm a crazy robot")
Now, let's break the code down.
1 #!/usr/bin/env python
Every hri_api script must have this shebang line at the top; it makes sure your script is executed with the Python interpreter.
3 from nao_hri import Nao
This imports the Nao class from the nao_hri package, which we use to program the robot.
5 robot = Nao()
This line defines a Nao robot instance.
6 robot.say_and_wait("Hello")
7 robot.say_and_wait("I'm a crazy robot")
This section of code makes Nao say "Hello" and then "I'm a crazy robot". The function robot.say_and_wait(text) instructs the robot to speak the text contained in the parameter text. This function returns when the robot has finished speaking.
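If you want Nao to deliver several sentences one after another, you can simply call say_and_wait repeatedly; because each call blocks until the speech has finished, the sentences will not overlap. Here is a minimal sketch along those lines, using only the say_and_wait function shown above (the phrase list is just an illustration):

#!/usr/bin/env python
# license removed for brevity
from nao_hri import Nao

robot = Nao()

# Each call blocks until Nao finishes speaking, so the phrases
# are spoken in order without overlapping.
phrases = ["Hello", "I'm a crazy robot", "Goodbye"]
for phrase in phrases:
    robot.say_and_wait(phrase)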
Now that we've made Nao speak, let's learn how to make him gesture.
This is an example script that makes Nao gesture. You can also find it in the nao_hri package under: nao_hri/scripts/hri_gesture_examples.py
1 #!/usr/bin/env python
2 # license removed for brevity
3 from nao_hri import Nao, Gesture
4
5 robot = Nao()
6
7 robot.gesture_and_wait(Gesture.HandsOnHips)
8 robot.gesture_and_wait(Gesture.MotionLeft)
9 robot.gesture_and_wait(Gesture.MotionRight)
10 robot.gesture_and_wait(Gesture.WaveLarm, duration=10.0)
11 robot.gesture_and_wait(Gesture.LarmDown)
12 robot.gesture_and_wait(Gesture.RarmDown)
Now, let's break the code down.
1 #!/usr/bin/env python
As mentioned previously, every hri_api script must have this shebang line at the top; it makes sure your script is executed with the Python interpreter.
3 from nao_hri import Nao, Gesture
This time, in addition to the Nao class, we also import the Gesture class, an enum listing all of the gestures Nao can perform.
5 robot = Nao()
This line defines a Nao robot instance.
7 robot.gesture_and_wait(Gesture.HandsOnHips)
8 robot.gesture_and_wait(Gesture.MotionLeft)
9 robot.gesture_and_wait(Gesture.MotionRight)
This section makes Nao put his hands on his hips, then motion his left hand to the left, and then motion his right hand to the right. The robot.gesture_and_wait(gesture) function takes a Gesture enum value as its parameter, which specifies the gesture to perform.
10 robot.gesture_and_wait(Gesture.WaveLarm, duration=10.0)
This line makes Nao wave using his left arm for 10 seconds. In the previous examples, no duration was specified, so the gestures were performed for their default durations. The default durations are specified in: nao_hri/src/nao_hri/nao.py
11 robot.gesture_and_wait(Gesture.LarmDown)
12 robot.gesture_and_wait(Gesture.RarmDown)
This section makes Nao put both of his arms back down by his sides again.
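Putting these calls together, here is a short sketch that waves Nao's left arm for a custom duration and then lowers both arms. It only uses the gesture_and_wait calls and Gesture values shown above, and the 5 second duration is just an arbitrary example:

#!/usr/bin/env python
# license removed for brevity
from nao_hri import Nao, Gesture

robot = Nao()

# Wave the left arm for 5 seconds instead of the default duration.
robot.gesture_and_wait(Gesture.WaveLarm, duration=5.0)

# Lower both arms back down to Nao's sides.
robot.gesture_and_wait(Gesture.LarmDown)
robot.gesture_and_wait(Gesture.RarmDown)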
We now learn how to make Nao gaze at things, in particular at parts of people's bodies.
This is an example script that makes Nao gaze at different parts of people's bodies. You can also find it in the nao_hri package under: nao_hri/scripts/hri_gaze_examples.py
1 #!/usr/bin/env python
2 # license removed for brevity
3
4 from hri_api.entities import Person
5 from nao_hri import Nao
6
7 robot = Nao()
8
9 person1 = Person(1)
10 person2 = Person(2)
11 person3 = Person(3)
12
13 robot.gaze_and_wait(person1.head)
14 robot.gaze_and_wait(person2.torso)
15 robot.gaze_and_wait(person3.left_hand, speed=0.8)
Now, let's break the code down.
4 from hri_api.entities import Person
5 from nao_hri import Nao
To make Nao interact with people, we import the Person class from the hri_api.entities module in addition to the Nao class.
7 robot = Nao()
As usual we create a Nao robot instance.
9 person1 = Person(1)
10 person2 = Person(2)
11 person3 = Person(3)
This snippet creates three instances of the Person class. When you ran the command roslaunch nao_hri nao_interaction.launch in section 1.1, it started ROS nodes that broadcast simulated coordinates for these three people.
13 robot.gaze_and_wait(person1.head)
14 robot.gaze_and_wait(person2.torso)
15 robot.gaze_and_wait(person3.left_hand, speed=0.8)
We now make Nao gaze at the three people with the robot.gaze_and_wait(target) function. This function returns when the robot has finished gazing at the specified target. In this example, Nao first gazes at person1's head, then at person2's torso and lastly at person3's left hand.
As can be seen on line 15, you can also control the speed at which the robot gazes via the speed parameter. This is a normalized value in the range 0.0 < speed <= 1.0.
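To illustrate the effect of the speed parameter, here is a small sketch that makes Nao glance quickly between two people's heads and then gaze slowly at a torso. It assumes the same simulated people broadcast by nao_interaction.launch, and the speed values are arbitrary examples:

#!/usr/bin/env python
# license removed for brevity
from hri_api.entities import Person
from nao_hri import Nao

robot = Nao()
person1 = Person(1)
person2 = Person(2)

# Quick glances at each person's head (speed is normalized, 0.0 < speed <= 1.0).
robot.gaze_and_wait(person1.head, speed=1.0)
robot.gaze_and_wait(person2.head, speed=1.0)

# A slower, more deliberate gaze at person1's torso.
robot.gaze_and_wait(person1.torso, speed=0.2)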
In this section we learn how to make Nao speak and gaze at people simultaneously. This is particularly useful when you have long sentences for the robot to speak.
This is an example script that makes Nao speak and gaze at a group of people simultaneously. You can also find it in the nao_hri package under: nao_hri/scripts/hri_say_to_examples.py
1 #!/usr/bin/env python
2 # license removed for brevity
3
4 from hri_api.entities import Person
5 from nao_hri import Nao
6
7 robot = Nao()
8 person = Person(1)
9 people = [person, Person(2), Person(3)]
10
11 robot.say_to_and_wait('Hello who are you?', person)
12 robot.say_to_and_wait('We choose to go to the moon in this decade, '
13 'not because its easy, but because its hard, '
14 'because that goal will serve to organize and '
15 'measure the best of our energies and skills, '
16 'because that challenge is one that we are '
17 'willing to accept, one we are unwilling to '
18 'postpone, and one which we intend to win.', people)
Now, let's break the code down.