
Programming Nao with the Human Robot Interaction API

This page describes how to get started programming your Nao robot with the Human Robot Interaction API (hri_api).


Contents

1. Tutorials

1.1. Starting hri_api

1.2. Making Nao speak (Python)

1.3. Making Nao gesture (Python)

1.4. Making Nao gaze at people (Python)

1.5. Making Nao speak and gaze at a group of people simultaneously (Python)

1.6. Querying the World for objects Nao can sense (Python)

1.7. Listening to what people say (Python)


1. Tutorials

These tutorials describe how to get started programming human-robot interaction for the Nao humanoid robot. Make sure that Nao is on, connected to the network and in a stable standing pose. All of the examples described in this tutorial are located in the folder nao_hri/scripts.

1.1. Starting hri_api

A number of background services need to be started before you can run your human-robot interaction programs on Nao. First, open a terminal window and start the ROS master:

user@pc:~$ roscore
... logging to /home/user/.ros/log/3a034616-42a2-11e4-969d-5
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://pc:54570/
ros_comm version 1.11.9


SUMMARY
========

PARAMETERS
 * /rosdistro: indigo
 * /rosversion: 1.11.9

...

Then, in a new terminal tab, start the hri_api background services:

user@pc:~$ roslaunch nao_hri nao_interaction_sim.launch
... logging to /home/user/.ros/log/d29dcda4-42ae-11e4-b129-b8ca3a81
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

SUMMARY
========

PARAMETERS
 * /armature_name: Armature
 * /blender_target_controllers: /home/user/catki...
 * /launch_blender/blend_file: /home/user/catki...
 * /launch_blender/python_script: /home/user/catki...
 * /launch_blender/use_game_engine: False
 * /nao_gaze_action_server/action_server_name: gaze
 * /nao_gaze_action_server/axes: yz
...

1.2. Making Nao speak (Python)

Now that the background services have been started, let's learn how to make Nao speak.

1.2.1 The Code

This is an example script that makes Nao speak. You can also find it in the nao_hri package under: nao_hri/scripts/hri_say_examples.py

 1 #!/usr/bin/env python
 2 # license removed for brevity
 3 from nao_hri import Nao; import time
 4  
 5 robot = Nao()
 6
 7 ah1 = robot.say("Hello")
 8 robot.wait(ah1) 
 9
10 robot.say_and_wait("I'm a crazy robot")
11 
12 ah2 = robot.say("A really annoying robot")
13 time.sleep(1)
14 robot.cancel(ah2)

1.2.2 The Code Explained

Now, let's break the code down.

1 #!/usr/bin/env python

Every hri_api script must have this declaration at the top. It tells the operating system to execute the file with the Python interpreter.

3 from nao_hri import Nao

We import the Nao class from the nao_hri package so that we can program the Nao robot.

5 robot = Nao()

This line defines a Nao robot instance.

 7 ah1 = robot.say("Hello")
 8 robot.wait(ah1) 

This section of code makes Nao say "Hello" and then wait until it has finished speaking. The robot.say(text) function is what makes Nao speak. It is asynchronous and returns an action handle that uniquely identifies the speaking action being performed. In line 8, we use the robot.wait(*action_handles) function to wait until Nao has finished speaking. The functions without the "and_wait" suffix are useful for running multiple actions on the robot in parallel.
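
For example, here is a minimal sketch (reusing the Gesture enum introduced in section 1.3) that runs a speaking action and a waving action in parallel and waits for both to finish:

#!/usr/bin/env python
from nao_hri import Nao, Gesture

robot = Nao()

# Start both actions asynchronously; each call returns an action handle
ah_speech = robot.say("Hello there")
ah_wave = robot.gesture(Gesture.WaveLarm)

# Block until both actions have finished
robot.wait(ah_speech, ah_wave)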

10 robot.say_and_wait("I'm a crazy robot")

This line makes Nao say "I'm a crazy robot". The difference from the previous method of making Nao speak is that this function blocks until the sentence has finished.

12 ah2 = robot.say("A really annoying robot")
13 time.sleep(1)
14 robot.cancel(ah2)

This section of code demonstrates how to cancel actions that the robot is performing. You simply pass the action handle returned by an asynchronous function to the robot.cancel(*action_handles) function, which immediately stops the robot from performing the action tied to that handle.

1.3. Making Nao gesture (Python)

Now that we've made Nao speak, let's learn to make him gesture.

1.3.1 The Code

This is an example script that makes Nao gesture. You can also find it in the nao_hri package under: nao_hri/scripts/hri_gesture_examples.py

 1 #!/usr/bin/env python
 2 # license removed for brevity
 3 from nao_hri import Nao, Gesture; import time
 4 
 5 robot = Nao()
 6
 7 robot.gesture_and_wait(Gesture.HandsOnHips)
 8
 9 g1 = robot.gesture(Gesture.MotionLeft)
10 g2 = robot.gesture(Gesture.MotionRight)
11 robot.wait(g1, g2)
12
13 robot.gesture_and_wait(Gesture.WaveLarm, duration=10.0)
14 
15 g3 = robot.gesture(Gesture.LarmDown)
16 g4 = robot.gesture(Gesture.RarmDown)
17 time.sleep(1)
18 robot.cancel(g3, g4)

1.3.2 The Code Explained

Now, let's break the code down.

1 #!/usr/bin/env python

As mentioned previously, every hri_api script must have this declaration at the top; it tells the operating system to execute the file with the Python interpreter.

3 from nao_hri import Nao, Gesture

This time, as well as importing the Nao class, we have also imported the Gesture class. This is an enum listing all of the gestures Nao can perform.

5 robot = Nao()

This line defines a Nao robot instance.

 7 robot.gesture_and_wait(Gesture.HandsOnHips)

This line makes Nao put his hands on his hips and waits until the gesture has finished. This is done with the robot.gesture_and_wait(gesture) function, which takes a Gesture enum value specifying the gesture to perform.

 9 g1 = robot.gesture(Gesture.MotionLeft)
10 g2 = robot.gesture(Gesture.MotionRight)
11 robot.wait(g1, g2)

This section makes Nao raise both of his arms out to either side at the same time, using a combination of the asynchronous robot.gesture(gesture) and robot.wait(*action_handles) functions.

13 robot.gesture_and_wait(Gesture.WaveLarm, duration=10.0)

This line makes Nao wave his left arm for 10 seconds (specified with the duration parameter). In the previous examples, no duration was given, so the gestures were performed with their default durations. The default durations are specified in: nao_hri/src/nao_hri/nao.py
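
As a sketch, the duration parameter can also be combined with the asynchronous style; this assumes robot.gesture(gesture) accepts the same duration keyword as robot.gesture_and_wait:

# Assumption: robot.gesture also accepts the duration keyword
g = robot.gesture(Gesture.WaveLarm, duration=5.0)
s = robot.say("I will wave for five seconds while I speak")
robot.wait(g, s)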

15 g3 = robot.gesture(Gesture.LarmDown)
16 g4 = robot.gesture(Gesture.RarmDown)
17 time.sleep(1)
18 robot.cancel(g3, g4)

Lastly, this section shows how to cancel gestures. In this example, Nao begins to put his arms down by his sides; then, after waiting for a second, we cancel the two running gestures with the robot.cancel(*action_handles) function.

1.4. Making Nao gaze at people (Python)

We now learn how to make Nao gaze at things, in particular at parts of people's bodies.

1.4.1 The Code

This is an example script that makes Nao gaze at different parts of people's bodies. You can also find it in the nao_hri package under: nao_hri/scripts/hri_gaze_examples.py

 1 #!/usr/bin/env python
 2 # license removed for brevity
 3 import time
 4 from hri_api.entities import Person
 5 from nao_hri import Nao
 6
 7 robot = Nao()
 8 
 9 person1 = Person(1)
10 person2 = Person(2)
11 person3 = Person(3)
12
13 g1 = robot.gaze(person1.head)
14 robot.wait(g1)
15
16 robot.gaze_and_wait(person2.torso)
17 robot.gaze_and_wait(person3.head, speed=0.8)
18 
19 g2 = robot.gaze(person1.head)
20 time.sleep(1)
21 robot.cancel(g2)

1.4.2 The Code Explained

Now, let's break the code down.

 4 from hri_api.entities import Person
 5 from nao_hri import Nao

To make Nao interact with people, we need to import a Person class from the hri_api.entities module.

 7 robot = Nao()

As usual we create a Nao robot instance.

 9 person1 = Person(1)
10 person2 = Person(2)
11 person3 = Person(3)

This snippet creates three instances of the Person class. The command roslaunch nao_hri nao_interaction_sim.launch that you ran in section 1.1 started ROS nodes that broadcast simulated coordinates for these three people.

13 g1 = robot.gaze(person1.head)
14 robot.wait(g1)

We now make Nao gaze at the first person's head, using the asynchronous robot.gaze(target) function together with robot.wait(*action_handles).

Lines 16 - 17 use the blocking robot.gaze_and_wait(target) function, which returns once the robot has finished gazing at the specified target. Over the course of the example, Nao gazes at person1's head, then at person2's torso and lastly at person3's head.

As can be seen on line 17, you can also control the speed at which the robot shifts its gaze via the speed parameter. This is a normalized value in the range 0.0 < speed <= 1.0.
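
For instance, a short sketch (reusing person1 from the example above) contrasting a slow and a fast gaze at the same target:

robot.gaze_and_wait(person1.head, speed=0.2)  # slow, deliberate gaze shift
robot.gaze_and_wait(person1.head, speed=1.0)  # as fast as the head can move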

1.5. Making Nao speak and gaze at a group of people simultaneously (Python)

In this section we learn how to make Nao speak and gaze at people simultaneously. This is particularly useful if you have long sentences for the robot to speak.

1.5.1 The Code

This is an example script that makes Nao speak and gaze at a group of people simultaneously. You can also find it in the nao_hri package under: nao_hri/scripts/hri_say_to_examples.py

 1 #!/usr/bin/env python
 2 # license removed for brevity
 3
 4 from hri_api.entities import Person
 5 from nao_hri import Nao
 6 import time
 7
 8 robot = Nao()
 9 person1 = Person(1)
10 people = [person1, Person(2), Person(3)]
11 
12 s1 = robot.say_to('Hello who are you?', person1)
13 robot.wait(s1)
14
15 robot.say_to_and_wait('We choose to go to the moon in this decade, ' 
16                       'not because its easy, but because its hard, '
17                       'because that goal will serve to organize and '
18                       'measure the best of our energies and skills, '
19                       'because that challenge is one that we are '
20                       'willing to accept, one we are unwilling to '
21                       'postpone, and one which we intend to win.', people)
22
23 s2 = robot.say_to('hellooooooo, did you like my speech?', person1)
24 time.sleep(1)
25 robot.cancel(s2)

1.5.2 The Code Explained

Now, let's break the code down.

 8 robot = Nao()
 9 person1 = Person(1)
10 people = [person1, Person(2), Person(3)]

As usual we create an instance of the Nao robot class. We then create a Person and a list of people.

12 s1 = robot.say_to('Hello who are you?', person1)
13 robot.wait(s1)

The say_to function is used to make the robot gaze and speak to a person or a group of people. When there is one person in the audience, the robot first gazes at that person (e.g. person1); when it has made eye contact it speaks (e.g. saying 'Hello who are you?' in this example). This particular example uses a combination of the asynchronous robot.say_to(text, audience) function and the blocking robot.wait(*action_handles) function.

15 robot.say_to_and_wait('We choose to go to the moon in this decade, ' 
16                       'not because its easy, but because its hard, '
17                       'because that goal will serve to organize and '
18                       'measure the best of our energies and skills, '
19                       'because that challenge is one that we are '
20                       'willing to accept, one we are unwilling to '
21                       'postpone, and one which we intend to win.', people)

Lines 15 - 21 illustrate a slightly more complex example, in which we want the robot to speak not just to one person, but to a group of people. This is also accomplished with the robot.say_to(text, audience) or robot.say_to_and_wait(text, audience) functions. The key variable is the audience parameter of the say_to functions, which can be a single Person object, a list of Person objects or a Query that contains Person objects (explained later on). In this case, Nao will initially make eye contact with the person in the people list that is closest to itself. Once Nao has made eye contact with that person, it will start speaking. Whenever Nao reaches a new sentence (denoted by punctuation: ,.?!), it chooses a random person from the list and gazes at them.
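
To summarise, here is a minimal sketch of the audience forms say_to accepts (the Query form is covered in section 1.6):

#!/usr/bin/env python
from hri_api.entities import Person
from nao_hri import Nao

robot = Nao()
person1 = Person(1)
person2 = Person(2)

robot.say_to_and_wait('Hello you', person1)                 # a single Person
robot.say_to_and_wait('Hello you two', [person1, person2])  # a list of Person objects
# A Query that selects Person objects also works; see section 1.6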

1.6. Querying the World for objects Nao can sense (Python)

In the previous examples, we created the Person objects that Nao interacted with. This is perfect for testing human-robot interaction scripts, but when real people interact with Nao, the Person objects and their coordinates need to be created dynamically via the robot's perception systems. The following examples assume that you run two launch files, which differ from those used in the previous examples:

user@pc:~$ roslaunch nao_hri nao_interaction.launch
... logging to /home/user/.ros/log/d29dcda4-42ae-11e4-b129-b8ca3a81
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

SUMMARY
========

PARAMETERS
 * /armature_name: Armature
 * /blender_target_controllers: /home/user/catki...
 * /launch_blender/blend_file: /home/user/catki...
 * /launch_blender/python_script: /home/user/catki...
 * /launch_blender/use_game_engine: False
 * /nao_gaze_action_server/action_server_name: gaze
 * /nao_gaze_action_server/axes: yz
...
user@pc:~$ roslaunch nao_hri nao_sensors.launch

1.6.1 The Code

This example script blocks until at least one person is detected, then finds the person closest to Nao and makes Nao greet and wave at that person. You can also find it in the nao_hri package under: nao_hri/scripts/hri_perception_examples.py

 1 #!/usr/bin/env python
 2 # license removed for brevity
 3 import rospy
 4 from hri_api.entities import Person, World
 5 from hri_api.query import Query
 6 from nao_hri import Nao, Gesture
 7
 8 world = World()
 9 robot = Nao()
10 people = Query(world).select_type(Person)
11 closest_person = people.sort_decreasing(lambda p: p.distance_to(robot)).take(1)
12 
13 rate = rospy.Rate(2)
14 while not rospy.is_shutdown():
15     result = closest_person.execute()
16
17     if len(result) > 0:
18         break
19     rate.sleep()
20
21 ah1 = robot.say_to('Hello I can see you!', closest_person)
22 ah2 = robot.gesture(Gesture.WaveLarm)
23 robot.wait(ah1, ah2)

1.6.2 The Code Explained

Now, let's break the code down.

 3 import rospy
 4 from hri_api.entities import Person, World
 5 from hri_api.query import Query
 6 from nao_hri import Nao, Gesture

In this example we import several modules and classes we haven't seen yet: rospy, which contains Python helper classes and functions for ROS; the World class, which contains the objects detected in the robot's environment; and the Query class, which is used to filter objects in the world.

 8 world = World()

This line creates a World object.

10 people = Query(world).select_type(Person)

This line is the first example of how to write a Query. Queries are similar in concept to Microsoft's LINQ and are a fork of the Python ASQ library. They are built by chaining methods together, which keeps even complex queries readable.

This particular Query selects all of the Person objects from the world. Queries are initialised with the Query(iterable) constructor, which takes an iterable as input (e.g. a list). We then call the method .select_type(cls), which selects objects that are instances of a particular class, in our case, Person objects. Queries use deferred execution, i.e., they are only evaluated when the .execute() method is called.
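
A small sketch of deferred execution, reusing world and Person from the example above:

people = Query(world).select_type(Person)  # builds the query; nothing is evaluated yet
result = people.execute()                  # evaluates the query now
print(len(result))                         # the number of Person objects currently in the world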

11 closest_person = people.sort_decreasing(lambda p: p.distance_to(robot)).take(1)

The next query builds on the previous line by sorting the people by their distance to the robot. This is accomplished with the .sort_decreasing(key) function, which sorts the objects in decreasing order. The ordering is determined by the key parameter, an anonymous inline function (lambda). The lambda we use, lambda p: p.distance_to(robot), takes a person object p as input and returns that person's distance to the robot; these values are used as the keys for ordering the list of people. The last part of the query is the .take(n) function, which takes the first n items from the resulting query, in this case one item (.take(1)).
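
Following the same pattern as the closest_person query, here is a sketch that keeps the two nearest people instead of one:

nearest_two = Query(world).select_type(Person) \
                          .sort_decreasing(lambda p: p.distance_to(robot)) \
                          .take(2)
result = nearest_two.execute()  # a list of up to two Person objects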

13 rate = rospy.Rate(2)
14 while not rospy.is_shutdown():
15     result = closest_person.execute()
16
17     if len(result) > 0:
18         break
19     rate.sleep()

This section of code loops twice a second, executes the closest_person query (line 15) and breaks if the query returns 1 or more items (lines 17-18). In other words, we continue once 1 or more people have been detected by Nao.
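
If you use this pattern often, it can be factored into a small helper function (hypothetical, not part of hri_api):

def wait_for_results(query, min_results=1, hz=2):
    # Poll the query hz times per second until it returns at least
    # min_results items or ROS shuts down
    rate = rospy.Rate(hz)
    while not rospy.is_shutdown():
        result = query.execute()
        if len(result) >= min_results:
            return result
        rate.sleep()
    return []

people_found = wait_for_results(closest_person)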

21 ah1 = robot.say_to('Hello I can see you!', closest_person)
22 ah2 = robot.gesture(Gesture.WaveLarm)
23 robot.wait(ah1, ah2)

Finally, we make Nao gaze at the closest_person and say 'Hello I can see you!' via the robot.say_to(text, audience) function, while simultaneously making Nao wave his left arm with the robot.gesture(gesture) function. We wait for both actions to finish with the robot.wait(*action_handles) function.

1.7. Listening to what people say (Python)

In this last example we look at how to listen to the things people say when they talk to Nao.

1.7.1 The Code

This is an example script that makes Nao say back to you what he just heard. You can also find it in the nao_hri package under: nao_hri/scripts/hri_listen_examples.py

 1 #!/usr/bin/env python
 2 # license removed for brevity
 3 from nao_hri import Nao
 4 import rospy
 5 from threading import RLock
 6
 7 lock = RLock()
 8 robot = Nao()
 9 
10 def i_heard_that(speech):
11     with lock:
12         if speech is not None:
13             robot.say_and_wait(speech)
14         else:
15             robot.say_and_wait("I didn't hear you speak up!")
16
17 robot.register_listen_callback(i_heard_that)
18 rospy.spin()

1.7.2 The Code Explained
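
Now, let's break the code down.

 5 from threading import RLock
 7 lock = RLock()

We import RLock from Python's threading module and create a lock. The listen callback below may be invoked again while a previous invocation is still speaking, so the lock ensures that only one invocation runs at a time.

10 def i_heard_that(speech):
11     with lock:
12         if speech is not None:
13             robot.say_and_wait(speech)
14         else:
15             robot.say_and_wait("I didn't hear you speak up!")

This function is our listen callback. It receives the speech Nao heard as input; if something was recognised, Nao repeats it back, otherwise Nao complains that it didn't hear anything.

17 robot.register_listen_callback(i_heard_that)
18 rospy.spin()

Finally, we register i_heard_that with the robot.register_listen_callback(callback) function, so that it is called whenever Nao hears someone speak, and call rospy.spin() to keep the script running until it is shut down (e.g. with Ctrl-C).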
