Part 2
- Exercise 1: Interpreting Odometry data
- Exercise 2: Processing Odometry data
- Exercise 3: Moving a robot from the command line
- Exercise 4: Creating a velocity controller
- Exercise 5: "Closed-Loop Control"
If it isn't currently running then launch your WSL-ROS environment using the WSL-ROS shortcut in the Windows Start Menu. Once ready this will open up the Windows Terminal and an Ubuntu terminal instance (which we'll refer to as TERMINAL 1).
If you happen to have changed to a different university machine since Part 1 then you may wish to restore the work that you did in the earlier session. You should have run the `rosbackup.sh` script to back up all your work at the end of that session, so you should now be able to restore it by running the following command in TERMINAL 1:
[TERMINAL 1] $ rosrestore.sh
In the terminal enter the following command to launch a simulation of a TurtleBot3 Waffle in an empty world:
[TERMINAL 1] $ roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch
A Gazebo simulation window should open and within this you should see a TurtleBot3 Waffle in empty space:
Recall from the Introducing the Robots Section of this Wiki that the TurtleBot3 Waffles that we are working with here have the following sensors and actuators on-board to allow them to navigate:
- Two independently controlled wheel motors (a differential drive configuration)
- An Inertial Measurement Unit (IMU) to detect motion & orientation
- A 360° laser displacement sensor (LiDAR) to detect its environment
Two types of Velocity Command can be issued to any ROS Robot to make it move:
- Linear Velocity: The velocity at which the robot moves forwards or backwards in one of its axes
- Angular Velocity: The velocity at which the robot rotates about one of its axes
The TurtleBot3's principal axes are defined as follows:
The TurtleBot3 robot has a differential drive configuration, so it can only move linearly in the x axis. In order to move to the left or right, it must first rotate to face the desired direction before moving forward. In addition to this, the robot can only rotate about its z (yaw) axis.
It's also worth noting that the robot has the following maximum velocity limits:
- A maximum linear velocity of 0.26 m/s,
- A maximum angular velocity of 1.82 rad/s.
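Because any command beyond these limits will simply be ignored or saturated by the robot, it's worth guarding against them in your own code. As a rough illustration (the helper name here is our own, not part of any ROS API), requested velocities could be clamped like this:

```python
# Hypothetical helper (not a ROS function): clamp requested velocities
# to the TurtleBot3 Waffle's limits quoted above.
MAX_LINEAR = 0.26   # m/s
MAX_ANGULAR = 1.82  # rad/s

def clamp_cmd_vel(linear_x, angular_z):
    """Return (linear_x, angular_z) clamped to the robot's maximum velocities."""
    lin = max(-MAX_LINEAR, min(MAX_LINEAR, linear_x))
    ang = max(-MAX_ANGULAR, min(MAX_ANGULAR, angular_z))
    return lin, ang
```

You would apply this just before filling in a `Twist` message, so an over-enthusiastic controller never requests more than the hardware can deliver.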
In the previous session you learnt how to list all the topics that are currently active on a ROS system. Open up a new terminal instance (TERMINAL 2) and use what you learnt here to list all of the topics that are active on your ROS system now, as a result of you launching the Gazebo simulation in the step above.
Which topic in the list do you think could be used to control the velocity of the robot? (Refer back to this exercise in Part 1 for a hint!) Use the `rostopic info` command on the topic to find out more about it.
The topic you have identified should use a message of the `geometry_msgs/Twist` type. You will have to send messages of this type to this topic in order to make the robot move. Use the `rosmsg` command as you did earlier to find out more about the format of this message.
As we learnt above, the TurtleBot3 can only generate linear velocity in the x axis and angular velocity in the z axis. As a result, only velocity commands issued to the `linear.x` or `angular.z` parts of this ROS message will have any effect.
Another topic that should have appeared when you ran the `rostopic list` command above is `/odom`. This topic contains Odometry data, which is also essential for robot navigation, acting as a basic feedback signal that allows a robot to approximate its location.
- In TERMINAL 2, use the `rostopic echo` command to display the odometry data currently being published by our simulated robot:

  `[TERMINAL 2] $ rostopic echo -c /odom`

  Expand the terminal window as necessary so that you can see the whole topic message (it starts with `header` and ends with `---`). What does the `-c` option in the command above actually do?
- Now you need to launch a new terminal window, as a new instance, so that you can view it alongside TERMINAL 2. Click the Windows Start Menu button and start typing "windows terminal", then launch the app when it appears in the list. We'll call this one WT(B). Arrange both windows side-by-side so that you can see what's happening in both.
- In WT(B) launch the `keyboard_teleop` node as you did earlier:

  `[WT(B)] $ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch`

- In WT(B) press `A` a couple of times to make the robot rotate on the spot. Observe how the odometry data changes (in TERMINAL 2). Is there anything in the `twist` part of the `/odom` message that corresponds to the `angular vel` that you are setting in WT(B)?
- Now press the `S` key to halt the robot, then press `W` a couple of times to make the robot drive forwards. How does the `twist` part of the message now correspond to the `linear vel` setting in WT(B)?
- Now press `D` a couple of times and your robot should start to move in a circle. What linear and angular velocities are you requesting in WT(B), and how are these represented in the `twist` part of the `/odom` message? What about the `pose` part of the message? How is this data changing as your robot moves in a circular path, and what do you think this tells you?
- Press `S` in WT(B) to halt the robot (but leave the keyboard teleop node running), then press `Ctrl+C` in TERMINAL 2 to shut down the `rostopic echo` node.
- Next, with the robot stationary, use `rosrun` to run a Python node that takes a snapshot of the robot's current odometry data:

  `[TERMINAL 2] $ rosrun com2009_odometry_example robot_start_pose.py`

  Consider the output of this node and what it tells you about what the node is actually doing.
- Now (using the keyboard teleop node in WT(B)) drive your robot back to the origin of its world (where the blue, green and red lines meet).
- Now, from the window containing TERMINAL 2, open a new terminal window (TERMINAL 3) to run another Python node that takes another snapshot of the robot's current odometry data. This node will also compare it to the data obtained by the `robot_start_pose` node that was launched earlier:

  `[TERMINAL 3] $ rosrun com2009_odometry_example robot_end_pose.py`

  The output of this node should provide you with a summary of how the robot's odometry has changed between running the two `com2009_odometry_example` nodes. The `start` and `end` columns provide a summary of the odometry data that was obtained before and after the robot was moved, and the `delta` column shows the difference between the two. Which odometry parameters haven't changed, and is this as you would expect (considering the robot's principal axes as illustrated above)?
- Press `Ctrl+C` in TERMINAL 2 and WT(B) to stop the `robot_start_pose` and `turtlebot3_teleop` nodes, and close down the WT(B) terminal instance.
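The `delta` column in the exercise above is simply a pose difference, with the yaw difference wrapped back into a sensible range. A minimal sketch of that comparison (our own illustration; the actual `robot_end_pose` node may compute it differently):

```python
import math

def pose_delta(start, end):
    """Difference between two (x, y, yaw) poses.

    The yaw difference is wrapped into [-pi, pi) so that, for example,
    a rotation from 350 degrees to 10 degrees reads as +20, not -340.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    dyaw = (end[2] - start[2] + math.pi) % (2 * math.pi) - math.pi
    return dx, dy, dyaw
```

If the robot really did return to its start pose, all three deltas should be close to zero (subject to odometry drift).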
We can learn more about Odometry data by using the `rostopic info` command:

$ rostopic info /odom

This provides information about the type of message used on this topic:

Type: nav_msgs/Odometry

We can find out more about this message type using the `rosmsg info` command:

$ rosmsg info nav_msgs/Odometry

This tells us that the `nav_msgs/Odometry` message contains four base elements:
- header
- child_frame_id
- pose
- twist
`pose` tells us the position and orientation of the robot relative to an arbitrary reference point (typically where the robot was when it was turned on). The pose is determined from:
- Data from the Inertial Measurement Unit (IMU) onboard the OpenCR board
- Data from both the left and right wheel encoders
- An estimation of the distance travelled by the robot from its pre-defined reference point (using dead-reckoning)
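Dead-reckoning just integrates velocity over time to advance a pose estimate. A minimal sketch of one update step for a differential-drive robot (our own illustration, not the TurtleBot3 firmware code):

```python
import math

def dead_reckon_step(x, y, theta, v, w, dt):
    """One dead-reckoning update for a planar differential-drive robot.

    Advances the pose (x, y, theta) given the current linear velocity
    v (m/s), angular velocity w (rad/s), and timestep dt (s).
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta
```

Because each step accumulates small measurement errors, dead-reckoned pose estimates drift over time, which is why odometry only ever *approximates* the robot's location.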
Position data is important for determining the movement of our robot, and from this we can estimate its location in 3-dimensional space.
Orientation is expressed as a quaternion, and needs to be converted into angles (in radians) about the principal axes. Fortunately, there are functions within the ROS `tf` library to do this for us, which we can use in any Python node as follows:

```python
from tf.transformations import euler_from_quaternion

(roll, pitch, yaw) = euler_from_quaternion(
    [orientation.x, orientation.y, orientation.z, orientation.w], 'sxyz'
)
```
Our TurtleBot3 robot can only move in a 2D plane, so its pose can in fact be fully represented by (x, y, θz), where x and y are the 2D coordinates of the robot in the X-Y plane, and θz is the angle of the robot about the z (yaw) axis. You may have noticed this in the exercise above, where the `linear_z`, `theta_x` and `theta_y` values in the `delta` column should all have read `0.0`.
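If you're curious what `euler_from_quaternion` is doing for the yaw component, the closed-form expression can be written out directly (a sketch of the standard conversion, not the `tf` source code):

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Yaw angle (theta_z, in radians) of the quaternion (x, y, z, w).

    This is the standard quaternion-to-Euler formula for the z axis;
    for a planar robot it is the only Euler angle that ever changes.
    """
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
```

For a robot that has turned 90° anti-clockwise, the quaternion is (0, 0, sin(π/4), cos(π/4)) and this function returns π/2.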
`twist` tells us the current linear and angular velocities of the robot, and this data comes directly from the wheel encoders.
All this data is defined in terms of the principal axes illustrated in the figure above.
Earlier you learnt how to create a package and build simple nodes in Python to publish and subscribe to messages on a topic. We'll expand on this now to develop an odometry subscriber.
- Navigate to the `src` directory of the `ros_training` package that you created earlier:

  `[TERMINAL 2] $ roscd ros_training/src`

- The `subscriber.py` code that you used earlier can be used as a template for creating an odometry subscriber now. First, create a new file in your `src` directory (`~/catkin_ws/src/ros_training/src`) called `odom_subscriber.py`:

  `[TERMINAL 2] $ touch odom_subscriber.py`
- In the same way as last time, make this file executable using the Linux `chmod` command.
- Launch Atom (`$ atom .`), open the `odom_subscriber.py` file and copy in the basic subscriber code.
- Now edit the code to subscribe to and print out odometry data to the terminal:
  - You will need to make sure that you are importing the correct message type at the start of your code so that you can work with the Odometry data. Be aware that the `Odometry` message is part of the `nav_msgs` package. If you need help, have a look at this explainer.
  - Your Python node should convert the raw odometry data to a `(x, y, θz)` format using the `euler_from_quaternion` function from the `tf.transformations` library (remember that `θz` is the same as yaw). If you aren't sure how to do this, have a look at the source code for the `com2009_odometry_example` nodes that you used in Exercise 1. Remember that you can navigate to this package using the `roscd` command and then locate the source code contained within it (in the `src` directory).
- Launch your node using `rosrun`. Observe how the output of your node (the formatted odometry data) changes whilst you move the robot around again using the `turtlebot3_teleop` node (do this in TERMINAL 3).
- Stop your `odom_subscriber.py` node in TERMINAL 2 and the `turtlebot3_teleop` node in TERMINAL 3 by entering `Ctrl+C` in each of the terminals.
Note: Make sure that you have stopped the `turtlebot3_teleop` node running in TERMINAL 3 (by entering `Ctrl+C`) before starting this exercise.
We can use the `rostopic pub` command to publish data to a topic from within a terminal, using the command in the following way:
rostopic pub [topic_name] [message_type] [data]
As we discovered earlier, the `/cmd_vel` topic expects linear and angular data, each with an `x`, `y` and `z` component. We can get further help with formatting this message by using the autocomplete functionality within the terminal. Type the following into TERMINAL 3 (copying and pasting won't work):

[TERMINAL 3] $ rostopic pub /cmd_vel geometry_msgs/Twist[SPACE][TAB][TAB]
- Use this to help you enter velocity commands in the terminal. Enter values to make the robot rotate on the spot. Make a note of the command that you used.
- Enter `Ctrl+C` in TERMINAL 3 to stop the message from being published.
- Next, enter a command in TERMINAL 3 to make the robot move in a circle. Again, make a note of the command that you used.
- Enter `Ctrl+C` in TERMINAL 3 to again stop the message from being published.
- Finally, enter a command to stop the TurtleBot3, and make a note of this too.
- Enter `Ctrl+C` in TERMINAL 3 to stop this final message from being published.
You will now create another node to control the motion of your TurtleBot3 by publishing messages to the `/cmd_vel` topic. You created a publisher node in Part 1, and you can use this as a starting point.
- In TERMINAL 2, ensure that you are still located within the `src` folder of your `ros_training` package. You can use `pwd` to check your current working directory, where the output should look like this:

  `/home/student/catkin_ws/src/ros_training/src`

  If you aren't located here, then navigate to this directory using `cd`.
- Create a new file called `move_circle.py`:

  `[TERMINAL 2] $ touch move_circle.py`

  And make this file executable using the `chmod` command.
- Open up this file with Atom to edit it. If you want to, copy and paste the contents of the publisher node from Part 1 into the new `move_circle.py` file to get you started. Then edit the code to achieve the following:
  - Make your TurtleBot3 move in a circle with a path radius of 0.5 m.
  - The Python node needs to publish `Twist` messages to the `/cmd_vel` topic in order to make the TurtleBot3 move. Have a look at this usage example.
  - Remember (as mentioned earlier) that for our robots the maximum linear velocity (`linear.x`) is 0.26 m/s, and the maximum angular velocity (`angular.z`) is 1.82 rad/s.
  - Make sure that you code your `shutdown_function()` correctly so that the robot stops moving when the node is shut down (via `Ctrl+C` in the terminal that launched it).
- Create a launch file to launch this and your `odom_subscriber.py` node simultaneously with a single `roslaunch` command. Refer to the launch file that you created in Part 1 for a reminder on how to do this.
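For the 0.5 m radius requirement in the exercise above, the key relationship is v = r·ω: the linear and angular velocities must be chosen together. A small worked sketch (the function and the choice of 0.2 m/s are our own, not a prescribed solution):

```python
# Velocity limits for the TurtleBot3 Waffle, as quoted earlier in the text.
MAX_LINEAR = 0.26   # m/s
MAX_ANGULAR = 1.82  # rad/s

def circle_velocities(radius, linear=0.2):
    """Return (linear.x, angular.z) values that trace a circle of the
    given radius, checking the result against the robot's limits.

    From v = r * w it follows that w = v / r.
    """
    angular = linear / radius
    if linear > MAX_LINEAR or angular > MAX_ANGULAR:
        raise ValueError("requested circle exceeds the robot's velocity limits")
    return linear, angular
```

With a radius of 0.5 m and a linear velocity of 0.2 m/s, the required angular velocity is 0.4 rad/s, comfortably within both limits. These are the two values your `move_circle.py` node would publish in its `Twist` message.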
You will also recall from the Introducing the Robots Section of this Wiki that the robot here is equipped with a LiDAR sensor, which tells us how far away any obstacles in its environment are. The KUKA iiwa Robot Arm that you will be working with during the challenge over the next two days is also equipped with a LiDAR sensor, so it's relevant for you to know how to deal with the data that this sensor generates and what it all means. In this section we'll discuss this in relation to the TurtleBot3, but the principles are the same and should be transferable to the robot arm system.
We're going to place the robot in a more interesting environment now, so you'll need to make sure that you close down the Gazebo simulation that is currently running. The best way to do this is to go to TERMINAL 1 and enter `Ctrl+C` to close down the Gazebo processes. It may take a little while, but the Gazebo window should close after 30 seconds or so. You should also stop any other nodes that might still be running.
Return to TERMINAL 1 and enter the following to launch a new simulation:
[TERMINAL 1] $ roslaunch turtlebot3_gazebo turtlebot3_world.launch
A new Gazebo simulation should now be launched, with a TurtleBot3 Waffle in a new arena:
In TERMINAL 2 we then need to launch a "Bringup" package for the TurtleBot3, which launches a number of key processes for the robot to make it fully functional:
[TERMINAL 2] $ roslaunch turtlebot3_bringup turtlebot3_remote.launch
In a new terminal instance (TERMINAL 3), enter the following:
[TERMINAL 3] $ rosrun rviz rviz -d `rospack find turtlebot3_description`/rviz/model.rviz
A new window should now open:
This is RViz, which is a ROS tool that allows us to visualise the data being measured by a robot in real-time. The red dots scattered around the robot represent laser displacement data which is measured by the LiDAR sensor. The LiDAR sensor spins continuously, sending out laser pulses as it does so, which are reflected back to the sensor from nearby objects. The time taken for the pulses to return can be used to determine how far away the object that reflected it is. Because the LiDAR sensor spins and performs this process continuously, a full 360° scan of the environment can be made. In this case (because we are working in simulation here) the data represents the objects surrounding the robot in its simulated environment, so you should notice that the red dots produce an outline that resembles the objects in the world that is being simulated in Gazebo.
Next, open up a new terminal instance (TERMINAL 4). Laser displacement data from the LiDAR sensor is published by the robot to the `/scan` topic. We can use the `rostopic info` command to find out more about the nodes that are publishing and subscribing to this topic, as well as the type of message that is being published to it:
[TERMINAL 4] $ rostopic info /scan
Type: sensor_msgs/LaserScan
Publishers:
* /gazebo (http://localhost:#####/)
Subscribers:
* /rviz_##### (http://localhost:#####/)
As we can see from the above, `/scan` messages are of the `sensor_msgs/LaserScan` type, and we can find out more about this message type using the `rosmsg info` command:
[TERMINAL 4]: $ rosmsg info sensor_msgs/LaserScan
std_msgs/Header header
uint32 seq
time stamp
string frame_id
float32 angle_min
float32 angle_max
float32 angle_increment
float32 time_increment
float32 scan_time
float32 range_min
float32 range_max
float32[] ranges
float32[] intensities
`LaserScan` is a standardised ROS message type that any ROS robot can use to publish data obtained from a laser displacement sensor, such as the LiDAR on the TurtleBot3. You can find the full definition of the `sensor_msgs/LaserScan` message here.
`ranges` is an array of `float32` values (we know it's an array of values because of the `[]` after the data type). This is the part of the message containing all the actual distance measurements being obtained by the LiDAR sensor (in meters).
Consider a simplified example here, taken from a TurtleBot3 robot in a much smaller, fully enclosed environment. In this case, the displacement data from the `ranges` array is represented by green squares:
As illustrated in the figure, we can associate each data point of the `ranges` array with an angular position by using the `angle_min`, `angle_max` and `angle_increment` values that are also provided within the `LaserScan` message. We can use the `rostopic echo` command to drill down into these elements of the message specifically and find out what their values are:
$ rostopic echo /scan/angle_min -n1
0.0
---
$ rostopic echo /scan/angle_max -n1
6.28318977356
---
$ rostopic echo /scan/angle_increment -n1
0.0175019223243
---
Compare the values here with the figure above.
Note: The `-n1` option here makes the `rostopic echo` command print only one message. This is appropriate for the message parameters that we are looking at here, because they don't change in real-time (they are constant for this particular sensor).
The `ranges` array contains 360 values in total: a distance measurement at every 1° (an `angle_increment` of 0.0175 radians) around the robot. The first value in the `ranges` array (`ranges[0]`) is the distance to the nearest object directly in front of the robot (i.e. at θ = 0 radians, or `angle_min`). The last value in the `ranges` array (`ranges[359]`) is the distance to the nearest object at 359° (i.e. θ = 6.283 radians, or `angle_max`) from the front of the robot. If, for example, we were to obtain the value at index 65 of the `ranges` array, that is `ranges[65]`, we know that this would represent the distance to the nearest object at an angle of 65° (1.138 radians) from the front of the robot (anti-clockwise).
The `LaserScan` message also contains the parameters `range_min` and `range_max`, which represent the minimum and maximum distances (in meters) that the LiDAR sensor can detect, respectively. You can use the `rostopic echo` command to report these directly too.
Finally, use the `rostopic echo` command again to display the `ranges` portion of the `LaserScan` topic message. Don't use the `-n1` option this time, so that you can see the data changing in real-time, but do use the `-c` option to clear the screen after every message to make things a bit clearer. You might also need to maximise the terminal window so that you can see the full content of the array: with 360 values the array is quite big, but it is bound by square brackets `[]` to illustrate where it starts and ends, and there should be a `---` at the end of each message too, to help you confirm that you are viewing the entire thing.
The main thing you'll notice here is that there's far too much information, updating far too quickly, for it to be of any real use! As you have already seen though, the numbers flying by here are the same values represented by red dots in RViz. Flick over to the RViz screen to have another look at this. As you'll no doubt agree, this is a much more useful way to visualise the `ranges` data, and it illustrates how useful RViz can be for interpreting what your robot can see in real-time.
What you may also notice is several `inf` values scattered around the array. These represent sensor readings that were greater than the distance specified by `range_max`, so the sensor couldn't report a distance measurement in these cases.
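When processing the array in a node of your own (as you'll do in the next exercise), those `inf` readings usually need filtering out first. A sketch of finding the nearest detected object and the index (and therefore bearing) it was seen at; the function is our own illustration, not part of any ROS API:

```python
import math

def nearest_object(ranges):
    """Return (distance, index) of the closest finite reading in a
    LaserScan ranges list, or (None, None) if every reading is inf.

    With one reading per degree, the returned index is also the
    anti-clockwise bearing of the object in degrees from the robot's front.
    """
    finite = [(r, i) for i, r in enumerate(ranges) if math.isfinite(r)]
    if not finite:
        return None, None
    return min(finite)  # tuples compare by distance first
```

A "directly facing the object" condition could then be phrased as the returned index being 0 (or within a degree or two of it).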
Stop all processes running in all active terminals now by entering `Ctrl+C` in each of them.
Using the LiDAR data, together with what we now know about making a robot move, we will construct a new node that makes the robot turn on the spot until it is directly facing an object (a box) placed in its environment. Make sure that you have closed down all ROS processes in all terminals before you start on this.
- Launch the empty world environment again using the following command in TERMINAL 1:

  `[TERMINAL 1] $ roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch`

- In Gazebo, use the "Box" tool in the top toolbar to place a box in front of the robot:
- Use the "Scale Mode" button to resize the box and use the "Translation Mode" button to reposition it as you wish. Then move the box somewhere behind the robot, but close enough that the LiDAR's `range_max` is not exceeded.
- In TERMINAL 2, make sure (once again) that you are still located within the `src` folder of your `ros_training` package (`/home/student/catkin_ws/src/ros_training/src`), or navigate to this directory using `cd` if you aren't.
- Create a new file called `search.py`. Into this, copy the contents of your `move_circle.py` node from Exercise 4 as a starting point.
- You now need to build a subscriber into this node as well. Recall what you did in Exercise 2, adapting the approach here to extract `LaserScan` data from the `/scan` topic this time. You will need to program a callback function for this subscriber, and you might want to have a look at this example of how to approach it.
- Your node should make the robot turn on the spot (slowly) until it detects (from the `LaserScan` data) that it is directly facing the box that you have placed nearby. Use a `shutdown_function()` to make the robot stop and terminate the node once the correct condition is met.
Want to see your code working on a real robot?! If you've managed to get this working in simulation and you'd like to see it in action on a real TurtleBot3 in the lab, then follow the steps below to make a copy of your ROS package and send it across to us!

- Navigate to the Linux Home directory:

  `[TERMINAL 2] $ cd ~`

- Use the following command to make a `.tar` archive of your package:

  `[TERMINAL 2] $ tar -cvf {my name}.tar ~/catkin_ws/src/ros_training`

  (replacing `{my name}` with your name!)
- To locate the file you have just created, open the Home directory in Windows Explorer:

  `[TERMINAL 2] $ explorer.exe .`

- Copy the `.tar` file from here to your desktop, then send it to us!
~~~ End of Part 2 ~~~
Once again, save the work you have done here by running the following script in any idle WSL-ROS terminal instance (in case you need to restore it later):
$ rosbackup.sh
Navigating This Wiki:
← Part 1: Getting to Grips with ROS (and Linux) |
Part 3: Robot Arms and the MoveIt Library →
ROS Training
UK-RAS Manufacturing Robotics Challenge 2021
Tom Howard & Alex Lucas | The University of Sheffield