Lab 2: Mobile Robot Path Tracking Using Odometry 2.12: Introduction To Robotics Fall 2016
Instructions:
1. When your team is done with each task, call a TA to do your check-off.
2. After the lab, zip your files and make a backup using online storage/flash drive.
1 Introduction
In the previous lab we learned how to control the speed of the robot wheels. Today’s goal is to
1. estimate the robot’s change in pose (∆x, ∆y, ∆θ) using the encoders in the wheel motors;
2. move the robot using delta pose (∆x, ∆y, ∆θ) commands.
To do so, we will apply the wheeled-robot kinematics from lecture. The lessons you will learn
in this lab are crucial for navigating your robot in the final project arena. In this handout,
we will refer to the mobile robot platform simply as the robot.
On the implementation side, this lab utilizes two essential software development tools: git [1]
and ROS (Robot Operating System) [2]. Commands to use them are provided and we won’t dive
into details in this lab. You are encouraged to browse the referenced websites to learn about them.
We will work with ROS more in the next lab. This lab also uses object-oriented programming
to structure the Arduino code. If you are not familiar with this, please read these two tutorials
[3, 4].
From now on, we will denote the path to the me212lab2 folder, ~/me212lab2, as LAB2. Remember
that ~ is an alias for the user’s home folder in Ubuntu, which is /home/robot/ in our case.
2.1 Folders and files
The me212lab2 folder contains all the files required for this lab. The overall structure follows
the ROS catkin build system [5] and is shown below. Unimportant files and folders are ignored
for brevity. However, don’t delete them because they are still required for the package to work
properly.
The folder hierarchy helps organize a large project that contains several packages. A package is a
unit of files related to one idea: code for libraries and programs, as well as configuration files.
Now let’s focus on the contents of the package me212_robot, which is for our robot platform.
Some important files are listed below.
• src/controller : the Arduino sketch for the robot controller (controller.ino, with helper.h and helper.cpp).
• scripts/me212_robot.py : reads odometry from the Arduino and publishes it to the ROS network
on the PC.
• launch/viz.launch : a ROS launch file that launches tools to visualize the above messages.
In this lab, you only need to modify helper.cpp and controller.ino, and refer to helper.h
for declarations.
Items 1 and 5 were done in Lab 1, and example code for them is provided as class EncoderMeasurement
and class PIController. You should read their implementations in helper.cpp. You may change
the gains of the controller; make sure that the positive direction of the encoder and motor corresponds
to forward wheel motion. Item 3 is also provided, in class SerialComm. Items 2 and 4 are what
you will complete today.
3 Task 1: Mobile Robot Odometry
Odometry estimates the change in robot pose over time from changes in encoder values. First,
let’s define some variables and constants:
• (x, y, θ): estimated robot pose using odometry relative to the world frame (starting frame),
• φR , φL : the right and left wheel net rotation (positive sign is in the robot forward direction),
• r : the wheel radius (Figure 1 labels the wheel diameter 2r),
• b : half the distance between the two wheels (Figure 1 labels the track width 2b).

Figure 1: Definition of the robot dimensions and robot frame. The robot forward direction matches
the x axis of the robot frame.
Conversion 1 was done in Lab 1, and the code is provided in class EncoderMeasurement. However,
make sure to check which motor model (26 or 53) your robot has, and configure it correctly when
creating an EncoderMeasurement object in controller.ino. For conversion 2, you will implement
it in class RobotPose. Recall from the lectures that we can first compute θ̇ using the following
expression.
θ̇ = (r / (2b)) (φ̇R − φ̇L).        (1)
However, in our robot platform, we can only measure φ’s at discrete timestamps. Thus we rewrite
(1) as (2):
∆θ = (r / (2b)) (∆φR − ∆φL).        (2)
To estimate the current robot orientation θ(t), we accumulate these increments:

θ(t) = θ(t − dt) + ∆θ(t).        (3)

The robot position is then updated using the new orientation:

x(t) = x(t − dt) + (r/2) cos θ(t) (∆φR(t) + ∆φL(t)),
y(t) = y(t − dt) + (r/2) sin θ(t) (∆φR(t) + ∆φL(t)).        (4)
All relevant variables have been defined in class RobotPose; refer to helper.h for their declarations.
You do not need to define your own variables. They are of float type.
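The pose update in equations (2)–(4) can be sketched as follows. This is an illustrative stand-in, not the declared RobotPose interface from helper.h; the member names, the update function, and the dimension values r and b are assumptions for the example.

```cpp
#include <cassert>
#include <cmath>

// Assumed robot dimensions for illustration only; use your robot's
// actual values (Figure 1 labels the wheel diameter 2r and track 2b).
const float r = 0.0254f; // wheel radius [m]
const float b = 0.1f;    // half the distance between wheels [m]

// Minimal stand-in for class RobotPose.
struct RobotPose {
  float X = 0, Y = 0, Th = 0;

  // Update pose from wheel rotation increments dPhiR, dPhiL [rad],
  // implementing equations (2)-(4): first integrate the heading,
  // then advance the position along the new heading.
  void update(float dPhiR, float dPhiL) {
    Th += r / (2 * b) * (dPhiR - dPhiL);      // (2),(3)
    X  += r / 2 * std::cos(Th) * (dPhiR + dPhiL); // (4)
    Y  += r / 2 * std::sin(Th) * (dPhiR + dPhiL); // (4)
  }
};
```

As a sanity check, equal wheel rotations should produce straight forward motion along x with no heading change.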
In order to test your system, the wheel velocities are set to some constants in section 4 of
controller.ino.
pathPlanner.desiredWV_R = 0.2;
pathPlanner.desiredWV_L = 0.2;
Use the ROS visualization tool, rviz, to visualize your robot pose from odometry. Open a
terminal (Ctrl-Alt-T), and enter the following commands without the leading $.
$ roslaunch me212_robot viz.launch
This will connect to Arduino through serial communication and restart the Arduino program. The
program can be stopped by Ctrl-C. Note that every initiation of serial communication will restart
the Arduino program. Then you should see the robot moving in rviz as in Figure 2.
1. Determine which stage of the track your robot is in. We suggest writing an if/else statement
on robotPose.pathDistance. The U trajectory can be divided into 3 stages: 1) a straight line,
2) a semicircle, and 3) a straight line again.
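The stage selection above could be sketched like this. The segment lengths are assumptions read off the trajectory figure (1 m straight, a semicircle of radius 0.25 m, then 1 m straight); check them against your actual path definition.

```cpp
// Illustrative stage selection for the U trajectory based on the
// distance traveled along the path. Thresholds are assumed values:
// 1 m straight, then a semicircle of radius 0.25 m (arc length pi*0.25).
const float kStraightLen = 1.0f;               // [m], assumed
const float kArcLen = 3.14159265f * 0.25f;     // semicircle length [m], assumed

int stageFor(float pathDistance) {
  if (pathDistance < kStraightLen)
    return 1;                                  // first straight line
  else if (pathDistance < kStraightLen + kArcLen)
    return 2;                                  // semicircle
  else
    return 3;                                  // second straight line
}
```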
3. Compute the desired R/L wheel velocities from a specific motion velocity and curvature. It
is up to you to ensure that the resulting motor velocities do not exceed the maximum speed
of the motors. A function for this purpose has been declared for you:
Figure 2: An example view of rviz showing the current robot coordinate frame, map (world) frame,
and the U trajectory that we will implement in the next task. The red, green, and blue bars of
the frame correspond to the x, y, and z axes, respectively. The robot frame is defined at the center of
the two motors, projected to the ground, with the x axis pointing forward, y pointing to the left,
and z pointing up. Initially, the robot frame coincides with the map frame.
[Figure: the U trajectory from Start to Goal: two 1 m straight segments joined by a 0.25 m semicircle.]
You need to complete this function using the following two equations.
κ = (φ̇R − φ̇L) / (b (φ̇R + φ̇L)),        (5)
where κ is the curvature, the inverse of the radius of a circle, as shown in Figure 4.
(φ̇R + φ̇L )
robotVel = |(ẋ, ẏ)| = r . (6)
2
Note that desiredWV_{L/R} = r · φ̇_{L/R}, and that you need to set usePathPlanner to true
in controller.ino to test your navigation policy. For a dry run, make sure the robot wheels are
off the table. If you want the robot to move on the floor, make sure no external wires are
connected to it. Plug the HDMI cable into the onboard screen so that you can still operate the
robot, and hold it by the 80/20 frame when moving it.
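Inverting equations (5) and (6) gives the wheel surface velocities directly: φ̇R + φ̇L = 2·robotVel/r and φ̇R − φ̇L = 2·κ·b·robotVel/r, so desiredWV_R = robotVel·(1 + κb) and desiredWV_L = robotVel·(1 − κb). A sketch of this conversion, including a uniform speed cap, is below; the function name, signature, and the values of b and maxMV are assumptions, not the declaration from helper.h.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

const float b = 0.1f;      // half wheel separation [m]; assumed value
const float maxMV = 0.5f;  // maximum wheel speed [m/s]; assumed value

// Compute the R/L wheel surface velocities (desiredWV = r * phidot)
// from a path velocity and curvature by inverting equations (5),(6).
// If either wheel would exceed maxMV, both are scaled down by the same
// factor, which keeps the curvature unchanged.
void computeWheelVel(float robotVel, float kappa,
                     float &desiredWV_R, float &desiredWV_L) {
  desiredWV_R = robotVel * (1.0f + kappa * b);
  desiredWV_L = robotVel * (1.0f - kappa * b);
  float peak = std::max(std::fabs(desiredWV_R), std::fabs(desiredWV_L));
  if (peak > maxMV) {
    float s = maxMV / peak;
    desiredWV_R *= s;
    desiredWV_L *= s;
  }
}
```

For example, zero curvature yields equal wheel speeds, while κ = 1/0.25 m⁻¹ (the semicircle radius from the figure) speeds up the right wheel and slows the left.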
Figure 4: Curvature of a circle is defined to be the reciprocal of the radius. Image source: Wikipedia.
Question 1 Given the curvature κ and the maximum wheel speed maxMV, what is the maximum
robotVel that you can drive your robot at?
Question 2 Can your robot start and end at the same pose for a figure 8 trajectory?
Figure 5: A three-point turn is for turning a vehicle around in a narrow space. Image source:
Wikipedia.
References
[1] Website of Git. [Online]. Available: https://git-scm.com/