
arXiv:2302.04742v1 [cs.RO] 9 Feb 2023

A Vision-Based Algorithm for a Path Following Problem

Mario Terlizzi1, Giuseppe Silano2, Luigi Russo1, Muhammad Aatif1, Amin Basiri1, Valerio Mariani1, Luigi Iannelli1, and Luigi Glielmo1

Abstract— A novel prize-winner algorithm designed for a path following problem within the Unmanned Aerial Vehicle (UAV) field is presented in this paper. The proposed approach exploits the advantages offered by the pure pursuit algorithm to set up an intuitive and simple control framework. A path for a quad-rotor UAV is obtained from downward-facing camera images by implementing an Image-Based Visual Servoing (IBVS) approach. Numerical simulations in MATLAB® together with the MathWorks™ Virtual Reality (VR) toolbox demonstrate the validity and the effectiveness of the proposed solution. The code is released as open-source, making it possible to go through any part of the system and to replicate the obtained results.

Fig. 1: Parrot Mambo quad-rotor [18].
Index Terms— Path following, UAV, multi-rotor, virtual reality, image processing

I. INTRODUCTION

Path following is a relevant application problem within the Unmanned Aerial Vehicle (UAV) field. For instance, in precision agriculture scenarios having good path following algorithms is a fundamental requirement to preserve high productivity rates and plant growth [1]. In civilian applications, monitoring of power lines can be encoded as a path following problem between several target regions that need to be inspected [2]. Whatever the field of application is, drones have to follow a path to accomplish the mission specifications safely and successfully.

Looking at the path following problem, it can be divided into two parts: detection and path following [3], [4]. As regards path detection, the Hough transform [5] and its further developments [6], [7] are considered the most valuable solutions in the literature to cope with the task. However, such a transformation demands a high computational effort, making it hard to run on-board the aircraft when battery and processing constraints are tight. These requirements are increasingly stringent when considering Micro Aerial Vehicles (MAVs), where the sensor equipment and the vehicle dimensions are minimal. On the other hand, even when the computational burden is at a minimum, e.g., in the lightweight machine learning solutions that have been recently proposed to tackle the problem [8], [9], the time required to set up the algorithms makes them difficult to apply in real-world applications. For all such reasons, it is of interest to have low computational intensity algorithms, possibly without any prior knowledge of the surrounding environment, able to provide references to the path follower in a given time window.

Moving to the path following level, most state-of-the-art solutions [10], [11] rely on nonlinear guidance law [12], vector field [13], and pure pursuit [14] algorithms due to their simple implementation and ease of use. Although the choice of path planner is application sensitive, general considerations can be provided. The performance of a nonlinear guidance law degrades as the target acceleration changes rapidly, introducing a non-negligible delay in the trajectory generation; an adequate knowledge of the target velocity and acceleration is required to avoid instability issues [12]. On the other hand, a vector field solution prevents such oscillation problems, but it is inherently characterized by a high computational effort [13]. Besides, a pure pursuit approach is a suitable solution when tracking error and computational effort are critical: the position of a look-ahead point is set up at the beginning of the mission, and then updated at each step following some tracking criteria [15]–[17]. The objective is to reduce the distance between the current position and the look-ahead position.

In this paper, we propose a novel prize-winner algorithm designed to deal with the path following problem in the context of the IFAC2020 MathWorks Minidrone competition [19]. The framework combines the advantages provided by the pure pursuit algorithm and some simple image processing to detect and to track a pre-established path. The lightweight design and ease of implementation of the proposed solution allow its deployment on low computational capacity MAVs, such as the Parrot Mambo [18] (see Figure 1), considered as a testbed for the application. Numerical simulations carried out in MATLAB together with the MathWorks Virtual Reality (VR) toolbox [20] show the validity and the effectiveness of the proposed approach. Moreover, the code is provided as open-source [21], making it possible to go through any part of the system and to replicate the obtained results.

The paper is organized as follows. Section II presents the problem, while in Sec. III the vision-based path following algorithm is described. Numerical simulations are reported in Sec. IV. Finally, conclusions are drawn in Sec. V.

This project was partially funded by the ECSEL Joint Undertaking (JU) research and innovation programmes AFarCloud and COMP4DRONES under grant agreements no. 783221 and no. 826610, respectively, and by the European Union's Horizon 2020 research and innovation programme AERIAL-CORE under grant agreement no. 871479. The JU receives support from the European Union's Horizon 2020 research and innovation programme and Spain, Germany, Austria, Portugal, Sweden, Finland, Czech Republic, Poland, Italy, Latvia, Greece, and Norway.

1 Mario Terlizzi, Luigi Russo, Muhammad Aatif, Amin Basiri, Valerio Mariani, Luigi Iannelli, and Luigi Glielmo are with the Department of Engineering, University of Sannio in Benevento, Benevento, Italy (email: {mterlizzi, luirusso, maatif, basiri, vmariani, luiannel, glielmo}@unisannio.it).

2 Giuseppe Silano is with the Faculty of Electrical Engineering, Czech Technical University in Prague, Czech Republic (email: giuseppe.silano@fel.cvut.cz).
Fig. 2: Snapshot extracted from the virtual scenario. A dashed circle is used to indicate the drone position.

Fig. 3: An illustrative example of how the proposed vision-based path following algorithm works. The red point d represents the drone position, while the orange point w depicts the VTP.

Fig. 4: Control system architecture. From left to right: the image processing (Sec. III-A), path planner (Sec. III-B), controller, and drone blocks. The IPS receives the camera frame IMG and outputs the errors ex and ey to the path planner, which provides the references xw and yw to the controller; the controller, in turn, computes the commands uT, uϕ, uϑ, and uψ for the drone. The controller and drone blocks are provided by the competition organizers.

II. PROBLEM DESCRIPTION

The work presented here finds an application within the IFAC2020 MathWorks Minidrone competition [19], where the use of a model-based design approach is the aim of the specific contest. Path error and mission time are used as evaluation metrics for the algorithm. The whole process is the following: a quad-rotor UAV follows a pre-established red path by using its downward-facing camera to get feedback from the environment. Images are updated according to the position and orientation of the vehicle simulated in the MATLAB VR world. No prior knowledge of the path and the surrounding scenario is given. The drone takes off and starts its motion looking at the path, and the mission stops with the recognition of an end-marker. At that time, the drone lands, staying within the delimited area. Figure 2 shows the considered scenario.

III. VISION-BASED PATH FOLLOWING ALGORITHM

The vision-based path following algorithm combines the advantages offered by the pure pursuit algorithm [22] with those of an easy image processing system to cope with the task. The algorithm starts by selecting a target position ahead of the vehicle that has to be reached, typically on a path. The framework is based on the following operations: (i) given the actual position d = (xd, yd)⊤ ∈ R² where the UAV is located, a Virtual Target Point (VTP) is set over the track at w = (xw, yw)⊤ ∈ R²; then, (ii) the quad-rotor is commanded to reach the VTP along a straight line (essentially it is the pure pursuit algorithm with a curvature of infinite radius) [22], i.e., moving the vehicle from its current position to the goal position1. In Figure 3 an illustrative example of how the algorithm works is depicted.

Contrary to the pure pursuit algorithm, the proposed approach exploits the intrinsic characteristics of multi-rotor UAV motion: differently from ground vehicles with steering wheels, drones can follow a path without modifying their heading. Such an assumption allows reducing the time to accomplish the task by removing the control of the heading from the purposes of the path follower.

In Figure 4 the whole control scheme architecture is reported. The algorithm is mainly divided into two parts: (i) the Image Processing System (IPS) deals with extracting the red path from the camera images, providing the errors along the x- and y-axis of the camera frame between the current drone position and the VTP point, and recognizing the End-Marker for the landing operations; while (ii) the Path Planner (PP) figures out the path following problem by computing the new position w of the drone in the world frame [20, Sec. V], implementing an Image-Based Visual Servoing (IBVS) scheme. The control algorithm computes the commands uT, uϕ, uϑ, and uψ that should be given to the drone in order to update its position and orientation in accordance with the PP references.

The overall mission is divided into four parts: Take-off, Following, End-Marker, and Landing. A decision-making process has been implemented to achieve the competition objectives, triggering the system from one state to another, as depicted in Figure 5. For each frame, the IPS accesses the system status and plans the next action (i.e., landing, following, etc.). The drone starts taking off from its initial position looking at the path. Once the vehicle reaches the hovering position, the IPS detects the path and the state machine enters the Following state, hence the path following starts. As soon as the IPS detects the End-Marker, the state machine exits from the Following state and goes into the End-Marker state. At this stage the mission stops, and the drone starts the landing. In the following subsections the implementation of the image processing system and path planner modules is detailed.

1 The quad-rotor is assumed to fly at a fixed height along the entire mission.
Fig. 5: State machine implemented. State 1: Take-off. State 2: Following. State 3: End-Marker. State 4: Landing. Transitions are triggered by the Hovering, Flag VTP, Flag marker, and Centered conditions described in the text.
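For illustration, the decision logic of Figure 5 can be rendered as a small MATLAB function. This is a sketch of the state machine described above, not the competition code [21]; the function name and the hovering/centered flags are hypothetical placeholders for the corresponding conditions.

% Minimal sketch of the Fig. 5 state machine (hypothetical names).
% States: 1 Take-off, 2 Following, 3 End-Marker, 4 Landing.
function state = updateState(state, hovering, flagVTP, flagMarker, centered)
    switch state
        case 1  % Take-off: wait for the hovering position and a detected path
            if hovering && flagVTP
                state = 2;
            end
        case 2  % Following: leave when only the End-Marker is visible
            if ~flagVTP && flagMarker
                state = 3;
            end
        case 3  % End-Marker: descend once centered over the marker
            if centered
                state = 4;
            end
        case 4  % Landing: terminal state
    end
end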
A. Image processing system

Starting from the camera frames, the Image Processing System takes care of separating the features of the pre-established path from those of the environment.

The IPS receives frames of width W and height H from the camera sensor every TIPS = 0.2 s, i.e., the camera sampling time. The image format is RGB with 8 bits for each color channel. The path is 0.01 m in width, while the landing marker is circular with a diameter of 0.02 m. The path color is red, and this information is exploited in all the elaborations to filter out the background scenario. The procedure consists of the following steps: first, the RGB frame is converted into an intensity-level frame representation as follows

F(n, m) = f_R(n, m) - \frac{f_G(n, m)}{G_G} - \frac{f_B(n, m)}{G_B},   (1)

where the pair (n, m) represents the pixel located at row n ∈ {1, 2, ..., H} and column m ∈ {1, 2, ..., W} of the image frame and f_i, with i ∈ {R, G, B}, provides the intensity level representation of the corresponding red, green, and blue channels. A heuristic approach was used to tune the G_G, G_B ≥ 1 parameter values. These parameters help to detect the pixels belonging to the path. Further, a binarization process based on a threshold value K_T refines the result, removing artifacts from the elaboration. The binarized frame can be described by the binary function F_bin : (n, m) → {0, 1}, whose output is one when the pixel belongs to the path and zero otherwise. Finally, an erosion operation is performed through a square kernel to shrink and regularize the binarized frame. In Figure 6 the overall process is reported for a single sample frame.

Fig. 6: Original frame (upper left), converted and binarized frame (upper right), and eroded frame (lower).
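To make the conversion–binarization–erosion chain concrete, a minimal MATLAB sketch is reported below. It is not the released competition code [21]: the input file name is a placeholder and the 3 × 3 kernel size is an assumption, since the paper only states that the kernel is square.

% Sketch of the IPS preprocessing: Eq. (1), K_T binarization, square-kernel erosion.
GG = 2; GB = 2; KT = 150;                 % channel gains and threshold (Table I)
img = double(imread('frame.png'));        % H x W x 3 RGB frame, 8 bit (placeholder file)

% Eq. (1): emphasize the red path against the background
F = img(:,:,1) - img(:,:,2)/GG - img(:,:,3)/GB;

% Binarization: Fbin(n,m) = 1 when the pixel is assigned to the path
Fbin = F > KT;

% Erosion with a square kernel to shrink and regularize the mask
Fer = imerode(Fbin, strel('square', 3));  % kernel size assumed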
Then, the obtained reference path is used in a twofold way: (i) to identify a new VTP belonging to the track; (ii) to detect the landing marker. The two tasks are described in the pseudocode reported in Algorithm 1.
Algorithm 1 Image Processing System
1: IMG ← channelConv(IMG)
2: IMG ← binarization(IMG)
3: IMG ← erosion(IMG)
4: Flag VTP ← detectTrack(IMG)
5: Flag marker ← detectMarker(IMG)
6: if Flag VTP then
7:   xVTP, yVTP ← vtp(frame)
8:   ex ← xVTP − xCoM
9:   ey ← yVTP − yCoM
10: else
11:   if Flag marker then
12:     xMARK, yMARK ← cgMarker(frame)
13:     ex ← xMARK − xCoM
14:     ey ← yMARK − yCoM
15: return ex, ey, Flag VTP, Flag marker

Looking at the algorithm, the first three functions (i.e., channelConv, binarization, and erosion) take care of extracting the path information from the frame. Then, the detectTrack and detectMarker functions deal with raising a flag when the path (Flag VTP) or the End-Marker (Flag marker) is detected. The path following algorithm starts with the IPS, which computes the errors (ex and ey) between the drone position and the VTP point for the PP by using a circular arc mask centered at the drone Center of Mass (CoM)2 with thickness Rmax − Rmin3.

In Figure 7, the arc mask considering the VTP position at time tk is depicted, where tk denotes the k-th element of the time interval vector defined as t = (0, TIPS, ..., N TIPS)⊤ ∈ R^{N+1}, with k ∈ N0. The orientation angle ϑ = arctan2(xVTP, yVTP) is calculated with respect to the frame coordinates, where the arctan2 function is the four-quadrant inverse of the tangent function. A portion Θ of the arc mask is established by taking into account the previous VTP's orientation. In particular, we set up two semi-arcs with width Θ/2, namely the Field of View (FoV), in counter-clockwise and clockwise directions from ϑ. Then, the arc mask is applied to the eroded image, obtaining the VTP point at tk+1. The function vtp calculates xVTP, yVTP, and ϑ, which represent the frame coordinates and angle orientation of the VTP at tk+1, respectively. Subsequently, the corresponding errors with respect to the center of mass, i.e., ex and ey, are computed inside the frame coordinates. Finally, the Flag VTP and the ex and ey values are provided as input to the PP at each TIPS. Figure 8 shows the result of the entire process.

Fig. 7: Arc mask. The drone position (red), the previous VTP (green), and the pre-established path to follow (purple) are reported.

Fig. 8: Frame after the application of the arc mask (left). Extracted pixels belonging to the path (right).

2 The CoM is assumed to be at the center of the reference frame, i.e., xCoM = H/2 and yCoM = W/2.
3 Rmax and Rmin are the outer and inner radii of the arc mask, respectively.
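The arc-mask search itself can be sketched in MATLAB as follows, using the Table I geometry. This is an illustrative reconstruction, not the released code [21]: Fer is the eroded frame from the previous sketch, thPrev is the previous VTP angle, and taking the centroid of the masked path pixels as the VTP is an assumption.

% Sketch of the arc-mask VTP search: the VTP is taken as the centroid of the
% eroded path pixels inside an annulus sector around the previous heading.
Rmax = 28; Rmin = 26; Theta = 2.3;        % pixels and rad (Table I)
thPrev = 0;                               % previous VTP angle (set at take-off)
[H, W] = size(Fer);
xCoM = H/2; yCoM = W/2;                   % CoM at the frame center (footnote 2)

[mCol, nRow] = meshgrid(1:W, 1:H);        % column (m) and row (n) indices
dx = nRow - xCoM; dy = mCol - yCoM;       % offsets from the CoM
r = hypot(dx, dy);
ang = atan2(dx, dy);                      % angle convention of Sec. III-A

% Annulus sector of thickness Rmax - Rmin and total width Theta around thPrev
dAng = atan2(sin(ang - thPrev), cos(ang - thPrev));   % wrapped angular distance
mask = (r >= Rmin) & (r <= Rmax) & (abs(dAng) <= Theta/2);

cand = Fer & mask;                        % path pixels seen through the arc mask
flagVTP = any(cand(:));
if flagVTP
    xVTP = mean(nRow(cand)); yVTP = mean(mCol(cand));
    ex = xVTP - xCoM; ey = yVTP - yCoM;   % errors handed to the path planner
    thPrev = atan2(ex, ey);               % FoV orientation for the next frame
end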
It is worth noticing that when the landing marker is detected and no other VTP point is found in the frame, the IPS triggers the state machine into the End-Marker state. Here, the new main task of the IPS is to obtain the position of the End-Marker within the frame coordinates. An additional erosion process is performed by using a circular kernel, as depicted in Figure 9.

Fig. 9: Original (left) and eroded (right) frames of the End-Marker are reported.

B. Path planner

The Path Planner is designed to compute the position of the VTP point w = (xw, yw)⊤, maintaining a constant altitude (zH) while following the path. Roughly speaking, the PP computes the spatial coordinates xw and yw trying to reduce the errors, i.e., ex and ey, between the drone position and the VTP. These values are later used by the drone controller to tune the command signals uT, uϕ, uϑ, and uψ, as described in Figure 4. The proposed path planner is based on Proportional-Integral control loops. As a common rule in a cascade structure, the inner loop, i.e., the PP, is regulated at a rate faster than the outer loop, i.e., the IPS. In our case, the PP runs at 200 Hz (TPP = 5 × 10−3 s) while the IPS runs at 2 Hz (TIPS = 0.2 s). This is a standard solution in the literature for quad-rotor control design [23].
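The rate separation can be visualized with a toy loop in plain MATLAB, where placeholder errors stand in for the image pipeline; each camera frame is reused for TIPS/TPP = 40 planner steps.

% Toy sketch of the cascade timing: the inner PP loop (200 Hz) reuses the
% latest IPS errors, refreshed by the outer loop only every TIPS (2 Hz).
TPP = 5e-3; TIPS = 0.2; alpha = 0.05;
ratio = round(TIPS/TPP);                  % 40 PP steps per camera frame
xw = 0; yw = 0; ex = 0; ey = 0;
for k = 0:round(2/TPP)                    % 2 s excerpt of the mission
    if mod(k, ratio) == 0                 % outer loop: new errors from the IPS
        ex = 27*sin(1.0); ey = 27*cos(1.0);   % placeholder values on the arc mask
    end
    xw = xw + alpha*ex;                   % inner loop: Algorithm 2, Following branch
    yw = yw + alpha*ey;
end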
As described in Sec. III, the path following stops with the detection of the End-Marker. At that time, the IPS implements a toggle-switch behavior, raising the Flag marker flag while holding low the Flag VTP flag. This mutually separates the Following and Landing phases, avoiding instability issues. The pseudocode of the proposed algorithm is reported in Algorithm 2, with parameter values detailed in Table I.

In Appendix I, we show how α can be set to control the velocity of the vehicle along the entire mission. Therefore, the proposed vision-based path following algorithm makes it possible not only to generate the spatial coordinates xw and yw using an IBVS scheme but also to set the velocity during the entire mission.
Algorithm 2 Path Planner
1: ex, ey, Flag VTP, Flag marker
2: if Flag VTP then
3:   xk+1 ← xk + α ex
4:   yk+1 ← yk + α ey
5:   zk+1 ← zH
6: if Flag marker then
7:   if (ex = 0 ∧ ey = 0) then
8:     xk+1 ← xk
9:     yk+1 ← yk
10:    zk+1 ← 0
11:  else
12:    xk+1 ← xk + β ex
13:    yk+1 ← yk + β ey
14:    zk+1 ← zH
15: xw ← xk+1, yw ← yk+1, zw ← zk+1
16: return xw, yw, zw
TABLE I: Parameter values.

  Sym                           Value
  Sampling time TPP             5 × 10−3 s
  Sampling time TIPS            0.2 s
  PP constant β                 18 × 10−3 m pixel−1
  Frame height H                120 pixel
  Frame width W                 160 pixel
  Outer radius arc mask Rmax    28 pixel
  Inner radius arc mask Rmin    26 pixel
  IPS constant GB               2
  IPS constant GG               2
  IPS threshold KT              150
  FoV arc mask Θ                2.3 rad
  Drone height zH               1 m
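Putting Algorithm 2 and the Table I constants together, one planner step can be sketched in MATLAB as below. This is an illustrative reconstruction rather than the released code [21]; the exact-zero centering test is kept as written in Algorithm 2, and, because of the toggle-switch behavior described in Sec. III-B, Flag VTP and Flag marker are never raised together, so the two branches are written as an if/elseif chain.

% Sketch of one Algorithm 2 step: proportional updates toward the VTP
% (Following) or the End-Marker (Landing approach).
function [xw, yw, zw] = pathPlannerStep(xk, yk, ex, ey, flagVTP, flagMarker, alpha)
    beta = 18e-3;                         % PP constant, m/pixel (Table I)
    zH = 1;                               % flight height, m (Table I)
    xw = xk; yw = yk; zw = zH;            % hold position by default
    if flagVTP                            % Following: move toward the VTP
        xw = xk + alpha*ex;
        yw = yk + alpha*ey;
    elseif flagMarker
        if ex == 0 && ey == 0             % centered over the End-Marker: land
            zw = 0;
        else                              % keep centering over the marker
            xw = xk + beta*ex;
            yw = yk + beta*ey;
        end
    end
end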
IV. NUMERICAL RESULTS

To demonstrate the validity and effectiveness of the proposed framework, numerical simulations have been carried out by using the 2019b release of MATLAB equipped with the MathWorks Virtual Reality toolbox [24] and the Parrot support package for Simulink [18]. The video available at [25] illustrates in a direct way how the system works, i.e., the ability of the quad-rotor UAV to follow the pre-established red path and to land on the End-Marker. In addition, the video shows the behavior of the IPS and PP, which never lose the path during the entire mission.

In Figure 10 a comparison of the system performance for various values of α is reported. As can be seen from the plots, the larger α is, the lower the mission time (Ts) is. On the other hand, the lower the mission time is, the greater the path error is. Looking at the zoom plot (see Figure 10d) it is even clearer how the system performance degrades with increasing α, and these effects are all the more evident where the path is angular. For the considered scenario, a heuristic approach was used to tune the α and β parameter values.

Fig. 10: Trajectory plots. The desired and the drone paths for various values of α are reported: (a) α = 0.05, Ts = 30 s; (b) α = 0.04, Ts = 34 s; (c) α = 0.03, Ts = 47 s; (d) zoom comparing the considered α values. The mission time Ts is also reported.

Figure 11 depicts the drone velocities vx and vy along the x- and y-axis, respectively, and the norm of the drone velocity vD. As described in Sec. III-B and detailed in Appendix I, the norm of the drone velocity remains approximately constant while following the path. The presence of spikes might be due to the coupling effects of the drone xy dynamics, even though the xw and yw references have not been modified yet (see Figure 10). Such coupling effects are probably caused by the asymmetric positioning of the rotors with respect to the principal axis and the effect of the discrete image pixelization.

Fig. 11: Velocity plot. The components vx and vy and the norm vD of the drone velocity [m s−1] over time [s].
V. CONCLUSION

In this paper, a prize-winner algorithm designed for a path following problem within the IFAC2020 MathWorks Minidrone Competition has been presented. In particular, a lightweight and easy-to-implement solution was set up to generate the spatial coordinates of the VTP point and to fulfill the competition requirements successfully. Numerical simulations carried out in MATLAB together with the MathWorks Virtual Reality toolbox and the Parrot support package for Simulink demonstrated the validity and the effectiveness of the proposed approach. The software has been released as open-source, making it possible to go through any part of the system and to replicate the obtained results. Future work will include the integration of more challenging features, such as obstacle avoidance and 3-D reference generation, and lead to field experiments. Furthermore, the effects of the drone dynamics on the performance of the vision-based system will also be explored.

REFERENCES

[1] B. Maik and D. E. Pignaton, "A UAV guidance system using crop row detection and line follower algorithms," Journal of Intelligent & Robotic Systems, pp. 1–17, 2019.
[2] G. Silano, T. Baca, R. Penicka, D. Liuzza, and M. Saska, "Power Line Inspection Tasks with Multi-Aerial Robot Systems via Signal Temporal Logic Specifications," IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 4169–4176, 2021.
[3] B. Dahroug, J.-A. Séon, A. Oulmas, T. Xu, B. Tamadazte, N. Andreff, and S. Régnier, "Some Examples of Path Following in Microrobotics," in International Conference on Manipulation, Automation and Robotics at Small Scales, 2018, pp. 1–6.
[4] M. A. Rafique and A. F. Lynch, "Output-Feedback Image-Based Visual Servoing for Multirotor Unmanned Aerial Vehicle Line Following," IEEE Transactions on Aerospace and Electronic Systems, vol. 56, no. 4, pp. 3182–3196, 2020.
[5] R. O. Duda and P. E. Hart, "Use of the Hough transformation to detect lines and curves in pictures," Communications of the ACM, vol. 15, no. 1, pp. 11–15, 1972.
[6] D. Dagao, X. Meng, M. Qian, H. Zhongming, and W. Yueliang, "An improved Hough transform for line detection," in 2010 International Conference on Computer Application and System Modeling, vol. 2, 2010, pp. V2–354.
[7] S. Du, B. J. van Wyk, C. Tu, and X. Zhang, "An Improved Hough Transform Neighborhood Map for Straight Line Segments," IEEE Transactions on Image Processing, vol. 19, no. 3, pp. 573–585, 2010.
[8] V. Nhan, J. Robert, and R. Davide, "LS-Net: Fast Single-Shot Line-Segment Detector," Machine Vision and Applications, vol. 12, no. 32, 2020.
[9] J. Tang, S. Li, and P. Liu, "A review of lane detection methods based on deep learning," Pattern Recognition, vol. 111, pp. 1–15, 2021.
[10] P. Sujit, S. Saripalli, and J. B. Sousa, "An evaluation of UAV path following algorithms," in 2013 European Control Conference, 2013, pp. 3332–3337.
[11] G. V. Pelizer, N. B. Da Silva, and K. R. Branco, "Comparison of 3d path-following algorithms for unmanned aerial vehicles," in 2017 International Conference on Unmanned Aircraft Systems, 2017, pp. 498–505.
[12] S. Keshmiri, A. R. Kim, D. Shukla, A. Blevins, and M. Ewing, "Flight Test Validation of Collision and Obstacle Avoidance in Fixed-Wing UASs with High Speeds Using Morphing Potential Field," in 2018 International Conference on Unmanned Aircraft Systems, 2018, pp. 589–598.
[13] T. Tuttle, T. T. Moleski, and J. Wilhelm, "Multi-Rotor Path-following Performance using Vector Field Guidance and Velocity Control," in AIAA Scitech 2021 Forum, 2021.
[14] A. S. Baqir and A. A. Ammar, "Navigation of Mini Unmanned Aerial Vehicle in Unknown Environment," IOP Conference Series: Materials Science and Engineering, vol. 745, 2020.
[15] A. Gautam, P. B. Sujit, and S. Saripalli, "Application of guidance laws to quadrotor landing," in 2015 International Conference on Unmanned Aircraft Systems, 2015, pp. 372–379.
[16] D. M. Xavier, B. F. S. Natassya, and R. Branco Kalinka, "Path-following algorithms comparison using Software-in-the-Loop simulations for UAVs," in 2019 IEEE Symposium on Computers and Communications, 2019, pp. 1216–1221.
[17] G. Silano, P. Oppido, and L. Iannelli, "Software-in-the-loop simulation for improving flight control system design: a quadrotor case study," in IEEE International Conference on Systems, Man and Cybernetics, 2019, pp. 466–471.
[18] MathWorks, "Simulink Support Package for Parrot Minidrones." [Online]. Available: https://www.mathworks.com/matlabcentral/fileexchange/63318-simulink-support-package-for-parrot-minidrones
[19] MathWorks, "MathWorks Minidrone Competition IFAC20." [Online]. Available: https://it.mathworks.com/academia/student-competitions/minidrones/ifac-2020.html
[20] G. Silano and L. Iannelli, "MAT-Fly: An Educational Platform for Simulating Unmanned Aerial Vehicles Aimed to Detect and Track Moving Objects," IEEE Access, vol. 9, pp. 39333–39343, 2021.
[21] M. Terlizzi, "Vision Based Pure Pursuing Algorithm," GitHub repository. [Online]. Available: https://github.com/mar4945/Vision-Based-Pure-Pursuing-Algorithm
[22] R. C. Coulter, "Implementation of the pure pursuit path tracking algorithm," Carnegie Mellon University Robotics Institute, Tech. Rep., 1992. [Online]. Available: http://www.enseignement.polytechnique.fr/profs/informatique/Eric.Goubault/MRIS/coulter_r_craig_1992_1.pdf
[23] T. N. Dief and S. Yoshida, "Review: Modeling and Classical Controller Of Quad-rotor," International Journal of Computer Science and Information Technology & Security, vol. 5, no. 4, pp. 314–319, 2015.
[24] MathWorks, "Simulink 3D Animation toolbox." [Online]. Available: https://www.mathworks.com/products/3d-animation.html
[25] M. Terlizzi, "MATLAB Minidrone Competition IFAC20," YouTube. [Online]. Available: https://youtu.be/9VySp0j-1hc

APPENDIX I

Let us consider a continuous-time dynamical system H and its discrete-time version x_{k+1} = f(x_k, u_k), where x_k, x_{k+1} ∈ X ⊂ R^n are the current and the next state of the system, respectively, and u ∈ U ⊂ R^m is the control input. Let us consider the PP algorithm implementation detailed in Algorithm 2. Hence, the next states x_{k+1} and y_{k+1} of the system along the x- and y-axis can be written as follows:

x_{k+1} = x_k + \alpha e_{x_k}, \quad y_{k+1} = y_k + \alpha e_{y_k},   (2)

respectively. After some simple algebra, we can write:

\frac{x_{k+1} - x_k}{T_{PP}} = \frac{\alpha e_{x_k}}{T_{PP}}, \quad \frac{y_{k+1} - y_k}{T_{PP}} = \frac{\alpha e_{y_k}}{T_{PP}},   (3)

and hence,

v_x \approx \frac{\alpha e_{x_k}}{T_{PP}} = \tilde{\alpha} e_{x_k}, \quad v_y \approx \frac{\alpha e_{y_k}}{T_{PP}} = \tilde{\alpha} e_{y_k},   (4)

with \tilde{\alpha} = \alpha / T_{PP}.

Knowing that e_{x_k} and e_{y_k} are by definition the projections along the x- and y-axis of the VTP lying on a circle with angle \vartheta_k, we can write

e_{x_k} = \frac{R_{max} + R_{min}}{2} \sin\vartheta_k, \quad e_{y_k} = \frac{R_{max} + R_{min}}{2} \cos\vartheta_k,   (5)

and thus,

V_D = \sqrt{v_x^2 + v_y^2} \approx \frac{R_{max} + R_{min}}{2} \tilde{\alpha}.   (6)

Hence the parameter α controls the drone velocity.
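As a quick numerical sanity check of (6), one can propagate the update (2) with the errors (5) and verify that the resulting speed is constant and equal to α̃(Rmax + Rmin)/2. The units in the MATLAB sketch below are pixel-based; the mapping to physical units is left to the implementation [21], and the slowly varying angle profile is an arbitrary choice.

% Sanity check of Eq. (6): propagate Eq. (2) with the errors of Eq. (5)
% and compare the speed with alpha_tilde*(Rmax + Rmin)/2.
alpha = 0.05; TPP = 5e-3;                 % PP gain and sampling time (Table I)
R = (28 + 26)/2;                          % (Rmax + Rmin)/2 in pixels (Table I)
N = 1000;
th = linspace(0, pi/2, N);                % slowly varying VTP angle (arbitrary)
x = 0; y = 0; v = zeros(1, N);
for k = 1:N
    ex = R*sin(th(k)); ey = R*cos(th(k)); % Eq. (5)
    v(k) = hypot(alpha*ex, alpha*ey)/TPP; % Eq. (4): speed over one PP step
    x = x + alpha*ex; y = y + alpha*ey;   % Eq. (2)
end
fprintf('max |v - R*alpha/TPP| = %g\n', max(abs(v - R*alpha/TPP)))   % ~0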
