
in: XLIII Automática 2022, Logroño, La Rioja, 7-9 September 2022

THE TRAM-FPV RACING Open Database. Sequences complete indoor flight tests for the study of racing drones

J.M. Castiblanco, S. Garcia-Nieto, D. Ignatyev, X. Blasco


jocasqui@doctor.upv.es, sgnieto@isa.upv.es, D.Ignatyev@cranfield.ac.uk, xblasco@isa.upv.es
Universitat Politècnica de València, Camí de Vera, s/n, 46022 València, Spain.
Cranfield University, College Rd, Cranfield, Wharley End, Bedford MK43 0AL, United Kingdom.

Abstract

This paper presents the TRAM-FPV Racing open database. It results from indoor flights with five (5) racing drones at Cranfield University (UK), specifically in the Flight Arena, one of the largest indoor flight facilities in the world dedicated to research. The flight data were recorded with an optical measurement system (OMS). The database contains the position and orientation of the drone models in vector space, together with readings from accelerometers and gyroscopes and the heading angles recorded by the inertial measurement unit (IMU) sensors. The most frequent use of such data is to tune the output of sensor fusion and to develop the sensors embedded in the drones for estimating their current state vector. However, the scope is much wider: the data can be used, for example, to design nonlinear mathematical models or to generate trajectories. This paper was published in the Jornadas XLIII de Automática 2022 (Spain); the authors' version, translated into Spanish, can be found in reference [77].

Keywords: Racing drones, Database, Trajectory, Guidance, GPS-denied, IMU, Navigation, Autonomous, Simulation.

1 Introduction

A wide variety of databases hold information from drone flight tests. They are often used for machine learning; strictly speaking, the data are used to tune algorithms, for example to estimate vehicle states or to support the guidance and control of the aircraft.

Table 1: Other datasets

Datasets              BD1             BD2            BD3            BD4        BD5
Airframe type         Quad SY130      Hexa SY300     Quad SY-MAV    Quad SY    Quad SY250
Quantity of models    1               1              1              1          1
Sequences             186             11             1              4          27
Indoor/sensors        IMU/OMS         IMU/OMS        NO             NO         IMU/OMS
Outdoor/sensors       GPS             NO             GPS/IMU        GPS/IMU    GPS/IMU
Video/image capture   Yes             Yes            Yes            Yes        Yes
Room size             11.0 x 11.0 m2  1.5 x 1.0 m2   Urban place    Outdoor    3.0 x 1.5 m2

Table 1 shows several such databases, each with distinguishing features. The Blackbird database (BD1) [2] stores details on aircraft flying at medium speeds of close to 7.0 m/s. The EuRoC dataset (BD2) [7] makes unique use of a laser system for vehicle tracking. The Urban MAV dataset (BD3) [44] comes from flights in urban areas, while the KumarRobotics dataset (BD4) [68] provides a Matlab file that aligns its GPS measurements with an odometry system. Finally, the UZH-FPV dataset (BD5) [15] stores information from FPV cameras built into racing drones.

The flight records in these databases are usually made with a single kind of drone. They mainly hold data from satellite signals, together with readings from inertial sensors and from systems external to the drone [8, 26, 30]. The drones are often general-purpose vehicles, that is, they are not designed for a clearly defined application. However, racing drones have burst onto the scientific scene, and multiple studies now focus on their fast motion [9, 41]. These studies have linked the shape of the airframe to the flight dynamics, which are, in general, very attractive to researchers. In addition, this interest leads to autonomous control and machine-learning approaches for situations with static or dynamic obstacles. Thus, new kinds of databases that take racing-drone data into account have started to be built; the fast and aggressive motion of these vehicles [58] makes them distinct from classic drones.

This paper presents the open database TRAM-FPV Racing, created to study the motion of racing drones. Section 2 briefly explains the vision system used to capture motion and to obtain the 3D orientation of the body. Section 3 describes the calibration of the Flight Arena, presents the racing drones used for the flight tests, and outlines the basic control methods used with these drones. Section 4 details the flight procedure used to record the data. Section 5 describes the database structure. Finally, section 6 presents the most relevant conclusions of the work.
2 Sensor systems for 3D positioning and orientation of drones

An autonomous aerial vehicle must be able to plan its flight path and then follow it without human action, while acting under clear safety rules [10]. In addition, it must guide itself in some instances: along a trajectory it must be able to detect objects, avoid likely collisions, recalculate the path when doing so, and assess the result within the flight plan [4]. For this, mixing information from multiple sensors is crucial, that is, fusing the data from the different sensors to estimate the vehicle states constantly [21, 69]. This sensor fusion is also vital for training vehicle control and aircraft guidance [34, 61].

The databases hold information from the Global Positioning System (GNSS) when flights have been made in open space. In addition, they hold records from inertial navigation system (IMU) sensors, and on-board images or videos accompany these flight tests to guide the aircraft [17, 29, 49]. They also cover information from other kinds of sensors, for example when flights are made in GPS-denied areas, where laser or ultrasonic sensors are included to detect objects or markers around the flight space [46]. Some of the sensors used are listed below:

• The Global Positioning Systems (GNSS) send radio signals (EMS) [45]. The receiver estimates the time it takes for the wave to reach it, and from this the position of an object can be defined [42, 45, 72]. However, the reliability of the data is degraded by several factors; in particular, signal noise affects the precision of the reading [27, 35, 52]. NAVSTAR-GPS, GLONASS, IRNSS, GALILEO and BeiDou are examples of these systems.

• The inertial navigation systems (INS) are on board the vehicle. They use inertial measurement units or sensors (IMU) to report angular rates [16, 53, 59] and specific forces, and can also report body position. An INS contains accelerometers to compute changes in speed, gyroscopes to define the object orientation, and magnetometers to measure the strength of the magnetic field. However, these readings must be fused by other algorithms to resolve the position of a body.

• Data from image processing systems (IMS) are produced on board the vehicle. Cameras provide the position of the object and can also predict its orientation. The sensors capture both motions through a series of filtered images, which are handled in digital form using a mix of methods [5, 28, 73]. Real-time images therefore require robust, high-quality cameras.

• Data from acoustical systems (UMS) have two parts: the receiver is on board the vehicle and the transmitter is fixed at a point in the navigation area. The object's location is defined via ultrasonic waves [19, 63] that travel through the air between transmitter and receiver.

• Data from systems combining optical and electronic sensors (OMS) have two parts: the cameras are in the flight arena and the markers are on board the vehicle, coated with luminescent textiles so that the cameras can catch the light. At least two cameras are needed to rebuild the vehicle's location [20, 25]. The number of cameras defines the trustworthiness of the data; their height in the room and the light power inside it are also essential factors [12, 31].

OMS systems are used in demanding cases, for example in places without GPS access [1, 11, 40] and where high dynamic motion is the focus of the research [14, 60, 67]. Likewise, they offer the high measurement accuracy required by the fast dynamics of racing drones [37, 50].
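As an illustration of the sensor fusion mentioned above, the sketch below fuses low-rate OMS position fixes with high-rate IMU accelerations using a simple constant-gain observer. It is a minimal sketch in Python; the class name, gains and update structure are illustrative assumptions, not the authors' processing pipeline.

```python
import numpy as np

class PositionFuser:
    """Constant-gain observer fusing IMU acceleration (high rate)
    with OMS position fixes (lower rate). Illustrative only."""

    def __init__(self, l_pos=0.8, l_vel=2.0):
        self.p = np.zeros(3)   # estimated position [m]
        self.v = np.zeros(3)   # estimated velocity [m/s]
        self.l_pos = l_pos     # position correction gain (assumed value)
        self.l_vel = l_vel     # velocity correction gain (assumed value)

    def predict(self, accel, dt):
        """Propagate the state with an acceleration sample (world frame, gravity removed)."""
        self.p += self.v * dt + 0.5 * accel * dt ** 2
        self.v += accel * dt

    def correct(self, oms_pos):
        """Pull the estimate towards an OMS position fix."""
        err = oms_pos - self.p
        self.p += self.l_pos * err
        self.v += self.l_vel * err

# usage: predict at the IMU rate, correct whenever an OMS sample arrives
fuser = PositionFuser()
fuser.predict(accel=np.array([0.0, 0.0, 0.1]), dt=1 / 500)
fuser.correct(oms_pos=np.array([0.01, 0.0, 0.0]))
```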
3 Configuration of measurement systems for the TRAM-FPV RACING database

Three factors are vital for a proper flight series: the test room must be prepared, the related measuring tools must be calibrated, and the drone models must fit the size of the flight arena.

Figure 1: Flight Arena. Cranfield University

The test room is the Flight Arena at Cranfield University in the UK. The plan dimensions of the arena are shown in figure 1, with a maximum height of 10 m throughout the enclosure. The flight arena has 30 Vicon cameras [75]; the set of cameras is located at a height of 10 metres, and the cameras are 1.5 metres apart from each other. The data are transmitted via Ethernet, and the Tracker software uses the TCP/IP communication protocol [74].
3.1 Description and configuration of the flight arena

Figure 2: Vicon cameras. Vantage and Vero

The cameras are Vicon Vantage and Vero (see figure 2). They can capture motion at between 250 and 1070 FPS, and the field of view is around 40 to 57 degrees. The resolution ranges between 1.3 and 5.0 megapixels, depending on the calibrated volume, the effective flight area and the number of frames per second needed for the flight test.

Figure 3: Effective flight area for the flight test.

Figure 3 shows the effective flight area after a successful calibration. The ASTM E3064 standard relates to the ability of the cameras to process the images, largely without filtering and without post-processing of the data. If this is the case and the test values match the reference values of the standard, the Tracker software can capture data at 41993 FPS with an accuracy of 0.017 mm. In addition, the standard establishes the agreement between different test results acquired by the standard test method under defined conditions; in this way it assesses the performance of optical tracking systems that measure six degrees of freedom of position and orientation.

The relative error between the position of the cameras and the origin of the effective flight area depends on two factors. The first is the calibration process, which is done by having the cameras catch the light of a rod moved in front of them. The second is the intensity of the ambient light: a darker space without reflective lights is preferred. Based on these factors, an error of 0.1 millimetres is considered sufficient for each axis (X, Y, Z).

3.2 Description and configuration of the racing drones used

In this database, five kinds of racing drones have been used. The main distinction between them is their geometric shape, since they develop dynamic behaviours based on their shape [9]. The airframes are called symmetric (SY), non-symmetric (NSY) or hybrid (HS).

Figure 4: Kinds of airframe for racing drones

In figure 4, the SY airframe has angular distances equal to 90 degrees and wheelbases of 210 and 250 mm. The NSY airframe has angular distances of 80 and 65 degrees and wheelbases of around 210 and 230 mm. The HS airframe has an angular distance between the upper arms equal to 80 degrees and between the lower arms of 90 degrees, while the wheelbase is 250 mm.

Table 2: Component Descriptions

Components          Description
Airframe geometry   SY, NSY, HS
ESC                 55 A - Tmotor
Flight controller   F7 - Tmotor
Video transmitter   VTX Viva FPV - TBS
Radio receiver      R-XSR - FrSKY
Antennas            Linear Emax
Battery             6s - 4s
Propellers          5147 - Tmotor
Motors              Tmotor F60PRO 1950-2550 Kv
Firmware            Betaflight
On the other hand, all racing drones were fitted with the same electronic parts, motor group and power supply, as shown in Table 2. In addition, the control gain values are the same for all airframes, and the stability control aids were left at their default values [9].

Figure 5: Hybrid structure - HS.

Figure 6: Symmetrical structure - SY.

Figure 7: Non-symmetrical structure - NSY.

Figures 5, 6 and 7 show the kinds of drones used for the flight tests. It should be noted that the geometric settings of these racing drone models were written into the firmware of each flight controller. In addition, the settings linked to the travel of the radio-control levers were set to their default values.

Figure 6 also shows the position of the markers. Each ball is 14 mm in diameter and is dressed in fluorescent textiles. In addition, the markers are placed at non-symmetrical locations with a gap of 10 mm, so that the OMS system can more quickly rebuild their position during motion.

3.3 Control scheme of the drones used

The TRAM-FPV Racing data are integrated between the control levels so that they can be merged with the other sensor readings. Thus, they could be used to train the motions of racing drones, and also to test motions in confined spaces under certain safety conditions [23, 32, 36].

Figure 8: Alternatives with Vicon system

Figure 8 shows a basic control scheme for two racing drones operating in a safe area with a Vicon control system. It shows how the OMS data replace the GPS data for autonomous navigation. The blue arrows show the feasible relations of these data within the control loop, while the green ones are mainly for detecting or avoiding collisions; in the case of racing drones, they could pass through obstacles and evade them.

On the other hand, using OMS data directly is common practice [51, 70, 71]. Some of the algorithms are used to manage obstacles; other uses involve precise and simultaneous localisation, the most classic being SLAM, LiDAR or odometry [39, 55, 56]. Other kinds include tagging and dragging images preloaded in databases to locate the motion of objects. The latest progress has to do with machine-learning applications.
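To make the control scheme of Figure 8 concrete, the following minimal sketch shows an outer position loop in which the OMS pose takes the place that GPS would have outdoors, producing a velocity setpoint for the inner attitude/rate controller of the flight firmware. The function, gains and saturation limit are assumptions for illustration only, not the configuration flown on the drones.

```python
import numpy as np

def position_controller(oms_pos, oms_vel, target_pos, kp=1.2, kd=0.6, v_max=3.0):
    """Outer-loop P-D position controller fed by OMS measurements.

    Returns a velocity setpoint (world frame, m/s) that an inner
    attitude/rate controller would track. All values are illustrative.
    """
    vel_sp = kp * (target_pos - oms_pos) - kd * oms_vel
    # saturate so the racing drone is not commanded beyond a safe speed
    norm = np.linalg.norm(vel_sp)
    if norm > v_max:
        vel_sp *= v_max / norm
    return vel_sp

# usage with one OMS sample (positions in metres, velocities in m/s)
cmd = position_controller(np.array([1.0, 0.5, 1.2]),
                          np.array([0.2, 0.0, 0.0]),
                          np.array([2.0, 0.5, 1.5]))
```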

4 Flight sequences

There are three essential steps before starting the flight tests. The first is to adapt the sampling frequency of the Vicon cameras; secondly, the IMU also has to be configured, preferably with settings chosen according to the flight arena (see figure 3). Finally, the relative size errors must be known.

In the case of the Vicon cameras, the flight sequences were captured at 250 FPS, while the IMU was sampled at 500 Hz. The calibration of the cameras was performed every ten flights, and a calibration error of less than 0.1% was allowed. The flight sequences were then synchronised with the video recordings of each test.
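Because the OMS stream runs at 250 FPS while the IMU logs at 500 Hz, any joint analysis needs the two streams on a common timebase. A minimal sketch of such an alignment, assuming timestamps in seconds and a single IMU channel (the real file layout is described in section 5), is:

```python
import numpy as np

def align_imu_to_oms(imu_t, imu_x, oms_t):
    """Linearly interpolate a 500 Hz IMU channel onto the 250 FPS OMS timestamps."""
    return np.interp(oms_t, imu_t, imu_x)

# synthetic example: 2 s of data at the two nominal rates
imu_t = np.arange(0.0, 2.0, 1 / 500)          # IMU timestamps [s]
oms_t = np.arange(0.0, 2.0, 1 / 250)          # OMS (Vicon) timestamps [s]
gyro_roll = np.sin(2 * np.pi * 1.0 * imu_t)   # stand-in for a gyro channel
gyro_roll_at_oms = align_imu_to_oms(imu_t, gyro_roll, oms_t)
```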

Figure 9: Software - Vicon Tracker.

The Tracker software (see Figure 9) matches the frame systems, that is, the object origin with the frame system of the flight arena. The origin of the object is defined according to NED coordinates, so the NED frame has to be matched to the frame system of the cameras.
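As a purely illustrative example of such a frame alignment, the snippet below maps a point from a hypothetical x-forward, y-left, z-up optical frame into NED (north-east-down). The axis convention, and therefore the permutation matrix, is an assumption and must be adapted to the calibration actually configured in Tracker.

```python
import numpy as np

# assumed optical/world frame: x forward (north), y left, z up
# NED frame:                   x north, y east, z down
R_OPT_TO_NED = np.array([[1.0,  0.0,  0.0],
                         [0.0, -1.0,  0.0],
                         [0.0,  0.0, -1.0]])

def optical_to_ned(p_opt):
    """Map a position expressed in the assumed optical frame into NED."""
    return R_OPT_TO_NED @ np.asarray(p_opt, dtype=float)

print(optical_to_ned([1.0, 2.0, 3.0]))   # -> [ 1. -2. -3.]
```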
Recording of the data can start after the calibration of the flight area. The calibration depends on the light conditions, and the camera positions are also essential to catch the motion. The setup is ready when the Tracker software reports acceptable error margins; the pilot then starts the IMU sensors of the drone to begin the flight test.

Each test lasted between 2.5 and 3.0 minutes of flight time. 30 tests were performed for each drone used, for a total of 150 tests, equating to between 75 and 90 minutes of flight time for each drone. These data are stored in the TRAM-FPV database.

Figure 10: Distances and trajectories covered.

All cameras pointed (see figure 10) toward the trajectories performed by the racing drone. They must be positioned in such a way that they cover the dead spots of the turns, so that at least three of them can detect a marker; this way, the rotations on the turns are captured. These special cares are due to the ends of the trajectory.

5 Structure of the TRAM-FPV dataset

The database is kept in a storage repository at Cranfield University. It is open and can be accessed through the bibliographic link [76].

Figure 11: TRAM-FPV files.

The TRAM-FPV Racing database consists of three folders, labelled according to the geometry of the racing drones (SY, NSY and HS). In addition, a fourth folder has been included with each model's mass distribution and moments of inertia; it is also split into three sub-folders according to the names of the airframes (see figure 11).

Within the SY, NSY and HS folders there are three subfolders, named test1, test2 and test3. However, the SY folder includes a fourth test, so an extra subfolder (test4) will be found there. Within each test subfolder there are three files: a video file in WEBM format and two CSV files. The CSV files labelled with the battery number (from zero to nine) are the data from the OMS system, while the CSV files from the IMU are named by the battery number plus the acronym bbl.
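Given that layout, a short script can enumerate the flight sequences in a local copy of the repository [76]. The root folder name and glob patterns below are assumptions for illustration; the actual file names follow the battery-number convention just described (see figure 11).

```python
from pathlib import Path

ROOT = Path("TRAM-FPV")   # hypothetical local copy of the repository [76]

def list_sequences(root=ROOT):
    """Yield (airframe, test, videos, oms_csvs, imu_csvs) for each flight folder."""
    for airframe in ("SY", "NSY", "HS"):
        for test_dir in sorted((root / airframe).glob("test*")):
            videos   = sorted(test_dir.glob("*.webm"))
            imu_csvs = sorted(test_dir.glob("*bbl*.csv"))   # IMU file: battery number + 'bbl'
            oms_csvs = sorted(set(test_dir.glob("*.csv")) - set(imu_csvs))
            yield airframe, test_dir.name, videos, oms_csvs, imu_csvs

for entry in list_sequences():
    print(entry)
```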

Table 3: Dataset - IMU

Row  Description           Magnitude  Error (%)
1    loopIteration         count      < 1,284,656
2    Local Time            µs         NA
3    Roll axis rotation    deg/s      < 0.01
4    Pitch axis rotation   deg/s      < 0.01
5    Yaw axis rotation     deg/s      < 0.01
6    X-axis acceleration   raw        < 0.1
7    Y-axis acceleration   raw        < 0.1
8    Z-axis acceleration   raw        < 0.1
9    Roll-Heading          raw        < 0.09
10   Pitch-Heading         raw        < 0.09
11   Yaw-Heading           raw        < 0.09

The CSV-IMU files hold 11 columns by 90,000 rows, sorted as in table 3. Mainly, they describe three rotations, three accelerations and three heading angles; the frame of reference for the motion is X, Y and Z. At 500 Hz, 90,000 rows correspond to roughly 180 s, consistent with the 2.5-3.0 minute tests. The values of the accelerations and heading angles are raw values (RAW), referenced to the travel of the sticks. The equivalences are: 2048 units of acceleration are equal to one unit of gravity (1 g); in addition, the data are smoothed by a low-pass filter, and one unit of heading is equal to 58.1 degrees.
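A minimal loading sketch that applies the equivalences stated above could look as follows. The column names are assumptions for illustration; only the scale factors (2048 raw units per g, 58.1 degrees per raw heading unit) come from the text.

```python
import pandas as pd

G = 9.81  # m/s^2

def load_imu_csv(path):
    """Load a CSV-IMU file and convert raw columns using the equivalences above.

    Column names here are hypothetical; the real files follow the
    11-column layout of Table 3 (battery-number naming plus 'bbl').
    """
    df = pd.read_csv(path)
    for axis in ("X", "Y", "Z"):
        # 2048 raw acceleration units correspond to 1 g
        df[f"acc{axis}_ms2"] = df[f"acc{axis}_raw"] / 2048.0 * G
    for angle in ("Roll", "Pitch", "Yaw"):
        # one raw heading unit corresponds to 58.1 degrees, per the text above
        df[f"{angle}_deg"] = df[f"{angle}Heading_raw"] * 58.1
    return df
```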
Table 4: Dataset OMS-Vicon

Row  Description             Magnitude  Error (%)
1    Frames                  fps        < 0.017
2    Subframes               0          NA
3    RX X-axis rotation      rad        0.397 - 0.79
4    RY Y-axis rotation      rad        0.397 - 0.79
5    RZ Z-axis rotation      rad        0.397 - 0.79
6    TX X-axis translation   mm         < 0.149
7    TY Y-axis translation   mm         < 0.149
8    TZ Z-axis translation   mm         < 0.149

The CSV-Vicon files hold eight columns by 50,000 rows, sorted as shown in table 4. Mainly, they contain three rotations and three translations, as well as the data-capture rate (FPS). The rotation representation is helical, meaning that the rotation is expressed relative to the position of the marker at each time instant; this is also called a roto-translation. These rotations can be transformed into any other type of non-instantaneous rotation representation, such as Euler angles or quaternions. The errors in the table are coefficients of variation reported by the Tracker software; for more exact measures, please consult [48].
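If the (RX, RY, RZ) triplet is read as a rotation vector in radians, the conversion to Euler angles or quaternions mentioned above can be sketched with SciPy as below. Treating the helical rotation as an axis-angle rotation vector is an interpretation of the description above; consult [48] and the Tracker documentation [74] before relying on it.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def helical_to_euler_quat(rx, ry, rz):
    """Convert a Vicon rotation triplet, read as a rotation vector [rad],
    into Euler angles (degrees) and a quaternion (x, y, z, w)."""
    rot = Rotation.from_rotvec([rx, ry, rz])
    euler_xyz_deg = rot.as_euler("xyz", degrees=True)
    quat_xyzw = rot.as_quat()
    return euler_xyz_deg, quat_xyzw

print(helical_to_euler_quat(0.1, -0.05, 1.2))
```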
Figure 12: IMU readings synchronized with video

Figure 12 shows the WEBM video files. On the in-flight screen it is feasible to watch the motion of the gyroscopes at the top and the motion of the accelerometers at the bottom. The video images have been matched with the behaviour of the sensors, so varied sorts of comparisons are possible.

6 Conclusions

This paper presents the TRAM-FPV Racing database. It stores dynamic flight data for the study of racing drones: it contains the position and rotation motion and also holds the mass distribution of the models. This mix of data makes it unique (see table 5). It incorporates 30 flight sequences for each model used, which means a total of 150 flight tests.

Table 5: Database features

Database               TRAM-FPV Racing
Kind of airframe       SY, NSY, HS
Quantity of models     5
Flight sequences       150
Indoor/sensors         IMU/OMS
Outdoor/sensors        NO
Video cam/image        Yes
Effective flight area  20 x 20 meters

The database holds details on five kinds of racing drones. This range of choices makes it the only one of its class. It is made to study the motion of racing drones, but other types of studies could be done with it: aerodynamic models based on the database are highly relevant, and studying cross-modal performance by analysing big data is an exciting topic. Specific uses like machine-learning training are also feasible.

On the other hand, the database aims to support the growing interest in designing sensors, mainly for the field of autonomous racing drones; such sensors will have to capture the typical motion of a radio-controlled racing drone.

Acknowledgement

This work has been partially carried out thanks to the support of the project PID2020-119468RA-I00 funded by MCIN/AEI/10.13039/501100011033. We would also like to thank the Erasmus internship programme for the financial support provided through the Doctoral School of the Universitat Politècnica de València and the programme for international exchange (OPII-UPV).

Special thanks to Cranfield University (UK) and the Centre for Autonomous and Cyberphysical Systems at the School of Aerospace, Transport and Manufacturing (SATM) for the technical support received and for facilitating the use of their facilities and Vicon flight arena measurement equipment.

This paper was translated into Spanish for the XLIII Automática 2022 conference held at the University of La Coruña, Spain, on September 7-9, with DOI doi.org/10.17979/spudc.9788497498418.0341.

References

[1] Abosekeen, A., Iqbal, U., Noureldin, A., and Korenberg, M. J. (2020). A novel multi-level integrated navigation system for challenging GNSS environments. IEEE Transactions on Intelligent Transportation Systems, 22(8), 4838-4852.

[2] Antonini, A., Guerra, W., Murali, V., Sayre-McCord, T., and Karaman, S. (2018, November). The Blackbird dataset: A large-scale dataset for UAV perception in aggressive flight. In International Symposium on Experimental Robotics (pp. 130-139). Springer, Cham.

[3] Aurand, A. M., Dufour, J. S., and Marras, W. S. (2017). Accuracy map of an optical motion capture system with 42 or 21 cameras in a large measurement volume. Journal of Biomechanics, 58, 237-240.

[4] Bagnell, J. A., Bradley, D., Silver, D., Sofman, B., and Stentz, A. (2010). Learning for autonomous navigation. IEEE Robotics and Automation Magazine, 17(2), 74-84.

[5] Balcerek, J., Dabrowski, A., and Konieczka, A. (2013, September). Stereovision option for monitoring systems. A method based on perception control of depth. In 2013 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA) (pp. 226-230). IEEE.

[6] Bigazzi, L., Basso, M., Boni, E., Innocenti, G., and Pieraccini, M. (2021). A Multilevel Architecture for Autonomous UAVs. Drones, 5(3), 55.

[7] Burri, M., Nikolic, J., Gohl, P., Schneider, T., Rehder, J., Omari, S., ... and Siegwart, R. (2016). The EuRoC micro aerial vehicle datasets. The International Journal of Robotics Research, 35(10), 1157-1163.

[8] Caesar, H., Bankiti, V., Lang, A. H., Vora, S., Liong, V. E., Xu, Q., ... and Beijbom, O. (2020). nuScenes: A multimodal dataset for autonomous driving. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 11621-11631).

[9] Castiblanco, J. M., Garcia-Nieto, S., Simarro, R., and Salcedo, J. V. (2021). Experimental study on the dynamic behaviour of drones designed for racing competitions. International Journal of Micro Air Vehicles, 13, 17568293211005757.

[10] Chao, H., Cao, Y., and Chen, Y. (2010). Autopilots for small unmanned aerial vehicles: a survey. International Journal of Control, Automation and Systems, 8(1), 36-44.

[11] Chen, C., Tian, Y., Lin, L., Chen, S., Li, H., Wang, Y., and Su, K. (2020). Obtaining world coordinate information of UAV in GNSS denied environments. Sensors, 20(8), 2241.

[12] Chen, L., Takashima, K., Fujita, K., and Kitamura, Y. (2021, May). PinpointFly: An Egocentric Position-control Drone Interface using Mobile AR. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-13).

[13] Conte, C., de Alteriis, G., Schiano Lo Moriello, R., Accardo, D., and Rufino, G. (2021). Drone Trajectory Segmentation for Real-Time and Adaptive Time-Of-Flight Prediction. Drones, 5(3), 62.

[14] Cyganek, B., and Wozniak, M. (2018). Virtual high dynamic range imaging for underwater drone navigation. In ICIAE2018 the 6th IIAE Int. Conf. Ind. Appl. Eng.

[15] Delmerico, J., Cieslewski, T., Rebecq, H., Faessler, M., and Scaramuzza, D. (2019, May). Are we ready for autonomous drone racing? The UZH-FPV drone racing dataset. In 2019 International Conference on Robotics and Automation (ICRA) (pp. 6713-6719). IEEE.

[16] de Figueiredo, R. P., Hansen, J. G., Fevre, J. L., Brandao, M., and Kayacan, E. (2021). On the advantages of multiple stereo vision camera designs for autonomous drone navigation. arXiv preprint arXiv:2105.12691.

[17] Donati, C., Mammarella, M., Comba, L., Biglia, A., Gay, P., and Dabbene, F. (2022). 3D Distance Filter for the Autonomous Navigation of UAVs in Agricultural Scenarios. Remote Sensing, 14(6), 1374.

[18] Eichelberger, P., Ferraro, M., Minder, U., Denton, T., Blasimann, A., Krause, F., and Baur, H. (2016). Analysis of accuracy in optical motion capture - A protocol for laboratory setup evaluation. Journal of Biomechanics, 49(10), 2085-2088.

[19] Famili, A., and Park, J. M. J. (2020, May). ROLATIN: Robust localization and tracking for indoor navigation of drones. In 2020 IEEE Wireless Communications and Networking Conference (WCNC) (pp. 1-6). IEEE.

[20] Farid, A., Veer, S., and Majumdar, A. (2022, January). Task-driven out-of-distribution detection with statistical guarantees for robot learning. In Conference on Robot Learning (pp. 970-980). PMLR.

[21] Florea, A. G., and Buiu, C. (2019, May). Sensor fusion for autonomous drone waypoint navigation using ROS and numerical P systems: A critical analysis of its advantages and limitations. In 2019 22nd International Conference on Control Systems and Computer Science (CSCS) (pp. 112-117). IEEE.

[22] Foehn, P., Romero, A., and Scaramuzza, D. (2021). Time-optimal planning for quadrotor waypoint flight. Science Robotics, 6(56), eabh1221.

[23] Foroughi, F., Chen, Z., and Wang, J. (2021). A CNN-based system for mobile robot navigation in indoor environments via visual localization with a small dataset. World Electric Vehicle Journal, 12(3), 134.

[24] Furtado, J. S., Liu, H. H., Lai, G., Lacheray, H., and Desouza-Coelho, J. (2019). Comparative analysis of OptiTrack motion capture systems. In Advances in Motion Sensing and Control for Robotic Applications (pp. 15-31). Springer, Cham.

[25] Garcia, J. A. B., and Younes, A. B. (2021). Real-Time Navigation for Drogue-Type Autonomous Aerial Refueling Using Vision-Based Deep Learning Detection. IEEE Transactions on Aerospace and Electronic Systems, 57(4), 2225-2246.

[26] Geyer, J., Kassahun, Y., Mahmudi, M., Ricou, X., Durgesh, R., Chung, A. S., ... and Schuberth, P. (2020). A2D2: Audi autonomous driving dataset. arXiv preprint arXiv:2004.06320.

[27] Hashim, H. A. (2021, May). GPS-denied navigation: Attitude, position, linear velocity, and gravity estimation with nonlinear stochastic observer. In 2021 American Control Conference (ACC) (pp. 1149-1154). IEEE.

[28] Hayat, S., Jung, R., Hellwagner, H., Bettstetter, C., Emini, D., and Schnieders, D. (2021). Edge computing in 5G for drone navigation: What to offload? IEEE Robotics and Automation Letters, 6(2), 2571-2578.

[29] He, D., Qiao, Y., Chan, S., and Guizani, N. (2018). Flight security and safety of drones in airborne fog computing systems. IEEE Communications Magazine, 56(5), 66-71.

[30] Huang, X., Cheng, X., Geng, Q., Cao, B., Zhou, D., Wang, P., ... and Yang, R. (2018). The ApolloScape dataset for autonomous driving. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops (pp. 954-960).

[31] Huppert, F., Hoelzl, G., and Kranz, M. (2021, May). GuideCopter - A precise drone-based haptic guidance interface for blind or visually impaired people. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

[32] Jiang, P., Osteen, P., Wigness, M., and Saripalli, S. (2021, May). RELLIS-3D dataset: Data, benchmarks and analysis. In 2021 IEEE International Conference on Robotics and Automation (ICRA) (pp. 1110-1116). IEEE.

[33] Johansen, T. A., Fossen, T. I., and Berge, S. P. (2004). Constrained nonlinear control allocation with singularity avoidance using sequential quadratic programming. IEEE Transactions on Control Systems Technology, 12(1), 211-216.

[34] Jung, S., Hwang, S., Shin, H., and Shim, D. H. (2018). Perception, guidance, and navigation for indoor autonomous drone racing using deep learning. IEEE Robotics and Automation Letters, 3(3), 2539-2544.

[35] Karetnikov, V., Milyakov, D., Prokhorenkov, A., and Ol'khovik, E. (2021). Prospects of application of mass-produced GNSS modules for solving high-precision navigation tasks. In E3S Web of Conferences (Vol. 244, p. 08006). EDP Sciences.

[36] Karnan, H., Nair, A., Xiao, X., Warnell, G., Pirk, S., Toshev, A., ... and Stone, P. (2022). Socially Compliant Navigation Dataset (SCAND): A Large-Scale Dataset of Demonstrations for Social Navigation. arXiv preprint arXiv:2203.15041.

[37] Kaufmann, E., Gehrig, M., Foehn, P., Ranftl, R., Dosovitskiy, A., Koltun, V., and Scaramuzza, D. (2019, May). Beauty and the beast: Optimal methods meet learning for drone racing. In 2019 International Conference on Robotics and Automation (ICRA) (pp. 690-696). IEEE.

[38] Kaufmann, E., Loquercio, A., Ranftl, R., Dosovitskiy, A., Koltun, V., and Scaramuzza, D. (2018, October). Deep drone racing: Learning agile flight in dynamic environments. In Conference on Robot Learning (pp. 133-145). PMLR.

[39] Kazim, M., Zaidi, A., Ali, S., Raza, M. T., Abbas, G., Ullah, N., and Al-Ahmadi, A. A. (2022). Perception Action Aware-Based Autonomous Drone Race in a Photorealistic Environment. IEEE Access, 10, 42566-42576.

[40] Kim, J., and Sukkarieh, S. (2005). 6DoF SLAM aided GNSS/INS navigation in GNSS denied and unknown environments. Positioning, 1(09).

[41] Loquercio, A., Kaufmann, E., Ranftl, R., Dosovitskiy, A., Koltun, V., and Scaramuzza, D. (2019). Deep drone racing: From simulation to reality with domain randomization. IEEE Transactions on Robotics, 36(1), 1-14.

[42] Lin, X., Guo, J., Li, X., Tang, C., Shao, R., Pan, J., ... and Li, Z. (2022). Applications and Prospects for Autonomous Navigation Technology in a Satellite Navigation System. In China Satellite Navigation Conference (CSNC 2022) Proceedings: Volume II (p. 332). Springer Nature.

[43] Madridano, A., Al-Kaff, A., Flores, P., Martin, D., and de la Escalera, A. (2021). Software architecture for autonomous and coordinated navigation of UAV swarms in forest and urban firefighting. Applied Sciences, 11(3), 1258.

[44] Majdik, A. L., Till, C., and Scaramuzza, D. (2017). The Zurich urban micro aerial vehicle dataset. The International Journal of Robotics Research, 36(3), 269-273.

[45] Mangialardo, M., Jurado, M. M., Hagan, D., Giordano, P., and Ventura-Traveset, J. (2021, September). The full potential of an autonomous GNSS signal-based navigation system for Moon missions. In Proceedings of the 34th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2021) (pp. 1039-1052).

[46] McGuire, K., De Croon, G., De Wagter, C., Tuyls, K., and Kappen, H. (2017). Efficient optical flow and stereo vision for velocity estimation and obstacle avoidance on an autonomous pocket drone. IEEE Robotics and Automation Letters, 2(2), 1070-1076.

[47] Mellinger, D., and Kumar, V. (2011, May). Minimum snap trajectory generation and control for quadrotors. In 2011 IEEE International Conference on Robotics and Automation (pp. 2520-2525). IEEE.

[48] Merriaux, P., Dupuis, Y., Boutteau, R., Vasseur, P., and Savatier, X. (2017). A study of Vicon system positioning performance. Sensors, 17(7), 1591.

[49] Miranda, V. R., Rezende, A., Rocha, T. L., Azpúrua, H., Pimenta, L. C., and Freitas, G. M. (2022). Autonomous navigation system for a delivery drone. Journal of Control, Automation and Electrical Systems, 33(1), 141-155.

[50] Moon, H., Martinez-Carranza, J., Cieslewski, T., Faessler, M., Falanga, D., Simovic, A., ... and Kim, S. J. (2019). Challenges and implemented technologies used in autonomous drone racing. Intelligent Service Robotics, 12(2), 137-148.

[51] Minoda, K., Schilling, F., Wüest, V., Floreano, D., and Yairi, T. (2021). VIODE: A simulated dataset to address the challenges of visual-inertial odometry in dynamic environments. IEEE Robotics and Automation Letters, 6(2), 1343-1350.

[52] Nezhadshahbodaghi, M., Mosavi, M. R., and Hajialinajar, M. T. (2021). Fusing denoised stereo visual odometry, INS and GPS measurements for autonomous navigation in a tightly coupled approach. GPS Solutions, 25(2), 1-18.

[53] Petritoli, E., Leccese, F., and Spagnolo, G. S. (2020, June). Inertial Navigation Systems (INS) for Drones: Position Errors Model. In 2020 IEEE 7th International Workshop on Metrology for AeroSpace (MetroAeroSpace) (pp. 500-504). IEEE.

[54] Patoliya, J., Mewada, H., Hassaballah, M., Khan, M. A., and Kadry, S. (2022). A robust autonomous navigation and mapping system based on GPS and LiDAR data for unconstraint environment. Earth Science Informatics, 1-13.

[55] Pfeiffer, C., Wengeler, S., Loquercio, A., and Scaramuzza, D. (2022). Visual attention prediction improves performance of autonomous drone racing agents. PLOS ONE, 17(3), e0264471.

[56] Pham, H. X., Ugurlu, H. I., Le Fevre, J., Bardakci, D., and Kayacan, E. (2022). Deep learning for vision-based navigation in autonomous drone racing. In Deep Learning for Robot Perception and Cognition (pp. 371-406). Academic Press.

[57] Reyes-Munoz, J. A., and Flores-Abad, A. (2021, August). A MAV Platform for Indoors and Outdoors Autonomous Navigation in GPS-denied Environments. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE) (pp. 1708-1713). IEEE.

[58] Rojas-Perez, L. O., and Martínez-Carranza, J. (2021). On-board processing for autonomous drone racing: an overview. Integration, 80, 46-59.

[59] Sani, M. F., and Karimian, G. (2017, November). Automatic navigation and landing of an indoor AR.Drone quadrotor using ArUco marker and inertial sensors. In 2017 International Conference on Computer and Drone Applications (IConDA) (pp. 102-107). IEEE.

[60] Shafiee, M., Zhou, Z., Mei, L., Dinmohammadi, F., Karama, J., and Flynn, D. (2021). Unmanned aerial drones for inspection of offshore wind turbines: A mission-critical failure analysis. Robotics, 10(1), 26.

[61] Song, Y., Steinweg, M., Kaufmann, E., and Scaramuzza, D. (2021, January). Autonomous drone racing with deep reinforcement learning. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1205-1212). IEEE.

[62] Song, Y., Steinweg, M., Kaufmann, E., and Scaramuzza, D. (2021, January). Autonomous drone racing with deep reinforcement learning. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1205-1212). IEEE.

[63] Šoštarić, D., and Mester, G. (2020). Drone localization using ultrasonic TDOA and RSS signal: Integration of the inverse method of a particle filter. FME Transactions, 48(1), 21-30.

[64] Spedicato, S., Notarstefano, G., Bülthoff, H. H., and Franchi, A. (2016). Aggressive maneuver regulation of a quadrotor UAV. In Robotics Research (pp. 95-112). Springer, Cham.

[65] Spedicato, S., and Notarstefano, G. (2017). Minimum-time trajectory generation for quadrotors in constrained environments. IEEE Transactions on Control Systems Technology, 26(4), 1335-1344.

[66] Srigrarom, S., Chew, K. H., Da Lee, D. M., and Ratsamee, P. (2020, September). Drone versus bird flights: Classification by trajectories characterization. In 2020 59th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE) (pp. 343-348). IEEE.

[67] Stepanyan, V., Krishnakumar, K. S., and Bencomo, A. (2016). Identification and reconfigurable control of impaired multi-rotor drones. In AIAA Guidance, Navigation, and Control Conference (p. 1384).

[68] Sun, K., Mohta, K., Pfrommer, B., Watterson, M., Liu, S., Mulgaonkar, Y., ... and Kumar, V. (2018). Robust stereo visual inertial odometry for fast autonomous flight. IEEE Robotics and Automation Letters, 3(2), 965-972.

[69] Vanhie-Van Gerwen, J., Geebelen, K., Wan, J., Joseph, W., Hoebeke, J., and De Poorter, E. (2021). Indoor Drone Positioning: Accuracy and Cost Trade-Off for Sensor Fusion. IEEE Transactions on Vehicular Technology, 71(1), 961-974.

[70] Wang, C., Wang, Y., Xu, M., and Crandall, D. J. (2022). Stepwise goal-driven networks for trajectory prediction. IEEE Robotics and Automation Letters, 7(2), 2716-2723.

[71] Yao, Y., Atkins, E., Johnson-Roberson, M., Vasudevan, R., and Du, X. (2021). BiTraP: Bi-directional pedestrian trajectory prediction with multi-modal goal estimation. IEEE Robotics and Automation Letters, 6(2), 1463-1470.

[72] Yayla, G., Van Baelen, S., Peeters, G., Afzal, M. R., Singh, Y., and Slaets, P. (2021). Accuracy benchmark of Galileo and EGNOS for Inland Waterways. In Proceedings of the International Ship Control Systems Symposium (iSCSS) (pp. 1-10). Zenodo.

[73] Yue, Z. (2018). Dynamic Network Reconstruction in Systems Biology: Methods and Algorithms (Doctoral dissertation, University of Luxembourg, Luxembourg).

[74] Vicon Company. Tracker system - User guide. (2019, December). Software. From URL: https://www.vicon.com/cms/wp-content/uploads/2019/08/tracker-3-11042017-55587.pdf

[75] Vicon Company. Cameras - Technical specifications. (2020, June). Hardware. From URL: https://www.vicon.com/hardware/cameras/

[76] Castiblanco, J. M., García-Nieto, S., Ignatyev, D., and Blasco, X. (2022, May 27). From URL: http://figshare.com/s/24642072abc29b8f1535

[77] Castiblanco Quintero, J. M., Garcia-Nieto, S., Ignatyev, D., and Blasco, X. (2022). The TRAM-FPV RACING Open Database. Sequences complete indoor flight sequences for the study of racing drones. In XLIII Jornadas de Automática (pp. 341-352). Universidade da Coruña. Servizo de Publicacións.

© 2022 by the authors. Submitted for possible open access publication under the terms and conditions of the Creative Commons Attribution CC-BY-NC-SA 4.0 license (https://creativecommons.org/licenses/by-nc-sa/4.0/deed.es).