Flight Tests For The Study of Racing Drones-2022
©2022 The Authors. This is the Author Accepted Manuscript issued with:
Creative Commons Attribution-NonCommercial-ShareAlike Licence (CC BY-NC-SA 4.0).
Please refer to any applicable publisher terms of use.
2 Sensor systems for positioning and orientation of drones

An autonomous aerial vehicle can plan its flight path and execute it later without human action, and it must do so under clear safety rules [10]. In addition, it must guide itself in some instances: along its trajectories it must be able to detect objects, avoid likely collisions, recalculate the trajectory when doing so, and assess the change against the flight plan [4]. For this, merging information from multiple sensors is crucial, that is, fusing the data between sensors to estimate the vehicle's state constantly [21, 69]. This sensor fusion is also vital for training vehicle control and aircraft guidance [34, 61].

The databases hold information from the Global Navigation Satellite System (GNSS) when the flights have been made in open space. In addition, they hold records from the inertial measurement unit (IMU) sensors. Images or videos recorded on board accompany these flight tests to guide the aircraft [17, 29, 49]. They also cover information from other kinds of sensors, for example when flights are made in GPS-denied areas; laser or ultrasonic sensors are likewise included to detect objects or markers around the flight space [46]. Some of the sensors used are listed below:

• … 3D time images require robust and high-quality cameras.

• Data from acoustic systems (UMS) has two parts: the receiver is on board the vehicle, and the transmitter is fixed at a known point in the navigation area. The system determines the object's location via ultrasonic waves [19, 63]; the waves travel through the air until they reach the receiver.

• Data from systems combining optical and electronic sensors (OMS) also has two parts: the cameras are placed in the flight arena, and the markers are on board the vehicle. The markers are coated with a luminescent material so that the cameras can capture the light they reflect. At least two cameras are needed to reconstruct the vehicle's location [20, 25]. The number of cameras determines the trustworthiness of the data; their height in the room and the lighting power inside it are also essential factors [12, 31].

OMS systems are used in demanding cases, for example in places without GPS access [1, 11, 40], or where high dynamic motion is the subject of research [14, 60, 67]. Likewise, they offer the high measurement accuracy required by the fast dynamics of racing drones [37, 50].
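The ultrasonic (UMS) localization idea described above lends itself to a short numerical sketch. Assuming, purely for illustration, several fixed transmitters at known positions (the four-transmitter layout, the constants, and the function names below are hypothetical, not part of the dataset), measured times of flight convert to distances through the speed of sound, and a linearised least-squares multilateration recovers the onboard receiver's position:

```python
import numpy as np

# Hypothetical setup (not from the dataset): speed of sound and a few
# fixed ultrasonic transmitters at known positions in the arena (metres).
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
TRANSMITTERS = np.array([
    [0.0, 0.0, 0.0],
    [5.0, 0.0, 0.0],
    [0.0, 5.0, 0.0],
    [0.0, 0.0, 3.0],
])

def ranges_from_tof(tof_seconds):
    """Convert measured times of flight (s) to distances (m)."""
    return SPEED_OF_SOUND * np.asarray(tof_seconds, dtype=float)

def multilaterate(anchors, ranges):
    """Estimate the receiver position from distances to known anchors.

    Subtracting the first sphere equation from the others linearises the
    problem, which is then solved in the least-squares sense.
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], ranges[0]
    d = anchors[1:] - a0                       # anchor offsets from anchor 0
    A = 2.0 * d
    b = r0**2 - ranges[1:]**2 + np.sum(d**2, axis=1)
    offset, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a0 + offset

# Example: receiver truly at (1.0, 2.0, 1.0) m; simulate times of flight.
true_pos = np.array([1.0, 2.0, 1.0])
tof = np.linalg.norm(TRANSMITTERS - true_pos, axis=1) / SPEED_OF_SOUND
estimate = multilaterate(TRANSMITTERS, ranges_from_tof(tof))
# estimate is close to (1.0, 2.0, 1.0)
```

With more than four transmitters the system becomes overdetermined and the same least-squares solve averages out mild measurement noise; real TDOA systems such as the one in [63] work with time differences rather than absolute times of flight, which adds one more unknown but follows the same linearisation pattern.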
4 Flight sequences
[35] Karetnikov, V., Milyakov, D., Prokhorenkov, A., and Ol'khovik, E. (2021). Prospects of application of mass-produced GNSS modules for solving high-precision navigation tasks. In E3S Web of Conferences (Vol. 244, p. 08006). EDP Sciences.

[36] Karnan, H., Nair, A., Xiao, X., Warnell, G., Pirk, S., Toshev, A., ... and Stone, P. (2022). Socially Compliant Navigation Dataset (SCAND): A Large-Scale Dataset of Demonstrations for Social Navigation. arXiv preprint arXiv:2203.15041.

[37] Kaufmann, E., Gehrig, M., Foehn, P., Ranftl, R., Dosovitskiy, A., Koltun, V., and Scaramuzza, D. (2019, May). Beauty and the beast: Optimal methods meet learning for drone racing. In 2019 International Conference on Robotics and Automation (ICRA) (pp. 690-696). IEEE.

[38] Kaufmann, E., Loquercio, A., Ranftl, R., Dosovitskiy, A., Koltun, V., and Scaramuzza, D. (2018, October). Deep drone racing: Learning agile flight in dynamic environments. In Conference on Robot Learning (pp. 133-145). PMLR.

[39] Kazim, M., Zaidi, A., Ali, S., Raza, M. T., Abbas, G., Ullah, N., and Al-Ahmadi, A. A. (2022). Perception Action Aware-Based Autonomous Drone Race in a Photorealistic Environment. IEEE Access, 10, 42566-42576.

[40] Kim, J., and Sukkarieh, S. (2005). 6DoF SLAM aided GNSS/INS navigation in GNSS denied and unknown environments. Positioning, 1(09).

[41] Loquercio, A., Kaufmann, E., Ranftl, R., Dosovitskiy, A., Koltun, V., and Scaramuzza, D. (2019). Deep drone racing: From simulation to reality with domain randomization. IEEE Transactions on Robotics, 36(1), 1-14.

[42] Lin, X., Guo, J., Li, X., Tang, C., Shao, R., Pan, J., ... and Li, Z. (2022). Applications and Prospects for Autonomous Navigation Technology in a Satellite Navigation System. In China Satellite Navigation Conference (CSNC 2022) Proceedings: Volume II (p. 332). Springer Nature.

[43] Madridano, A., Al-Kaff, A., Flores, P., Martin, D., and de la Escalera, A. (2021). Software architecture for autonomous and coordinated navigation of UAV swarms in forest and …

[44] Majdik, A. L., Till, C., and Scaramuzza, D. (2017). The Zurich urban micro aerial vehicle dataset. The International Journal of Robotics Research, 36(3), 269-273.

[45] Mangialardo, M., Jurado, M. M., Hagan, D., Giordano, P., and Ventura-Traveset, J. (2021, September). The full Potential of an Autonomous GNSS Signal-based Navigation System for Moon Missions. In Proceedings of the 34th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2021) (pp. 1039-1052).

[46] McGuire, K., De Croon, G., De Wagter, C., Tuyls, K., and Kappen, H. (2017). Efficient optical flow and stereo vision for velocity estimation and obstacle avoidance on an autonomous pocket drone. IEEE Robotics and Automation Letters, 2(2), 1070-1076.

[47] Mellinger, D., and Kumar, V. (2011, May). Minimum snap trajectory generation and control for quadrotors. In 2011 IEEE International Conference on Robotics and Automation (pp. 2520-2525). IEEE.

[48] Merriaux, P., Dupuis, Y., Boutteau, R., Vasseur, P., and Savatier, X. (2017). A study of Vicon system positioning performance. Sensors, 17(7), 1591.

[49] Miranda, V. R., Rezende, A., Rocha, T. L., Azpúrua, H., Pimenta, L. C., and Freitas, G. M. (2022). Autonomous navigation system for a delivery drone. Journal of Control, Automation and Electrical Systems, 33(1), 141-155.

[50] Moon, H., Martinez-Carranza, J., Cieslewski, T., Faessler, M., Falanga, D., Simovic, A., ... and Kim, S. J. (2019). Challenges and implemented technologies used in autonomous drone racing. Intelligent Service Robotics, 12(2), 137-148.

[51] Minoda, K., Schilling, F., Wüest, V., Floreano, D., and Yairi, T. (2021). VIODE: A simulated dataset to address the challenges of visual-inertial odometry in dynamic environments. IEEE Robotics and Automation Letters, 6(2), 1343-1350.

[52] Nezhadshahbodaghi, M., Mosavi, M. R., and Hajialinajar, M. T. (2021). Fusing denoised stereo visual odometry, INS and GPS measurements for autonomous navigation in a tightly coupled approach. GPS Solutions, 25(2), 1-18.

[53] Petritoli, E., Leccese, F., and Spagnolo, G. S. (2020, June). Inertial Navigation Systems (INS) for Drones: Position Errors Model. In 2020 IEEE 7th International Workshop on Metrology for AeroSpace (MetroAeroSpace) (pp. 500-504). IEEE.

[54] Patoliya, J., Mewada, H., Hassaballah, M., Khan, M. A., and Kadry, S. (2022). A robust autonomous navigation and mapping system based on GPS and LiDAR data for unconstraint environment. Earth Science Informatics, 1-13.

[55] Pfeiffer, C., Wengeler, S., Loquercio, A., and Scaramuzza, D. (2022). Visual attention prediction improves performance of autonomous drone racing agents. PLoS One, 17(3), e0264471.

[56] Pham, H. X., Ugurlu, H. I., Le Fevre, J., Bardakci, D., and Kayacan, E. (2022). Deep learning for vision-based navigation in autonomous drone racing. In Deep Learning for Robot Perception and Cognition (pp. 371-406). Academic Press.

[57] Reyes-Munoz, J. A., and Flores-Abad, A. (2021, August). A MAV Platform for Indoors and Outdoors Autonomous Navigation in GPS-denied Environments. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE) (pp. 1708-1713). IEEE.

[58] Rojas-Perez, L. O., and Martínez-Carranza, J. (2021). On-board processing for autonomous drone racing: an overview. Integration, 80, 46-59.

[59] Sani, M. F., and Karimian, G. (2017, November). Automatic navigation and landing of an indoor AR.Drone quadrotor using ArUco marker and inertial sensors. In 2017 International Conference on Computer and Drone Applications (IConDA) (pp. 102-107). IEEE.

[60] Shafiee, M., Zhou, Z., Mei, L., Dinmohammadi, F., Karama, J., and Flynn, D. (2021). Unmanned aerial drones for inspection of offshore wind turbines: A mission-critical failure analysis. Robotics, 10(1), 26.

[61] Song, Y., Steinweg, M., Kaufmann, E., and Scaramuzza, D. (2021, January). Autonomous drone racing with deep reinforcement learning. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1205-1212). IEEE.

[62] Song, Y., Steinweg, M., Kaufmann, E., and Scaramuzza, D. (2021, January). Autonomous drone racing with deep reinforcement learning. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1205-1212). IEEE.

[63] Šoštarić, D., and Mester, G. (2020). Drone localization using ultrasonic TDOA and RSS signal: Integration of the inverse method of a particle filter. FME Transactions, 48(1), 21-30.

[64] Spedicato, S., Notarstefano, G., Bülthoff, H. H., and Franchi, A. (2016). Aggressive maneuver regulation of a quadrotor UAV. In Robotics Research (pp. 95-112). Springer, Cham.

[65] Spedicato, S., and Notarstefano, G. (2017). Minimum-time trajectory generation for quadrotors in constrained environments. IEEE Transactions on Control Systems Technology, 26(4), 1335-1344.

[66] Srigrarom, S., Chew, K. H., Da Lee, D. M., and Ratsamee, P. (2020, September). Drone versus bird flights: Classification by trajectories characterization. In 2020 59th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE) (pp. 343-348). IEEE.

[67] Stepanyan, V., Krishnakumar, K. S., and Bencomo, A. (2016). Identification and reconfigurable control of impaired multi-rotor drones. In AIAA Guidance, Navigation, and Control Conference (p. 1384).

[68] Sun, K., Mohta, K., Pfrommer, B., Watterson, M., Liu, S., Mulgaonkar, Y., ... and Kumar, V. (2018). Robust stereo visual inertial odometry for fast autonomous flight. IEEE Robotics and Automation Letters, 3(2), 965-972.

[69] Vanhie-Van Gerwen, J., Geebelen, K., Wan, J., Joseph, W., Hoebeke, J., and De Poorter, E. (2021). Indoor Drone Positioning: Accuracy and Cost Trade-Off for Sensor Fusion. IEEE Transactions on Vehicular Technology, 71(1), 961-974.

[70] Wang, C., Wang, Y., Xu, M., and Crandall, D. J. (2022). Stepwise goal-driven networks for trajectory prediction. IEEE Robotics and Automation Letters, 7(2), 2716-2723.

[71] Yao, Y., Atkins, E., Johnson-Roberson, M., Vasudevan, R., and Du, X. (2021). BiTraP: Bi-directional pedestrian trajectory prediction with multi-modal goal estimation. IEEE Robotics and Automation Letters, 6(2), 1463-1470.

[72] Yayla, G., Van Baelen, S., Peeters, G., Afzal, M. R., Singh, Y., and Slaets, P. (2021). Accuracy benchmark of Galileo and EGNOS for Inland Waterways. In Proceedings of the International Ship Control Systems Symposium (iSCSS) (pp. 1-10). Zenodo.

[73] Yue, Z. (2018). Dynamic Network Reconstruction in Systems Biology: Methods and Algorithms (Doctoral dissertation, University of Luxembourg, Luxembourg).

[74] Vicon Company (2019, December). Tracker system - User guide. Software. From URL: https://www.vicon.com/cms/wp-content/uploads/2019/08/tracker-3-11042017-55587.pdf

[75] Vicon Company (2020, June). Cameras - Technical specifications. Hardware. From URL: https://www.vicon.com/hardware/cameras/

[76] Castiblanco, J. M., García-Nieto, S., Ignatyev, D., and Blasco, X. (2022, May 27). From URL: http://figshare.com/s/24642072abc29b8f1535

[77] Castiblanco Quintero, J. M., Garcia-Nieto, S., Ignatyev, D., and Blasco, X. (2022). The TRAM-FPV RACING Open Database. Sequences complete indoor flight sequences for the study of racing drones. In XLIII Jornadas de Automática (pp. 341-352). Universidade da Coruña. Servizo de Publicacións.
2022-09-09
Castiblanco, J. M.
University of Rioja
Castiblanco JM, Garcia-Nieto S, Ignatyev D, Blasco X. (2022) The TRAM-FPV RACING Open
Database. Sequences complete indoor flight tests for the study of racing drones. In: XLIII
Jornadas de Automática 2022, 7-9 September 2022, Logroño, La Rioja, Spain
https://dspace.lib.cranfield.ac.uk/handle/1826/18343