
Department of Electrical Engineering

Continuous Autonomous UAV Inspection for FPSO vessels


Candidate number: 1
Alexey G Andreev
Master's thesis in Aerospace Control Engineering, STE-3900-1 20H, May 2021
Summary
This Master's thesis presents a preliminary design study and proposes an
unmanned aerial vehicle (UAV) based inspection framework, comprising
several multirotors with automatic charging and deployment for 24/7
integrity inspection tasks. The project has three main topics. The first
describes the operational environment and the existing regulations that
cover the use of UAVs; this forms the basis for the second topic, the
proposal of relevant use-case scenarios. The third part comprises two
chapters, where the concept and framework design is based on the previous
factors. It shows that before a fully autonomous inspection system can be
implemented, both regulatory and technical gaps need to be covered. This is
explained by the fact that no autonomous inspection system exists today.
Thus, this project can be seen as a base for future development of a
UAV-based inspection system, as it focuses on the creation of a general
framework.

Preface
This thesis is submitted in partial fulfillment of the requirements for the
Master's degree in Aerospace Control Engineering at UiT – The Arctic
University of Norway. The project is based on an assignment proposed by
Equinor ASA during the spring of 2021.
I would like to thank my main supervisor, Prof. Raymond Kristiansen, for his
valuable advice and regular support. I would also like to thank Marius Paulsen
Haugen and Roy Ivar Nielsen at Equinor ASA dep. Harstad for their support
and help when needed. All of you provided a positive working environment,
and the interest you have shown in my work helped me to keep my inspiration
throughout the project.
Special thanks go to my mother, grandmother and brothers. I would
never be where I am today without your support.

Report outline
This thesis is divided into seven chapters. After the introductory chapter,
where we discuss existing solutions related to autonomous inspection of
vessels and opportunities related to the use of robotic arms for inspection
and maintenance tasks, Chapter 2 gives an understanding of the environment
where all operations will be performed. It also contains basic requirements
for drones and the supply infrastructure, and a discussion of the challenges
of flying in explosive atmospheres. Based on this information, Chapter 3
proposes possible use-case scenarios which may be relevant for the Johan
Castberg vessel. Chapter 4 covers concept prototyping, the system
architecture and a discussion of how to achieve a fully autonomous system.
Chapter 5 complements this with information about fleet configuration and
what kind of infrastructure is needed in order to get a workable system.
Chapter 6 discusses the existing regulatory and technological gaps, how they
affect the use-case scenarios, and what is needed to cover them. Finally,
Chapter 7 presents the conclusions and prospects for future work.

Contents
1 Introduction 2
1.1 Background and motivation . . . . . . . . . . . . . . . . . . . 2
1.2 Project limitations . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 Literature review . . . . . . . . . . . . . . . . . . . . . . . . . 3

2 Operating Environment and Basic Requirements 7


2.1 Johan Castberg oil field . . . . . . . . . . . . . . . . . . 7
2.2 FPSO vessels . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.3 Barents Sea – Weather and Key Phenomena . . . . . . . . . . 8
2.3.1 Wind, air temperature and waves . . . . . . . . . . . . 8
2.3.2 Key Phenomena . . . . . . . . . . . . . . . . . . . . . 9
2.4 Regulations and Requirements . . . . . . . . . . . . . . . . . 11
2.4.1 General Directives and Regulations related to use of UAV . . 11
2.4.2 Technical requirements . . . . . . . . . . . . . . . . . . 13
2.4.3 Operating in ATmospheres EXplosibles . . . . . . . . . 15

3 Use Case Scenarios 20


3.1 Use-case 1: Structure and mechanics inspection . . . . . . . . 22
3.2 Use-case 2: Environment monitoring . . . . . . . . . . . . . . 25
3.3 Use-case 3: Safety . . . . . . . . . . . . . . . . . . . . . . . . 27
3.4 Use-case 4: Maintenance. Use of Aerial Manipulators . . . . . 29

4 Concept 32
4.1 Choice of suitable drones . . . . . . . . . . . . . . . . . . . . . 32
4.2 Inspection techniques . . . . . . . . . . . . . . . . . . . . . . . 36
4.3 Frames of reference (coordinates) . . . . . . . . . . . . . . . . 38
4.4 Concept Definition . . . . . . . . . . . . . . . . . . . . . . 40
4.5 Preferred System Architecture . . . . . . . . . . . . . . . . . . 41
4.6 Concept Exploration . . . . . . . . . . . . . . . . . . . . . . . 46
4.7 Autonomy levels . . . . . . . . . . . . . . . . . . . . . . . . . 48
4.8 Landing pad design . . . . . . . . . . . . . . . . . . . . . . . . 50

5 Framework 55
5.1 Fleet configurations . . . . . . . . . . . . . . . . . . . . . . 55
5.2 Flight logistics . . . . . . . . . . . . . . . . . . . . . . . . . . 57
5.2.1 Automatic scheduling . . . . . . . . . . . . . . . . . . 59
5.2.2 Path planning . . . . . . . . . . . . . . . . . . . . . . . 61

5.2.3 Collision avoidance . . . . . . . . . . . . . . . . . . . . 63
5.2.4 Positioning . . . . . . . . . . . . . . . . . . . . . . . . 65
5.3 UAS - subsystems and supply infrastructure . . . . . . . . . . 69

6 Discussion 71
6.1 Technological and Regulatory gaps . . . . . . . . . . . . . . . 71
6.2 Use-case scenarios . . . . . . . . . . . . . . . . . . . . . . . . 72
6.3 Implementation sequence . . . . . . . . . . . . . . . . . . . . . 72

7 Conclusion and Future work 74


7.1 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
7.2 Future work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
7.2.1 Practical aspects . . . . . . . . . . . . . . . . . . . . . 75
7.2.2 Theoretical aspects . . . . . . . . . . . . . . . . . . . . 75

A Regulation on aircraft without pilot onboard, selected paragraphs (original text in Norwegian) 88

List of Figures
1 Johan Castberg oilfield on map [1] . . . . . . . . . . . . . . 7
2 FPSO vessel and subsea system [1] . . . . . . . . . . . . . . 8
3 Johan Castberg FPSO . . . . . . . . . . . . . . . . . . . . . 9
4 Visualization of different twilight [2] . . . . . . . . . . . . 14
5 Visual example for ATEX zone classification [3] . . . . . . . . 16
6 ATEX zone on Johan Castberg . . . . . . . . . . . . . . . . . 17
7 Most common degradation mechanisms: (a) wear in paint (b)
welding defects (c) pitting corrosion (d) buckling . . . . . . . 21
8 Hull Structure [4] . . . . . . . . . . . . . . . . . . . . . . 23
9 Collision tolerant Flyability Elios drone [5] . . . . . . . . . 26
10 Sea spray icing on ships [6] . . . . . . . . . . . . . . . . . 30
11 Classification of UAV based on aerodynamics and weight . . . . 32
12 (a) Tiltrotor [7] and (b) hybrid fixed-wing UAVs [8] . . . . . 34
13 Helicopter swashplate setup . . . . . . . . . . . . . . . . . . 35
14 Simplified design of couplant supply system . . . . . . . . . . 38
15 Frames of reference (objects are not to the same scale) . . . . 39
16 Objectives tree . . . . . . . . . . . . . . . . . . . . . . . . . . 40
17 System setup [9] . . . . . . . . . . . . . . . . . . . . . . . . . 42
18 Setup of Mission Repository . . . . . . . . . . . . . . . . . . . 43
19 Vessel structures that are of interest for inspection . . . . . . 43

20 Mission Calculation Engine [9] . . . . . . . . . . . . . . . . 45
21 Overall flowchart . . . . . . . . . . . . . . . . . . . . . . . 47
22 (a) Inflatable rubber boot [10] and (b) schematic layout of the
heating zones [11] . . . . . . . . . . . . . . . . . . . . . . . 53
23 Example of multirotor's landing pad (LP) . . . . . . . . . . . 54
24 Bow-Starboard-Port-Stern zoning . . . . . . . . . . . . . . . . 57
25 Example of drone configuration . . . . . . . . . . . . . . . . 57
26 Simple duty cycle for one drone . . . . . . . . . . . . . . . . . 59
27 Simulation-based scheduling system framework [12] . . . . . . 60
28 Division of randomly generated GA chromosome [12] . . . . . 61
29 Basic inspection patterns: (a) strip method (b) Archimedes
spiral (c) spiral . . . . . . . . . . . . . . . . . . . . . . . . . . 62
30 Simple waypoint grid [13] . . . . . . . . . . . . . . . . . . . . 63
31 Example of obstacle gradation . . . . . . . . . . . . . . . . . . 64
32 Structure of Collision avoidance system [14] . . . . . . . . . . 64
33 Proposed set up of outdoor navigation system . . . . . . . . . 66
34 Helideck at different lighting conditions offshore: (a) night
[15] (b) daylight [16] . . . . . . . . . . . . . . . . . . . . . 66
35 Visualization of TDOA method (2D space) . . . . . . . . . . . 68
36 Example of QR code (a) and ArUco (b) . . . . . . . . . . . . . 69
37 Supply infrastructure – communication architecture . . . . . . 69

List of Tables
1 Short overview of the Open category . . . . . . . . . . . . . . 12
2 Cx-marking of drones . . . . . . . . . . . . . . . . . . . . . . 13
3 Classification of the ATEX zones . . . . . . . . . . . . . . . 16
4 Exterior inspection: structure components and expected
weaknesses . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5 Task priorities . . . . . . . . . . . . . . . . . . . . . . . . 28
6 Autonomy levels gradient [17] . . . . . . . . . . . . . . . . . 49
7 Autonomy implementation gradient . . . . . . . . . . . . . . . 51
8 Landing pad specifications . . . . . . . . . . . . . . . . . . 54
9 Proposed regularity of tasks and drones that could be used . . 56
10 Proposed fleet configurations . . . . . . . . . . . . . . . . . 58

Abbreviations
AGL (height) Above ground level
BLOS Beyond visual line of sight
CG Center of gravity
DOF Degree of Freedom
EASA European Union Aviation Safety Agency
ERT Emergency response team
EVLOS Extended visual line of sight
FOV Field of view
GNSS Global navigation satellite systems
HFIS Helicopter flight information service
HLO Helicopter landing officer
IACS International Association of Classification Societies
ID Identification
IR Infrared
LPS Local positioning system
MTOM Maximum takeoff mass
ND-IR Non-dispersive infrared
NDT Nondestructive testing
NED North East Down reference frame
NOTAM Notice to airmen
PAV Pico air vehicle
PDA Personal digital assistant (also known as handheld PC)
RMZ Radio mandatory zone
RPAS Remotely Piloted Aircraft System
RVI Remote Visual Inspection
SERA Standardised European Rules of the Air
SWIR Short-wave infrared light
UAS Unmanned aerial systems
UAV Unmanned aerial vehicle
UUV Unmanned underwater vehicle
VLOS Visual line of sight

Acronyms
cat. category
w/ with

1 Introduction
In the first quarter of the 21st century, when the use of diverse types of
eco-friendly energy sources, such as solar or wind energy, no longer
surprises anyone, oil and gas production is still relevant and plays a
significant role.
At the same time, Unmanned Aerial Vehicles (UAVs or drones) are also being
used more in everyday life. They perform many different types of tasks and
vary in complexity of design. While widely used onshore, they are not as
present in maritime operations. Even today there still exist both technical
and regulatory gaps in activities related to autonomous inspection of ships.
Oil extraction in arctic sea regions is quite challenging even in our modern
days. Workers and machines often operate in extreme conditions. To reduce
risks and improve efficiency, new drone and robot technologies come to help.
The energy industry is focusing on increased use of drones and robotic
technologies in different scenarios. The goal is to increase safety for the
crew and to increase production efficiency on the continental shelf.

1.1 Background and motivation


This project is given in cooperation between the University of Tromsø (UiT)
and Equinor ASA, and is based on the Preliminary Literature Review project
done in December 2020 [18].
The use of FPSO vessels (Floating Production, Storage and Offloading) in
oil and gas production is becoming increasingly popular, enabling offshore
handling of all parts of the petroleum extraction process. These types of
vessels are located close to the oil field for extensive time periods and
must maintain operation in harsh weather conditions. Thus, the need for
continuous inspection and maintenance tools is pertinent, and unmanned
aerial vehicles (UAVs) can offer a robust and reliable solution. The general
objective of this project is therefore to perform a preliminary design study
of an autonomous UAV inspection framework comprising several multirotors,
allowing continuous operation without human intervention, for performing
specific inspection tasks on an FPSO vessel.

Subtasks
• Perform a literature review on autonomous drone inspection in general and
for FPSO vessels in particular.

• Suggest specific inspection tasks that are suitable for multirotors, and
develop some use case scenarios and usability studies. Particular use cases
to study are FPSO tank inspection, and the possibility of using drones with
gripper arms.

• Based on some of the use cases, suggest a framework comprising several
multirotors for continuous inspection, including automatic docking and
charging. Take special considerations for robustness requirements for the
UAVs, as well as requirements directed by the operating environment (EX
requirements), and the necessity for supporting infrastructure (positioning,
communication).

• Based on the Johan Castberg FPSO, establish a scheme for automatic
scheduling, flight logistics and path planning to ensure continuous
operation and coverage.

1.2 Project limitations


The idea of using unmanned aerial vehicles (UAVs) for autonomous inspection
of vessels was proposed only a few years ago. Such a new field of study
creates the first barrier: the existence of regulatory and technological
gaps, which need to be covered prior to implementation of a system with the
desired level of autonomy. This resulted in some "what if" assumptions and
general discussions based on the available information.
An autonomous UAV-based inspection system for offshore operations is a
complex structure that combines many aspects, ranging from legislation,
maritime operations, meteorology and aircraft control to algorithm design.
This led to more time being spent acquiring the specific knowledge base than
was initially planned.
To perform inspections or other tasks on vessels, we can use not only UAVs
but also crawling robots or underwater vehicles to inspect areas that are
not reachable by UAVs. They could expand the capabilities of the UAVs, but
including them in the concept design would require much more additional
time. Due to strict time constraints, it was decided not to include them in
this project but to leave them for a potential future study.

1.3 Literature review


There already exist many viable solutions for manual drone inspections in
multiple civil scenarios [9][19]. They can vary from soil pollution and
vegetation monitoring to determination of volatile chemical concentrations
and gas leak detection at chemical plants. Much research has been done, so
that even guidelines for optimal flight paths have been derived, as well as
the ability of different sensors to perform in different light conditions
[19][20][21].
When it comes to maritime and oil rig inspection, things are not so bright.
According to the European Research Project ROBINS (Robotics Technology for
Inspection of Ships) [22], done as a part of the European Union's Research
and Innovation programme "Horizon 2020" [23], there still exist both
technological and regulatory gaps when it comes to the adoption of Robotics
and Autonomous Systems (RAS) in maritime inspections. As one such gap we can
mention navigation inside cargo tanks. According to ROBINS, automated
navigation with correct motion estimation is not yet solved for the case of
ship inspection [24]. Most of the projects are related to the inspection of
cargo holds and tanks (on bulk and oil tankers) only; sometimes they also
include inner compartments of the ships [25][26]. There also exist some
projects on the outer inspection of oil rigs [27][28]. In all these
projects, drones were manually controlled by experienced pilots [29]. In all
cases, they pointed out weather conditions, namely wind, as the most
challenging impact on drone operation.
Most of the similar projects started in the last 2-3 years. The biggest
existing project that has been found is the previously mentioned ROBINS
project, which started in 2020. Hence, we can see that this line of research
is relatively young. Therefore, we will base this work on existing solutions
that are used for onshore inspections.

Robotic Arms
One of the ways to improve the drones' performance for inspections and to
perform maintenance tasks is to equip them with gripper arms (also called
"aerial manipulators" when installed on UAVs). They will not only improve
performance but will also allow sensing in hard-to-reach or dangerous zones.
Aerial manipulators can perform a variety of tasks. These can be simple
"grasp-and-transport" [30], cable-suspended load lifting [31], remote
opening of valves [32], or more advanced and complicated tasks such as
structure maintenance using several manipulators installed on one drone,
which is being developed under the AEROARMS project [33].
More often such manipulators are used to perform nondestructive testing of
different constructions, such as bridge beams [34] or ultrasonic thickness
measurement at an oil refinery [35]. In marine inspections we usually do not
need to use all available inspection techniques (sensors) simultaneously. So
having a few detachable end effectors can save on the drone's weight and
thus give a longer flight time. It will also make sensor service or
replacement easier.
We can divide robotic arms into several groups, based on several factors.
The most common one is by their working principle [36]:

• Vacuum
• Pneumatic
• Hydraulic
• Servo-electric

Another way is by the end-effector type [37]:

• Gripper – further divided into subgroups:
  - Multi-finger adaptive
  - Parallel motion jaw
  - Claw
• Process tools
• Sensors

Because of its low weight and its use of neither liquids/oils nor
compressors, the most suitable type is probably the servo-electric one.
The challenge of aerial manipulator usage lies in the complexity of the
kinematics and control [38], due to the coupling between the manipulator's
and the drone's dynamics. There are three effects that complicate the
behavior of a drone with an attached manipulator [39]:

• Displacement of the mass center from the drone's vertical axis
• Variation of the mass distribution during arm manipulation
• Additional dynamic forces and torques that occur during arm manipulation

This problem can be simplified by using manipulators with fewer DOF (2, 3 or
4). This decreases the mobility of the arm and requires compensation in the
form of horizontal/vertical movement of the drone, and the manipulator will
still be dependent on the attitude of the UAV. Such a simplification can be
used during NDT (Non-Destructive Testing) inside oil/ballast tanks, but for
outer surveys we will need a more freely movable arm with at least 6 DOF.
Because the vessel is in constant movement about its axes and due to varying
weather conditions, the drone must be able to actively compensate for it.
Another solution is to use drones that have electromagnets to stick to the
surface, have tilting rotors, or have one additional rotor that presses the
drone against the surface to hold it in place.
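
To make the first of these effects concrete, the following minimal sketch
computes the combined centre of mass of an airframe and an extended arm, and
the gravity torque the flight controller would have to reject. It is an
illustration only, not taken from the referenced works; the function name,
masses and positions are assumptions.

```python
import numpy as np

def com_offset_torque(m_body, m_arm, r_body, r_arm, g=9.81):
    """Combined centre of mass of drone body + arm payload, and the
    gravity torque it produces about the body origin.
    r_body, r_arm: centre-of-mass positions [m] in the body frame (z down).
    All names and numbers are illustrative."""
    m_total = m_body + m_arm
    r_com = (m_body * np.asarray(r_body, float)
             + m_arm * np.asarray(r_arm, float)) / m_total
    f_gravity = np.array([0.0, 0.0, m_total * g])   # body frame, z pointing down
    torque = np.cross(r_com, f_gravity)             # disturbance the attitude controller must reject
    return r_com, torque

# Example: a 0.5 kg arm extended 0.4 m forward of a 3 kg airframe
r_com, tau = com_offset_torque(3.0, 0.5, [0, 0, 0], [0.4, 0.0, 0.05])
print(r_com, tau)   # the pitch torque grows as the arm reaches further out
```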

2 Operating Environment and Basic Requirements

2.1 Johan Castberg oil field


The focus for this project is the Johan Castberg oil field. It is located in
the southwestern part of the Barents Sea, 240 km north of Hammerfest in
Norway, as can be seen in Figure 1. The Johan Castberg field includes three
oil reservoirs: Skrugard, Havis and Drivis. According to Equinor's plans
[40], the extraction of crude oil and gas will start in 2023 and last for 30
years. The expected volume of extracted resources is 450–650 million barrels
(approximately 61–90 million tonnes). To increase workers' safety and
effectiveness on the Norwegian continental shelf, there is a focus on the
use of drones and robotic technologies.

Figure 1: Johan Castberg oilfield on map [1]

2.2 FPSO vessels


The main principle of oil and gas extraction from the oilfield is based on
the use of an FPSO vessel ("Floating Production, Storage and Offloading") in
cooperation with subsea solutions. These types of vessels, typically based
on converted oil tankers, are equipped with hydrocarbon processing units
that process and separate the extracted crude oil into refined oil, gas, and
water. The extracted and processed oil is then stored on board in cargo
tanks and later transferred to land via shuttle tankers. Gas can be
transported further to land via pipelines or used for on-board power
generation.
The FPSO vessel is meant to be moored and connected to the subsea production
systems by flexible flowlines. An overall overview of the standard FPSO
solution can be seen in Figure 2.

Figure 2: FPSO vessel and subsea system [1]

The Johan Castberg vessel (Figure 3) is meant to be the "digital flagship"
of Equinor's fleet and will be used as a base for testing innovative
technologies, including unmanned aerial vehicles. It has the following basic
characteristics:

• Total length – 300 m
• Width – 50 m
• Height above the waterline – 30 m

2.3 Barents Sea – Weather and Key Phenomena


2.3.1 Wind, air temperature and waves
Our area of interest is the southwestern zone of the Barents Sea. This zone
has specific climate conditions, since it is affected by warm southern flows
from the Atlantic and cold streams of Arctic air masses from the north. Such
a combination can lead to high variability in weather conditions.

Figure 3: Johan Castberg FPSO

According to statistics from the Russian center for World Ocean monitoring
(ESIMO), this region is strongly influenced by cyclonic circulations and the
warm Norwegian coastal current [41]. This leads to small daily and
interannual variability of the air temperature and a stable wind direction,
but at the same time we should expect frequent precipitation and
considerable cloudiness.
The mean wind speed is relatively low – usually up to 10 m/s [42]. During
winter the wind speed increases severely, so we can expect a maximum speed
of up to 30 m/s (maximum 1-hour mean).
Since the southwestern region is affected by the cyclones and warm ocean
streams, the air temperature is quite high (relative to the high North
regions), usually around −4 °C during winter and +9 °C in summer.
The mean wave height is approximately 2.5 m. The maximum significant wave
height can reach up to 15 m during winter and 7 m during the summer.

2.3.2 Key Phenomena


While clouds are more relevant for high-altitude operations, we can suffer
from advection fog [43]. This type of fog occurs when warm air masses pass
over colder water surfaces. The depth of the advection fog depends on the
moisture content in the air, the wind, and the temperature difference
between air and water. Since it can last for a long period under specific
conditions, it can have a strong impact on the performance of optical
sensors on the drone.
Another phenomenon that we can meet in the Barents Sea is the so-called
Polar Low, also known as the arctic hurricane, polar mesoscale vortex, or
cold air depression. Unlike the advection fog, these hurricanes occur during
wintertime, when cold air masses pass over a warm water surface. A strong
upward airflow is created, which leads to reduced pressure in local areas.
This effect becomes worse when the upper atmosphere is also cold. Due to
their small scale (the diameter does not exceed 1000 km, usually 200-600 km)
and short lifetime (up to a few days), they are difficult to forecast. For
drone operations they pose a danger in the form of rapidly increasing wind
(minimum 15 m/s with gusts up to 55 m/s), snow and/or hail showers, large
wave growth and visibility reduced to less than 100 m [44].
Another weather phenomenon that we can meet during offshore operations is
icing. It comes in several types: atmospheric icing and ice accretion by sea
spray [45]. Besides vessel instability, icing can cause other risks, such
as:

• slippery decks and helicopter landing pad
• degradation or loss of communication due to ice on antennas
• stronger lateral wind due to the increased size of structure components
• reduced visibility
• construction elements being blocked from inspection/sensors due to ice
build-up
• falling ice
Atmospheric icing that can occur during offshore work can be divided into
two subgroups: wet snow icing and freezing rain. Wet snow icing occurs when
snow falls at air temperatures between 0 and 3 °C, while freezing rain
occurs when rain water or drizzle drops fall onto surfaces with a
temperature below 0 °C.
Sea spray icing is similar to freezing rain, but it depends on wind speed,
the temperature of water and air, and the wave height. It can occur on
vessels and structures under the following conditions (a simple combined
check is sketched after this list):

• high wind speed – usually above 9 m/s, sometimes lower
• low air temperature – under −1.7 °C
• low water temperature – under 7 °C
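
As a rough illustration of how these thresholds could be turned into an
automatic pre-flight check, the sketch below flags sea spray icing risk from
the three conditions listed above. The function name and the idea of
combining the thresholds this way are assumptions for illustration, not part
of the cited sources.

```python
def sea_spray_icing_risk(wind_speed_ms: float,
                         air_temp_c: float,
                         water_temp_c: float) -> bool:
    """Return True when all three sea spray icing conditions from the text
    are met (wind > 9 m/s, air < -1.7 degC, water < 7 degC).
    Illustrative only; real icing models use continuous indices."""
    return (wind_speed_ms > 9.0
            and air_temp_c < -1.7
            and water_temp_c < 7.0)

# Example: a typical winter day in the southwestern Barents Sea
print(sea_spray_icing_risk(wind_speed_ms=14.0, air_temp_c=-4.0, water_temp_c=4.0))  # True
```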

Since the oil extraction is performed in the High North regions, we should
expect a decrease in optical sensor performance due to the polar night. Use
of artificial lighting during that period can also affect the quality of the
sensing results. But while the polar night plays a negative role for optical
instruments, thermal imagery is more reliable [46], since daylight and solar
radiation affect the emissivity of different materials and can give false
positive results [19].

2.4 Regulations and Requirements


2.4.1 General Directives and Regulations related to use of UAV
According to the research results of the members of the ROBINS project,
there are no regulations directly covering remote inspection techniques for
marine inspections. Nor were any such regulations found among Norwegian laws
and directives, nor any regulations related to autonomous UAS (Unmanned
Aircraft Systems). The only mention of the use of UAVs for ship inspections
was found in the Requirements Concerning Survey and Certification, produced
by IACS (International Association of Classification Societies) [47], and
only as an additional tool for visual inspection of hard-to-reach areas.
Otherwise, there are only general rules related to the use of UAVs/drones.
Since such a regulatory gap exists, we will base this research on existing
regulations, allowing some assumptions in the derivation of statements and
decisions.
From 01.01.2021, new regulations for UAS pilot certification in Norway apply
[48]. Drone operators that were licensed according to the older regulations
(RPAS or so-called "RO x" categories) before 01.01.2021 can continue UAS
operations until 31.12.2021. New operators will need to go through the
licensing process according to the new regulations.
In this work, the older classification will be used, because it is still
applicable and the new rules related to specific use purposes are still not
fully defined (as of 05.02.2021, e.g. the "Certified" category). Also, the
new regulations are mainly related to hobby/recreation drones (the so-called
"Open" category), which have strong restrictions on the weight/distance-
from-people relation (see Table 1 [49] and Table 2 [50]). Furthermore, the
remotely piloted aircraft system (RPAS) categories are still used in
Equinor's documentation related to the use of drones.
Use of drones is regulated by the Norwegian Civil Aviation Act
(Luftfartsloven): "Regulation on aircraft without pilot onboard" (Norwegian:
"Forskrift om luftfartøy som ikke har fører om bord mv.") [51].

Table 1: Short overview of the Open category

Cat.  Name                 Description

A1    "Over people"        - Can fly C0 and C1 class drones
                           - C1 UAVs and drones with maximum takeoff mass
                             (MTOM) >250 g cannot be flown over other people
A2    "Close to people"    - C2 class drones
                           - C2 drones must maintain 30 m horizontal distance
                             from other people
                           - Non-CE drones with weight up to 2 kg must
                             maintain 50 m distance
                           - "1:1" rule applies: drones must maintain the same
                             horizontal distance from people as their height
                             above ground level (AGL)
                           - The pilot has to pass a theoretical exam
A3    "Away from people"   - C2, C3, C4 drones
                           - >150 m distance from residential, commercial,
                             industrial or recreational areas
                           - There should be no people present other than
                             those involved in the drone flying
                           - If other persons are present: follow the 1:1 rule
                             with min 30 m distance
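
As a small, non-authoritative illustration of the "1:1" rule from Table 1,
the check below compares the horizontal distance to uninvolved people with
the current height AGL and a minimum distance. The function name and the
30 m default are assumptions; the regulation text is binding.

```python
def a2_a3_distance_ok(horizontal_dist_m: float,
                      height_agl_m: float,
                      min_dist_m: float = 30.0) -> bool:
    """"1:1" rule check: keep at least as much horizontal distance from
    uninvolved people as the current height AGL, never less than the
    minimum distance. Illustrative only."""
    return horizontal_dist_m >= max(height_agl_m, min_dist_m)

print(a2_a3_distance_ok(horizontal_dist_m=40.0, height_agl_m=35.0))  # True
print(a2_a3_distance_ok(horizontal_dist_m=25.0, height_agl_m=10.0))  # False: below the 30 m minimum
```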

These regulations also apply to model UAVs and drones flying in airspace
over the Norwegian continental shelf and the Norwegian economic zone. There
exist three drone operator categories: RO 1, RO 2 and RO 3. They define
requirements for organizations and set limits for the UAVs that can be
operated. According to this regulation, a company which will operate drones
must have a team that consists of:

• Responsibility manager – responsible for the UAS division itself
• Operation manager – responsible for all flight operations happening
according to existing regulations and laws
• Technical manager – ensures that all drones are in the required technical
condition
Table 2: Cx-marking of drones

Cx-class    MTOM [kg]
C0          <0.25
C1          <0.9
C2          <4
C3 and C4   <25

• RO 1 – aircraft with MTOM <2.5 kg and max speed 60 knots (30 m/s or
111 km/h), operated within VLOS and the safety distances defined by § 51
• RO 2 – aircraft with MTOM <25 kg and max speed 80 knots (41 m/s or
148 km/h). Operates within VLOS or EVLOS and within the safety distances
defined by § 51, or BLOS in accordance with § 56 – § 59
• RO 3 – aircraft with MTOM ≥25 kg or max speed ≥80 knots, or operating BLOS
with altitude over 120 m, or operating in controlled airspace with altitude
over 120 m, or operating near crowds larger than described in § 51

(The paragraphs referred to above can be found in Appendix A.)
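
A rough illustration, not a legal tool: the sketch below maps a drone's MTOM
and maximum speed to the lowest RO category listed above, ignoring the
BLOS, altitude and crowd conditions that can force RO 3 regardless of
weight. All names are illustrative.

```python
def ro_category(mtom_kg: float, max_speed_knots: float) -> str:
    """Lowest RO operator category by MTOM and speed alone (illustrative).
    BLOS flight, altitude above 120 m, controlled airspace and crowds are
    ignored here, although they can push an operation into RO 3."""
    if mtom_kg < 2.5 and max_speed_knots <= 60:
        return "RO 1"
    if mtom_kg < 25 and max_speed_knots <= 80:
        return "RO 2"
    return "RO 3"

print(ro_category(mtom_kg=12.0, max_speed_knots=70))  # RO 2 - typical inspection multirotor
```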

While the minimum criterion is an RO 1 certificate, the RO 2 category is
required to be able to operate offshore, including on vessels and in
transit.
Besides the official certification, the drone operator (pilot) must be
approved by the Flight Safety department in Equinor, which monitors all
flight activities, including UAS, within the company.

2.4.2 Technical requirements


According to the existing regulations, drones (like planes and helicopters)
must also be constructed with respect to existing aviation standards. In
addition to [51] and OM105.19 [52], there are also the EASA SERA rules [53].
According to them, all drones have to be equipped with a predefined set of
sensors and lights. Most of these setups are used to achieve safe piloting
at nighttime. This becomes more important when operating in the High North
regions, where we need to deal with long-lasting polar nights. By "night",
EASA means the period between the end of the evening civil twilight and the
beginning of the morning civil twilight [54], see Figure 4.

Figure 4: Visualization of different twilight [2]

To be able to fly beyond line of sight (BLOS), the drone must have a white
strobe light with a minimum intensity of 10 candelas, performing a minimum
of 20 flashes per minute. For night flights, navigation (direction) lights
should be installed. They are used to indicate the relative path of the
drone to an observer/operator. The most common way is to use green and red
lights mounted on the right and left axle arms, respectively. Considering
that there is still a requirement to have an external observer/assistant
during flight, all flights are currently performed by manual control during
daytime. That means that there are no actual BLOS or EVLOS (extended visual
line of sight) flights. Yet we are aiming for automation of the inspection
process, so these assistant roles will be replaced by other solutions, for
example video surveillance. A proper illumination setup on the drone is
necessary due to the active helicopter traffic in the area.
To be able to use non-certified drones inside ATEX (ATmospheres EXplosibles,
potentially explosive atmospheres) classified zones, they must carry a
relevant gas sensor. It will not only be used to fulfil the requirements but
will also be part of the inspection equipment for sensing possible gas or
chemical leaks. Another sensor that we need to have onboard is an altimeter.
Its role is to prevent violation of the altitude restrictions. Finally, each
drone must have a unique ID mark/token and be marked with the operator's
name and contact phone number.
In case of remote-control failure, all drones must have an automated landing
system. In our case, drones should have a backup radio connection for
telemetry and telecommand transmission, to ensure that the drone always has
an updated picture of its own and the vessel's position. Most of these
requirements will be naturally fulfilled because they are critical for the
successful execution of the mission.
One additional requirement to remember is that neither the drones nor any
component of the system should pose any risk to the crew or to the technical
equipment/components of the vessel and surroundings. A thorough assessment
should be performed when choosing the different components (such as
materials, motors, batteries, etc.) that will be used.

2.4.3 Operating in ATmospheres EXplosibles


An FPSO vessel carries a high explosion danger. To be able to perform any
inspection or maintenance tasks, the drones and all supply infrastructure
should be designed, manufactured, implemented, and run according to specific
regulations and directives.
The basic directive for equipment (i.e. drones and their supply
infrastructure) and protective systems is the ATEX Directive 2014/34/EU
(also known as ATEX 114), which has replaced the older Directive 94/9/EC
[55]. Besides the concentration of flammable/explosive gas, vapor or mists,
the new directive also includes the concentration of potentially dangerous
dust, and the probability of ignition from mechanical and electric systems.
There is also a directive for improving the safety and health protection of
workers: Directive 1999/92/EC [56]. In Norway, the equivalent regulation,
called "Regulation for equipment and safety systems to be used in hazardous
areas" (Norwegian: "Forskrift om utstyr og sikkerhetssystem til bruk i
eksplosjonsfarlig område") [57], is used.
To get ignition of flammable materials or gasses, three conditions have to
be met: a fuel source, an oxidant, and an ignition source. The fuel source
is simply combustible dust/gas or flammable materials. In the case of an oil
tanker or an FPSO, we cannot remove it, so other solutions are needed. One
of the ways to prevent ignition is to remove the oxidant, where oxygen is
the most common substance. To achieve that, inert gasses can be filled into
the tanks to reduce the amount of oxygen. Finally, the ignition source: UAVs
and their supporting infrastructure usually contain electrical and
mechanical components that can represent ignition sources, in the form of
heating, electric sparks, or electrostatic discharge.
There are defined so-called ATEX zones, derived from the United States'
HAZLOC (hazardous locations) standards [58], based on the gas/vapor

or dust concentration in the air that may cause a so-called "explosive
atmosphere"; see Table 3. It is important to mention that the variation in
h/year is not officially defined, but is rather an attempt to place time
limits on the zones [59]. Areas that have not been assigned to one of the
mentioned classes are classified as safe or non-hazardous. A simple visual
example of such classification for a schematic gas station can be seen in
Figure 5.

Table 3: Classification of the ATEX zones

Probability that gas or dust is present            Zone code for combustible   Zone code for
                                                   gas, vapor and mist         combustible dust
Present permanently or for long
periods (>1000 h/year)                             Zone 0                      Zone 20
Present during normal operations
(>10 h/year and <1000 h/year)                      Zone 1                      Zone 21
May occur (<10 h/year)                             Zone 2                      Zone 22
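
As a small illustration of how Table 3 could be encoded, the sketch below
maps the expected hours per year of explosive atmosphere to the zone codes
above. The function name and the exact boundary handling are assumptions,
since the h/year limits are themselves only indicative.

```python
def atex_zone(hours_per_year: float, dust: bool = False) -> str:
    """Map expected presence of an explosive atmosphere (h/year) to an
    ATEX zone code from Table 3. The h/year thresholds are indicative only,
    as the text notes they are not officially defined."""
    if hours_per_year > 1000:
        zone = 0
    elif hours_per_year > 10:
        zone = 1
    else:
        zone = 2
    return f"Zone {zone + 20 if dust else zone}"

print(atex_zone(500))             # Zone 1 - gas present during normal operations
print(atex_zone(5, dust=True))    # Zone 22 - dust may occur
```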

Figure 5: Visual example for ATEX zone classification [3]

Figure 6: ATEX zone on Johan Castberg

In our situation, drones and their supply systems will operate under
conditions where explosive atmospheres are present during normal operations
or may occur (depending on the place on the vessel, see Figure 6). So, they
should be classified as Equipment group 2 – category 2 and category 3,
respectively [57]. Gear installed in safe zones does not fall under any of
the ATEX classification paragraphs, but should still be certified according
to internal standards. Based on this classification and national laws, we
get the following requirements for the equipment (here: all physical parts
and components of the inspection system, such as drones and installed
external equipment (e.g. sensors and aerial manipulators), UAV landing pads,
navigational radio beacons, etc.) that will be operated or installed in
hazardous zones:

• General Requirements:

  - The equipment shall not pose a danger by itself or emit an explosive
    atmosphere.
  - The equipment shall not pose a possibility of ignition of the explosive
    atmosphere by its electric and non-electric components.
  - A risk assessment shall be performed prior to the equipment's
    construction, production or operation.
  - The possible human factor should be included in the risk assessment.
  - Materials used in the equipment should not react with the environment
    and the constituents of the explosive atmosphere.
  - Components should not pose any potential ignition source such as sparks,
    flames, electric arcs, high surface temperature, optical radiation,
    electromagnetic waves, electrostatic discharge and others.
  - Components of the equipment should be capable of performing under the
    expected stress and resist strain from aggressive substances that are
    present or may occur.
  - If a failure occurs, there should be a possibility for the equipment to
    switch into a safe mode.
  - Equipment that is operating in autonomous mode should be able to be
    stopped manually in a safe way if the operating conditions change beyond
    its assumed limits.

• Extra requirements for equipment of group 2 category 1:

  - In case of failure of a protective system, there should always be at
    least one reserve system, such that the required level of safety
    remains.
  - In case of two independent failures occurring simultaneously, the
    required level of safety should be maintained.
  - If the equipment's surface can become hot, it should be ensured that the
    expected maximum temperature is not exceeded, even in unexpected
    environmental conditions.
  - It should be ensured that the surface temperature is considerably lower
    than the ignition temperature of the explosive atmospheres.

• Extra requirements for equipment of group 2 category 2:

  - Equipment should be designed and manufactured in such a way that it does
    not present a source of ignition (also in case of possible damage or
    failure).
  - Equipment and its parts should not be heated above the required limits,
    also in abnormal situations anticipated by the manufacturer.

• Extra requirements for equipment of group 2 category 3:

  - Equipment should be designed and manufactured in a way that it does not
    present any source of ignition.
  - The temperature of the surface should not exceed the required limits
    during normal operations. In special cases these limits can be exceeded
    only if the manufacturer includes additional protective solutions.

As of 15.02.2021, there do not exist any ATEX-approved drones. Only two
quadrotors could be operated there:

• "Parrot Bebop 2 light cage drone for inspection in Hazardous areas" – a
prototype, light-weight (0.5 kg or 1.1 lbs), that can be used for indoor
inspections, ATEX Zone 1 or 2 rated (certification pending) [60]

• "Explosion Proof Drone" – features a maximum distance of 3.2 miles (4 km)
and 22 minutes of flight time, MTOM 15 kg (33.3 lbs), so it can be used for
outer inspection. ATEX Zone 1 rated (certification pending) [61]

3 Use Case Scenarios
The main tasks for the implemented UAV system are to perform different types
of inspection and maintenance on the FPSO vessel.
This type of vessel is a large and mechanically complicated structure. Being
part of the country's energy policy, such vessels play a significant role in
the national economy and wealth. By introducing and combining the old
classic systems with modern technologies, we want to achieve a few goals:
increase the effectiveness of oil/gas production and increase employee
safety.
By "inspection" we mean a mission where a drone or an array of drones
inspects the vessel's hull and on-board components/structures in a way that
gives the same or better results than those usually obtained by a surveyor.
All inspections should be done with respect to the maritime organizations'
standards [62], the company's internal regulations and the manufacturer's
recommendations.
In the ROBINS project, the objective of ship inspection is defined in the
following way [22]:

"The objective of ship inspection is to verify the structural strength and
integrity of essential parts of the ship's hull and its appendages, and/or
the reliability and function of the propulsion and steering systems, power
generation and those other features and auxiliary systems that have been
built into the ship"

To be able to rationalize the inspection and maintenance intervals, as well
as to set up relevant scenarios, it is important to identify and understand
the various failure and degradation mechanisms that can occur during the
vessel's operational time. Correctly identified processes will help to
reduce dry docking time, reduce risk factors, and avoid economic and
environmental consequences.
A failure never happens "by itself". Usually, it is a chain of natural
processes that unfortunately can cause failures and losses. Specifically for
FPSO vessels, the following degradation mechanisms are considered [63]
(Figure 7):

• Corrosion

• Welding defects

• Wear

• Erosion

• Cracks

• Buckling

• Holes

Figure 7: Most common degradation mechanisms: (a) wear in paint (b) welding
defects (c) pitting corrosion (d) buckling

These mechanisms can act alone or combined, vary in intensity, and can lead
to fatal structural damage. Failure mechanisms that can occur as the
resulting impact of degradation mechanisms are described as the loss of
functionality of a structure(s) or system(s). The most common cause of
failure mechanisms is missing or inappropriate inspection routines. The most
important consequences are:

• Compartment flooding

• Buoyancy loss

• Fire or explosion

• Leakage

• Structural integrity loss

Based on this information we can derive four main use-case scenarios:

1. Mechanical/structure inspection – by using different types of
scanners/sensors, nondestructive testing methods and image processing,
perform the control of the vessel's outer and inner structure (such as the
hull, on-deck machines and mechanisms, and oil/gas/ballast tanks);

2. Environmental inspection – by using different types of sensors, detect
and prevent oil spills, gas leakage and overheating/fire;

3. Safety – prevent hazardous situations on board. In case of any critical
situation, we can quickly get an "overview" image of the scene and keep
track of the crew members. If someone falls overboard, we can spend less
time locating the person, and by using additional equipment on the drones it
will take less time to get him or her back to safety;

4. Maintenance – using aerial manipulators, gripper arms and other tools, we
can perform simple repair tasks and transport light-weight items within the
vessel.

These groups are also divided into subgroups, as will be seen later. Some of
them can be related to different main groups; in that case they will be
referred to according to their "primary" abilities.

3.1 Use-case 1: Structure and mechanics inspection


This is the main scenario for our inspection system. Its goal is to inspect
the vessel's hull, superstructures, processing units, and oil/gas and
ballast tanks for possible weaknesses and deformations.
The main components of the hull structure are side shell plates and
reinforcements, which are connected by welding joints (see Figure 8). Due to
the interconnection of these individual components, integrity loss of single
elements can lead to hull/structure failure, with ensuing economic or
environmental consequences. Since we want to perform inspection of both the
outer and the inner components of the vessel, we can split this scenario
into two subdivisions: Exterior and Interior inspections. They have the same
goal and are part of one system, but they require different approaches.

Figure 8: Hull Structure [4]

Scenario 1.1: Exterior Inspection
During exterior inspection we use the set of sensing equipment installed on
the UAVs to control the outer hull, machines and construction elements
placed on the deck. This includes detection of corrosion, deformations, heat
exchange, icing and others. All these procedures will be done under
challenging environmental conditions, which puts high requirements on the
design of the UAV and the choice of sensors.
Based on these factors, we can derive the following key scenario elements
(impacts):

• Open space – as sensing is performed in the open sea
• Aggressive environment – low temperatures, strong wind, saltwater
splashes, icing (on the drone)
• Complex path/sensing planning – requires a high number of waypoints with
specific camera/sensor tilt angles and distances from the objects; hovering
during some periods is required
• Movable platform – vessels can experience motion in 6 degrees of freedom:
rotational (around the yaw, pitch and roll axes) and translational motion
(heave, sway, surge), also when anchored
• Hazardous explosive atmospheres
• Prolonged loss of light in wintertime

In addition to the harsh environment, drones can be affected by the
high-temperature surfaces of the processing structures, which can have
temperatures up to 400 °C [35].

For the outer inspections, drones will form a high-level technological base
that is capable of providing enough stability and reliability to succeed in
the inspection tasks.
According to the requirements described in [64][47][63], we get the
following overview (Table 4) of the critical structure areas and the
deformations that are expected to be representative there. This table does
not contain information about all possible degradation factors and
structures on the vessel, but those that are most relevant and feasible to
detect during a UAV inspection.

Table 4: Exterior inspection: structure components and expected weaknesses

Structure elements        Expected deformations/weaknesses
Hull                      Wear, erosion, corrosion, cracks, heat exchange,
                          welding defects, buckling, holes
Processing equipment      Icing, heat exchange, gas/liquid/oil leaks
Turret/transfer system    Corrosion, icing, heat exchange, gas/liquid/oil leaks
Flare boom                Icing, rust, buckling, cracks
Deck cranes               Icing, rust, buckling, cracks

Scenario 1.2: Interior Inspection

The main goal of the interior inspection is a continuously updated condition
status of the oil and gas tanks. These high-volume storage tanks present a
wide area with significant heights (up to 30 m). While the drones will not
suffer from severe sea weather and strong wind, they will face the demand
for precise automated navigation and orientation techniques. Surrounded by
thick metal constructions, a GPS-based navigation system is not available.
The use of optical sensors can be challenging because of poor or absent
light, just as the implementation of laser-based solutions can be difficult,
since unique shapes that can be matched across scans are not present.
Besides the oil storage tanks, the vessel is equipped with ballast tanks
that are also of interest for inspection. These tanks carry water to provide
stability for the vessel. Their inspection is relevant for the overall
structural integrity because of the highly corrosive effect of salt
seawater. At the same time, empty tanks are affected by damp air, which also
increases the corrosion impact on the tank surface. Unlike the wide-area oil
tanks, ballast tanks usually present a cluttered environment with a lot of
obstacles.
Briefly, we can set up the following key features of the inner inspection
scenario:

• Enclosed space
• Complex navigation conditions
• Man-hole sized, single entry points (hatches); most commonly from 300x300
mm to 1200x1200 mm for square/rectangular covers and 450-600 mm for circular
covers [65]
• Hazardous environment
• Impossible to use magnetometer-based sensors
• Cluttered environment of ballast tanks

During inspection of the storage tanks, our main interest is to follow the
condition of the tank coating, since it is their main protection. It is also
of interest to check the welding joints, as well as the level of vibration
of the shell plates placed near machine/engine rooms. The general defects
that can occur inside the tanks are similar to those affecting the hull:
corrosion (most common), deformation and fracture.
Due to the cluttered environments, especially in ballast tanks, it is
desirable to have a collision-tolerant drone. A possible solution is to use
multirotors equipped with a specific protective cage, as for example mounted
on the Flyability Elios drone [66] (Figure 9).
It is also important to remember that an ATEX certified drone is needed for
inspection of tanks containing hydrocarbons. As long as no such drones have
been developed, the tanks need to be prepared for inspection. In other
words, they need to be empty and approved as "safe" prior to each
inspection. It is also necessary to empty the ballast tanks if we want them
to be inspected by UAV. Instead of UAVs we could use autonomous underwater
vehicles (AUVs), so they can perform the inspection without draining the
water. But in this case, the resulting visual inspection quality would be
reduced due to murky water. The use of unmanned ground vehicles (UGVs) does
not seem possible due to their obstacle-overcoming limitations in cluttered
and confined environments.

3.2 Use-case 2: Environment monitoring


Figure 9: Collision tolerant Flyability Elios drone [5]

While inspection of structures and equipment under normal operations belongs
to the standard external inspection scenario, here we are interested in
detecting any environmental accident, in other words leakage. It can be in
the form of oil spills, gas or other chemical substance leakages. The goal
of this scenario is to get a warning the moment a leakage is detected. This
scenario is meant to be performed during normal daily operations, so we do
not expect any specific impacts (such as high temperatures from flames).
Hence, the scenario key features are the same as for Scenario 1.1 "Exterior
Inspection", except for the possibility that the drone can be affected by
the chemical suspension in case of leakage.
To detect oil spills we can use the already installed video and infrared
cameras. Some studies show satisfying results using IR images and computer
vision for their detection [67][68]. Tests show that use of the short-wave
infrared (SWIR) band (1000–1700 nm) is highly effective for detecting spills
in low-light environments [69]. During operations the following gases can be
present:

• volatile hydrocarbons emitted from crude oil during different processing
stages
• vaporized hydrocarbons, as for example:
  - natural gas (consisting of methane, ethane, propane, butane)
  - gas compounds, as for example benzene

Crude oil usually contains a certain amount of natural gas, where methane is
the primary component (87-96%) [70][71]. So, instead of several different
sensors for each specific gas, we can have only one for methane detection.
There are some types of sensors suitable for its detection, the most common
being non-dispersive IR and laser-based sensors [72]. The main difference
between them is that an IR camera works as an "area" detector, so we detect
leakages in an entire image (the camera's field of view), while a laser is
suitable for spot detection on a selected object (gas pipe, valves, etc.).
In our case, sensing all possible pipes, valves or fittings would take a lot
of time, and most of these pipes would be unreachable for a laser beam or
camera due to coverage by other pipes or mechanisms. Using a camera with a
wide field of view is thus more reliable than single-spot sensing.

3.3 Use-case 3: Safety


Scenario 3.1: Scene monitoring
While the first two scenarios are pre-planned and performed during normal
daily operations, this scenario starts in the case of emergency situations.
The main idea is to assist an emergency response team (ERT) with scene
monitoring. The core idea is to deploy all suitable drones and get an image
of the scene within minutes after an alarm goes off. Using thermal cameras,
we can detect fire hotspots, monitor the fire's spread (also inside the
vessel) and be able to track crew members through the smoke.
The scenario key features are the same as for the Exterior inspection, with
several additions:

• more aggressive environment – due to high temperatures in case of fire or
explosion
• extended airtime
• possible degradation of navigation capabilities – due to partial or total
loss of local navigation equipment on board the vessel
• possible degradation or loss of power supply infrastructure (charging
possibilities)

To prevent loss of communication, additional GNSS receivers can be installed
on the UAVs. Also, reserve channels for telecommand data transmission from
the vessel are recommended. This is to have effective control and to safely
land the drones also in critical situations. The channel should carry data
from the vessel's inertial navigation system, which allows steering the
orientation of the drone relative to the ship.
This specific scenario can be challenging to perform in fully autonomous
mode. Most likely it will be performed in semi-autonomous mode, where an
operator manually defines the positions and orientations of the drones on
request from the ERT captain.
Scenario 3.2: "Man overboard"
In this scenario we aim at the safety of crew members who accidentally fall
overboard. The goal is to use drones to locate the person in the water and
transmit this information to the rescue team. In this situation every second
counts, due to exposure to low water temperatures, which can cause
hypothermia. At the same time, especially during stormy weather, the person
can drift away with the wind and ocean current. The situation can be made
worse by nighttime.
Unlike the inspection or maintenance scenarios, which are predefined, we
cannot predict this kind of situation. To be able to react adequately to
changed inputs, we might implement a priority of the scenarios (events), in
such a way that "safety always comes first", see Table 5.

Table 5: Task priorities

High  - emergency situation, risk for crew
      - emergency situation, no risk for crew
      - environment-related incidents (spills and leakages), no potential
        risk for crew
Low   - ordinary inspection
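
As a minimal sketch of how the priority ordering in Table 5 could be encoded
in a scheduler, the snippet below picks the highest-priority pending event
so that safety-related tasks preempt ordinary inspections. The enum names
and the queue structure are assumptions for illustration.

```python
from enum import IntEnum

class TaskPriority(IntEnum):
    """Priority levels mirroring Table 5 (higher value = handled first).
    Names are illustrative, not from the thesis."""
    ORDINARY_INSPECTION = 0
    ENVIRONMENT_INCIDENT = 1      # spills/leakages, no risk for crew
    EMERGENCY_NO_CREW_RISK = 2
    EMERGENCY_CREW_RISK = 3

def next_task(pending):
    """Pick the pending (priority, name) task with the highest priority."""
    return max(pending, key=lambda task: task[0])

queue = [(TaskPriority.ORDINARY_INSPECTION, "hull survey, port side"),
         (TaskPriority.EMERGENCY_CREW_RISK, "man overboard search")]
print(next_task(queue)[1])   # man overboard search
```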

The scenario key features are also the same as for the exterior inspection.
This scenario can have two variants:

1. when we already know that someone is missing;

2. when we do not know it yet

These variants involve different approaches. In the first situation we need
to localize the person, which requires an active search. In this case we can
use infrared imaging with image processing to find the person. The
combination of thermal imaging and computer processing is a widely used
solution for maritime search and rescue operations. Effective solutions for
the implementation of such IR sensing are proposed in [73][74]. For
successful execution of the second variant, additional technologies must be
implemented. To get a notification immediately after a fall, we can use
radio distress beacons integrated into the uniform or into life jackets
(which are preferred). When the beacon comes into contact with water it
starts to transmit the distress signal, so the rescue operation can start
immediately. To save time locating the person, an additional GPS antenna
that transmits coordinates can also be integrated into the life vest. Such a
solution is used in the Cospas-Sarsat satellite system [75]. It shows
impressive results in real-life conditions. A drawback of the
satellite-based solution is that it takes some time before the alert is
activated. In our situation we have the ability to reduce that dead time to
a few seconds by transmitting the distress signal directly to local
receiving equipment.
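
As a minimal sketch of the thermal-image approach described above (not the
specific methods of [73][74]), the following illustrative snippet thresholds
a radiometric thermal frame and returns the centroid of the warm pixels,
which could serve as a candidate position of a person in cold water. The
array name and the temperature threshold are assumptions.

```python
import numpy as np

def warm_blob_centroid(thermal_frame: np.ndarray, temp_threshold_c: float = 15.0):
    """Return the (row, col) centroid of pixels warmer than the threshold,
    or None if nothing stands out. thermal_frame holds per-pixel
    temperatures in degC; the threshold is chosen so a person contrasts
    with cold Barents Sea water. Illustrative only, no tracking/filtering."""
    mask = thermal_frame > temp_threshold_c
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Example: synthetic 240x320 frame of 4 degC water with one warm target
frame = np.full((240, 320), 4.0)
frame[100:110, 200:210] = 30.0           # simulated person in the water
print(warm_blob_centroid(frame))          # approx (104.5, 204.5)
```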

3.4 Use-case 4: Maintenance. Use of Aerial Manipulators
Scenario 4.1: Use of Aerial Manipulators
To expand the inspection opportunities and be able to perform maintenance
tasks, aerial manipulators can be installed on the drones. This will provide
the possibility to perform different sensing tasks in hard-to-reach areas.
For this specific task, a series-connected multi-DOF arm attached to the
drone's frame is preferred. It allows free use of the end effector without
the drone having to compensate for the arm's movement limitations, and it
makes it possible to use different end-effectors without changing the whole
arm or dedicating drones to a single specific task.
There exist several important limitations today. The first one is high power
consumption, related not only to the amount of energy needed by the end
effector, but also to the power required by the drone to counteract the
moving center of mass [76]. Adding extra batteries will not solve that
problem, because power consumption also increases with increasing total
weight. So, there is a demand for new types of lightweight batteries with
higher energy density. The second limitation is complex modeling and control
[77]. The main idea is that the drone acts as an anchor for the manipulator,
in the same way as a ground-fixed base, in terms of reaction forces. To solve
that problem we could use drones that can stick to walls using, e.g.,
electromagnets. To overcome the resulting increase in power consumption we
could use tethers, but this would reduce the drone's operational area.
Based on these drawbacks, we can see that there are not many possibilities to
implement a drone-based maintenance or repair system today. There are only a
few possible ways to use aerial manipulators: as an "extender arm" for
sensors or grippers, and to lift some lightweight items.
Scenario 4.2: Maintenance assistant
Even though we are not able to use drones for repair tasks, they can serve to
help the personnel during maintenance operations and with some small tasks.
Drones can be used as an external light source, a tool holder and for
transportation of items within the vessel. Additionally, sensing drones can
be used on demand by the maintenance team. For this scenario it is necessary
that the drones are equipped with additional sensors for obstacle
recognition, and highly accurate object tracking algorithms must be
implemented, due to flying/hovering close to personnel and structures. For
extra safety, propeller guards can be installed. There should be a
possibility for the drone operator to quickly and safely move the drone away
from a person in case of any unpredicted situation.
It is also desirable that this type of scenario is implemented in the last
stages of the automation, because operating at a short distance from people
can pose a danger.
Scenario 4.3: Anti-icing
Another possible way to use drones is to perform de-icing or anti-icing of
the vessel. Sea spray is the main reason for icing on platforms or vessels
and is one of the major hazards in cold regions [78]. Unfortunately, drones
will not be able to break ice that has already formed on the vessel or
structures, see Figure 10. Even today, the most common way to remove ice from
decks and structures is by using manpower with shovels, wooden bats or
hammers. Sea spray could also be avoided by heading or maneuvering downwind.
Unfortunately, this solution is not applicable to FPSO vessels, because they
are anchored and tend to head upwind to keep the flame at the flare boom away
from the ship. Because of that positioning, the most likely areas for ice
accretion are the bow side, helicopter pad, front sides of the
superstructure, lifeboats, upper parts of the turret, the flare boom and the
cranes. Other techniques for de-icing are the use of chemical agents and deck
heating (electrothermal) elements [79]. Use of such agents can also be
challenging due to their possible environmental impact and because they can
cause corrosion of the vessel's metal structure. Besides that, the use of
drones to minimize ice accretion through anti-icing operations (in other
words, spraying chemical agents prior to icing) seems feasible. Tethered
drones have already been used by Aerones Drone Solutions to clean, de-ice and
apply coatings on wind turbines [80].

Figure 10: Sea spray icing on ships [6]

A similar solution for spraying is implemented by DJI on their agriculture
drones, for example the MG-1 octocopter [81]. It is designed for agricultural
needs, but the main principle can be used for spraying anti-icing agents.
According to its manual, it can carry up to 10 kg of liquid, has a spray
width of 4-6 m and a spray rate of 0.43 L/min [82]. Unlike the Aerones
solutions, the DJI drone has an onboard tank and batteries, so it is not
affected by tether limitations. This shows that implementation of a
"spraying" drone is feasible and can be used in this project.
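As a rough feasibility check, the figures quoted above can be combined into a
back-of-the-envelope estimate of how long one tank filling lasts and what
strip area it can treat. The sketch below is purely illustrative: it assumes
the 10 kg payload corresponds to roughly 10 L of liquid and that the drone
sprays while moving at an assumed 3 m/s; in practice, battery endurance
rather than tank volume may be the limiting factor.

```python
# Illustrative estimate based on the figures cited above (10 kg ~ 10 L tank,
# 0.43 L/min spray rate, 4-6 m swath); the 3 m/s spraying speed is an assumption.
tank_volume_l = 10.0
spray_rate_l_per_min = 0.43
swath_width_m = 5.0          # middle of the quoted 4-6 m range
ground_speed_m_s = 3.0       # assumed speed while spraying

spray_time_min = tank_volume_l / spray_rate_l_per_min            # ~23 min per filling
covered_area_m2 = swath_width_m * ground_speed_m_s * spray_time_min * 60

print(f"Spraying time per tank: {spray_time_min:.0f} min")
print(f"Treated strip area per tank: {covered_area_m2:.0f} m^2")
```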
The overall scenario key features are similar to those for Exterior
inspection, but with several additions:

• flying and/or hovering close to constructions and personnel - need for
  additional and precise obstacle avoidance sensors and algorithms
• drones or supporting infrastructure can be damaged by falling ice -
  detection of ice has to be a part of the exterior inspection use-case
4 Concept

4.1 Choice of suitable drones


After the relevant requirements and use-case scenarios have been set up, we
can choose the relevant types of drones for the inspection system.
There are different approaches to UAV classification [83]. This configuration
variety reflects the full spectrum of platforms and missions. Today, the most
common way to differentiate drones is by their aerodynamics and maximum
weight; a simplified version is shown in Figure 11. There are many more types
of drones, such as flapping wing, "smart dust" (insect-scale PAV), taxidermy
bio-drones and others [83]. They are not included in that diagram, because
most of them are designed for specific uses (such as unmanned combat aerial
vehicles), research platforms or other miscellaneous applications (such as
police interceptor drones), so they are not applicable for maritime
inspections. Other classification methods are based on different drone
parameters, such as dedicated application, flight altitude, range, endurance
and motor energy, as well as on their different combinations [84][85].

Figure 11: Classification of UAVs based on aerodynamics and weight

The most widely used drones nowadays are fixed-wing and multicopters. The
advantage of fixed-wing UAVs is that they can fly for a prolonged period and
cover wider areas than multirotors, since they do not need to constantly
generate lift in addition to overcoming drag. Another big advantage is their
robustness in case of motor stalling: the wings still allow them to glide and
perform a safe landing. A challenge is the implementation of take-off and
landing solutions. For launching, we need a large, open area without
obstacles. Hand-launching could be a possibility, but it brings another
problem: to be comfortably operated by hand, the drone has to be lightweight.
However, lightweight drones mean stronger restrictions for flying in windy
conditions and an additional weight limitation for payload. A possible
solution is a catapult launcher. It takes little space and reduces the chance
of human error, but it does not solve the landing problem. There are several
approaches for landing fixed-wing UAVs [86]: the already mentioned runway,
belly landing, net recovery, a parachute recovery system or deep stall
landing. An approach applicable to maritime use has also been presented,
where the recovery net is suspended by two multirotor UAVs [87].
Unfortunately, all of these methods except the runway are not sufficient for
our use, because during each landing the drone and the sensing equipment
(payload) suffer shock impacts, which shorten their life span and cause
misalignment of sensors. Additionally, fixed-wing drones cannot hover, so we
can use them only for overall video inspection from a distance, without any
possibility for specific spot inspection. In this project, they could be used
when prolonged flight is needed, for example to search for a person in the
water or for operational monitoring of oil spills. There also exists a hybrid
version of the fixed-wing UAV, the VTOL (vertical take-off and landing)
drone. It allows fixed-wing drones (Figure 12-a) to take off and land on the
spot without the need for a runway. They are designed in a few ways:
tiltrotors, or an additional vertical propulsion system with rotors dedicated
to generating lift (see Figure 12-b).
The takeoff-landing problem could be solved by the use of helicopters. Unlike
fixed-wing UAVs they can hover, and at the same time they have longer
endurance than multirotors. Large rotor blades give higher payload capacity
and greater flight range. Another positive side of helicopters is that they
are easier to stabilize than quadrotors. Using variable-pitch blades gives
the possibility to effectively counteract external disturbances and stabilize
the drone, or to change thrust much faster, as we do not need to add the
delay caused by the inertia of four or more motors. Additionally, it helps to
save energy, especially while hovering. So, if we would like to transport
heavy loads over long distances, a helicopter UAV is a perfect solution.
Unfortunately, one of their disadvantages comes from this very advantage. To
be able to steer the pitch and roll angles, a mechanism called a swashplate
is used
Figure 12: (a) Tiltrotor [7] and (b) hybrid fixed-wing UAVs [8]

(Figure 13), which is complex both in construction and maintenance, while on
a multirotor we need only simple electronic speed controllers connected to
each motor to control the same rotation angles. Since a helicopter has only
one main rotor, in case of motor stall or blade damage we will not be able to
land the drone safely without causing other damage. Additionally, the bigger
blades can harm personnel, cause fire or damage various mechanisms. These are
the reasons why we do not want to use helicopters for close-up inspections,
while they could be used for freight transportation if needed.
Unlike helicopters, multirotors (3+ rotors) are usually equipped with
fixed-pitch blades and use controllers that regulate the rotational speed of
each motor. Changing the speed separately on each motor makes it possible to
control the thrust and torque of the motors, and thus steer the multirotor.
This gives cheaper operation and maintenance with increased reliability, as
we avoid complicated mechanisms. Using blades of smaller diameter also
reduces the harm of possible damage and thus increases safety. Another
advantage of using only electrical motors for flight is that they are the
only moving parts on the drone (except the camera gimbal, which is not
dedicated to flight control), as there are no swashplates, ailerons and so
on, so the probability of failure is lower. On the other side, we have
reduced flight time and lower payload capability [88].

Figure 13: Helicopter swashplate setup

Multicopters can have various configurations, but the most common are the
2-blade puller and the quad-/hexa-/octocopter [88][89]. An even number of
motors is preferred, because it gives balanced torque. Using an odd number of
motors (e.g. a tricopter) requires a tilting mechanism on one of the motors
in order to balance the torques [90]. The main advantages of using more
motors, e.g. an octocopter compared to a quadcopter, are increased payload
capacity, higher speed and the ability to tolerate failure of several motors.
Unlike quadrotors, hexa- and octocopters can still hover, fly and perform a
safe landing with up to two stalled motors, while a quadrotor can become
uncontrollable after a single motor failure if proper control laws are not
implemented. Different approaches have been proposed to make a controlled
landing, such as a PID-based approach [91], a cascaded control method [92], a
nonlinear H∞ control loop sharing technique [93] and the T3 mechanism [94].
Anyway, while some of these techniques show sufficient results, we are not
able to fully control the attitude of a quadrotor if one of the motors
stalls, because it can suffer from uncontrolled spins about its yaw axis
[95]. This can cause unwanted damage; thus the use of hexacopters is
preferable.
The choice of drone type and configuration is always based on the mission
definition and environmental effects. Besides, it is desirable to have
similar types of drones in order to reduce operational and maintenance costs.
Since we will perform marine inspection operations in complex weather
conditions, the hexacopter configuration is recommended for the exterior
inspections. Use of fixed-wing drones would be preferred for emergency
scenarios, when prolonged flight is needed, but it requires installation of
additional supply infrastructure, or having it available for fast deployment.
Due to limitations on performing automated inspections of cargo holds and
ballast tanks, we will not be able to implement automatic fly-in into them.
Also, due to the cluttered environment of ballast tanks, we will not be able
to use the big and complex inspection drones that will be used for outer hull
inspections. Therefore, there are a few options for drone assignments:

• We have drones for outer inspections only, in addition to drones dedicated
  to inner inspections (cargo holds and ballast tanks).
• Or we use the outer drones also for inspection of cargo holds (as long as
  they can fly through manhole-sized hatches) and have drones dedicated to
  ballast tank inspection only.

It is important to keep in mind that inspection of ballast tanks is only
possible when they are emptied. To be able to inspect these tanks while there
is water inside, additional unmanned underwater vehicles (UUV) could be
integrated into the inspection system. This would allow us to unify the UAVs
and save time by skipping the emptying process.

4.2 Inspection techniques


As was set up in the use-case scenarios, a few types of sensing will be
performed: visual and contact. Both techniques are part of the
non-destructive testing (NDT) methods widely used on civil engineering
structures, as they allow validating the properties and detecting internal
and surface defects of steel materials or welds without damaging them. While
most of these damages can be detected by simple visual approaches, the use of
contact methods can give more precise results.
Unfortunately, due to limitations of the UAV platform (such as payload
capacity, limited power supply and short flight time) we cannot use the full
spectrum of NDT methods.
The most common techniques used to detect cracks, weld defects and corrosion,
and to measure the thickness of metals, are visual and ultrasonic [96][97].
Visual:
An inexpensive and common type of NDT. Usually it does not require additional
equipment and can be done by the naked eye. In our case we will use it with
remote visual inspection (RVI) tools (e.g. a camera) installed on a drone.
With all its simplicity, this method has its drawbacks and disadvantages:
only visible defects can be found, and it requires good and correct lighting.
On the other side, image processing techniques give us an opportunity to
detect cracks and deformations at their initial stages, such that we can
prevent damages and failures. Also, mapping of discovered weaknesses and
degradation prior to the annual inspection will allow optimization of the
repair process, which will reduce dry-docking time and thus costs. According
to the set-up of the use-case scenarios, we will need a few types of cameras:
ordinary video, SWIR (short-wave infrared) for oil spill detection and NDIR
(nondispersive infrared) for gas leak detection.
Corrosion detection is one of the important parts of the ship's inspection
process [47]. There are several types of corrosion, but the most dangerous is
pitting, since it is hard to predict and detect [98]. For its detection,
several methods can be used, for example magnetic flux leakage and ultrasonic
testing [99], but most of these tests are expensive, heavy and have limited
access to the inspected areas. Also, in our case we have big areas of metal
plating (shell plating of the hull and cargo tanks), so contact-based
inspection will not be effective due to high time consumption. So, to be able
to quickly gather information, we can use image processing techniques to
detect potentially rusted areas and then, as a second step, use ultrasonic
testing to collect sensor probe data at the suspicious points. Use of neural
networks for automatic detection of corrosion spots and cracks shows
satisfactory results. For example, the LSHADE-SVC-PCD model
(image-processing-based detection of corrosion) [98] gives a classification
accuracy rate (CAR) of 91.80%, and image texture analysis with the MO-SVM-PCD
model has achieved a CAR of 91.17% [99].
Because of the dependence on proper light conditions, the CAR can be lower,
with a probability of false positives, or small spots can be left undetected.
Still, as an alternative to time-consuming contact-based detection systems,
the visual-based image processing approach is promising.
Ultrasonic:
This method is based on the use of high frequency sound waves and is used to
measure the thickness of metal plating. The most common variant is pulse-echo
detection: sound waves are sent into the material, and the echo reflected by
defects is then detected by the sensor. With this type of testing we can
detect hidden cracks and welding defects (most commonly incomplete fusion) at
target spots. One of the advantages of this method is that we require access
to only one side of the metal component to measure its thickness. This test
can also help to confirm or reject the suspicious corrosion spots discovered
by image processing. According to [47], thickness is measured during each
Special and Intermediate inspection.
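The pulse-echo principle reduces to a simple time-of-flight relation: the
plate thickness is half the round-trip echo time multiplied by the speed of
sound in the material. The sketch below is a minimal illustration, assuming a
typical longitudinal sound velocity for steel of about 5900 m/s; the numbers
are only examples.

```python
def plate_thickness_mm(echo_time_us: float, sound_velocity_m_s: float = 5900.0) -> float:
    """Pulse-echo thickness: d = v * t / 2, where t is the round-trip echo time."""
    t_s = echo_time_us * 1e-6          # microseconds -> seconds
    return 0.5 * sound_velocity_m_s * t_s * 1e3  # metres -> millimetres

# Example: a 4.2 microsecond round-trip echo corresponds to ~12.4 mm of steel.
print(f"{plate_thickness_mm(4.2):.1f} mm")
```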
Other methods, such as the powerful ultrasonic guided waves approach, which
is widely used for corrosion and crack detection in pipes, could allow
inspection of the whole plating area of cargo holds/ballast tanks [100].
However, this requires installation of additional transducers, so it cannot
be installed on and operated by a UAV.

Figure 14: Simplified design of couplant supply system

One of the disadvantages we can face when using this type of sensing on a
drone is that it can give false positive echoes if the sensor is not aligned
properly with the inspected surface. We also need to keep in mind that air is
not an effective conductor of sound waves in the megahertz range, so an
additional couplant between the sensor tip and the test surface, such as
propylene glycol or gel, is required [101]. To be able to provide couplant on
the sensor surface, an additional mechanism should be installed on the drone;
an example can be seen in Figure 14.
So far, the most feasible and cheapest NDT solution that can be installed on
an airborne unit is visual-based image processing inspection. This is because
the contact-based sensors have a small coverage area or require installation
of additional equipment (such as transducers or mercury lamps) on the vessel.
They also require long flight times and more aggressive control to keep the
drone in one position relative to the vessel. So, the main possibility for
contact-based inspection techniques is to use them to inspect doubtful spots
detected by image processing, or for pre-planned checks of critical
structural areas⁸ [96].
4.3 Frames of reference (coordinates)
To define the attitude of the inspection drones and create a mathematical
model for simulations and path calculations, it is important to set up the
reference frames. This will also help to visualize the data flow and specify
which components of the supply infrastructure are needed.

⁸ Critical Structural Areas are locations which have been identified from
calculations to require monitoring, or from the service history of the
subject ship or of similar ships or sister ships, if applicable, to be
sensitive to cracking, buckling or corrosion which would impair the
structural integrity of the ship.

Figure 15: Frames of reference (objects are not to the same scale)

The operational environment is an offshore aerial inspection of a vessel. The
vessel is thus our primary object of interest, and all actions performed by
the drones must be done according to its position and orientation. One of the
key points here is to define the position of the vessel's body-frame origin,
because there are usually three frames considered for marine vessels [102].
One of the FPSO's features is that it is anchored using a turret mooring
system, such that the vessel can freely rotate 360° around its pivot point,
which is the turret. To be able to calculate the motion of the drones around
the vessel we need two frames: the body-fixed frame of the vessel itself and
the body-fixed frame of the drone. To minimize computation and attitude
determination errors (which can potentially lead to crashes), we will use the
relation of the drone's body frame to the body frame of the FPSO instead of
the NED frame (North East Down) (Figure 15). Usually, the origin of the
vessel's body-fixed frame is located at the CG (center of gravity), such that
its axes coincide with the principal axes of inertia to simplify the
equations of motion. In our case, it is easier to "move" the origin so that
the z-axis coincides with the center line of the turret.
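To illustrate the use of the FPSO body frame, the sketch below converts a
drone position given in a local NED-like frame centred on the turret into the
vessel's body-fixed frame, using only the vessel's heading about the turret
z-axis. It is a minimal example under that assumption; in practice the
vessel's roll, pitch and heave and the lever arms to the navigation sensors
would also have to be included, and all names here are hypothetical.

```python
import numpy as np

def ned_to_fpso_body(p_ned: np.ndarray, vessel_heading_rad: float) -> np.ndarray:
    """
    Express a drone position, given in a local NED frame with origin on the
    turret centre line, in the FPSO body-fixed frame.
    Only the rotation about the turret z-axis (heading) is considered here.
    """
    c, s = np.cos(vessel_heading_rad), np.sin(vessel_heading_rad)
    # Rotation matrix from NED to body frame for a pure yaw angle
    R_nb = np.array([[ c,   s,  0.0],
                     [-s,   c,  0.0],
                     [0.0, 0.0, 1.0]])
    return R_nb @ p_ned

# Example: drone 50 m north and 10 m east of the turret, 20 m up,
# while the vessel heads 30 degrees east of north.
p_body = ned_to_fpso_body(np.array([50.0, 10.0, -20.0]), np.deg2rad(30.0))
print(p_body)  # position along the vessel's x (bow), y (starboard), z (down) axes
```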

Figure 16: Objectives tree

4.4 Concept Definition


Before we go to the concept architecture and the structure of its components,
it is important to summarize the key features and terms for the entire
system. Due to the lack of the required regulations, we will base the concept
on existing information and regulations, which requires making some
assumptions.
The planned objectives (use-case scenarios) and the basic standard
requirements are shown in schematic form in Figure 16.
Since we want to implement this automated inspection system to enhance
already existing inspection techniques and reduce the risk of injuries, we
need to set up the requirements that should be met [22][103]:

1. The results of remote inspections, which are normally acquired manually by
   a surveyor, should be at the same level of detail or better;

2. The system should be operational 24/7, with respect to weather conditions
   and safety;

3. Sensing data should be obtained, transmitted and stored (archived) in a
   safe manner;

4. Acquired data (images, videos, sensing data) should be available to the
   end user (surveyor) for presentation and other use (processing);

5. Inspection data that contain personal data should not be collected nor
   used contrary to existing data protection and privacy regulations (e.g.
   the General Data Protection Regulation);

6. Inspections, data interpretation, drone operations (control) and their
   maintenance should be done by qualified personnel;

7. Inspection data should be marked such that it is always possible to
   determine the place and time it was collected.

The term "operational 24/7" means that drones are ready to be deployed
in few minutes, or so long it takes to create new or update existing ight
plan, in case of emergency. Since there are limitations today that make it
problematic to perform automated inspection of the cargo holds, this type
of inspection will be performed in manual (semi-manual) mode at the begin-
ning. Considering that the main idea is to get the fully automated system,
for now we will discuss intentions and set-ups related only to "outdoor"
ights (use-case scenarios). When the technological and regulatory gaps will
be covered, the interior use-cases can be easily integrated into the system.

4.5 Preferred System Architecture


A fundamental setup of each manual or autopilot mission architecture can be
seen in Figure 17. It consists of three subsystems: the Mission Definition
System (MDS), the Pilot App and the Drone. Despite the fact that the goal is
to have a fully autonomous system, we include the possibility for a pilot to
take manual control or retarget the drones without causing their
"disorientation".
Mission Definition System:
First, we discuss how missions that can be pre-planned are set up, i.e. the
exterior inspection and environmental monitoring scenarios. Each use-case
scenario consists of related sub-scenarios. For example, exterior inspection
is defined as a set of smaller (sub-scenario) missions, as can be seen in
Figures 18 and 19.
To optimize the calculation of the drone's flight logic, each sub-mission is
represented by a set of waypoints. A single waypoint is a 3D coordinate which
the drone must reach. These waypoints (as a single point or a set of a few)
represent a point of interest (POI), where the drone will perform some type
of inspection. For the pre-defined scenarios, each waypoint (or resulting
path) also contains details about the drone's intentions: the orientation of
the drone/gimbal, cruising speed, hovering period, etc. All these setups are
stored in the "Scenario repository" and can be accessed by the operator/user
via a human-machine interface (HMI), which allows missions to be added or
changed.

Figure 17: System setup [9]
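A waypoint record of the kind described above could, for example, be
represented as a small data structure that combines the 3D coordinate with
the associated intentions. The sketch below only illustrates such a
scenario-repository entry; the field names are assumptions, not a defined
interface of the system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Waypoint:
    """One entry of a sub-mission: a 3D coordinate plus the drone's intentions there."""
    x_m: float                 # position in the FPSO body frame
    y_m: float
    z_m: float
    yaw_deg: float = 0.0       # desired drone heading at the waypoint
    gimbal_pitch_deg: float = 0.0
    cruise_speed_m_s: float = 2.0
    hover_time_s: float = 0.0  # how long to hover for sensing

@dataclass
class PointOfInterest:
    """A POI groups one or more waypoints belonging to a single inspection spot."""
    name: str
    waypoints: List[Waypoint] = field(default_factory=list)

# Example: a close-up look at a flare boom support, approached from two sides.
poi = PointOfInterest("flare_boom_support_03", [
    Waypoint(120.0, 8.0, -25.0, yaw_deg=90.0, hover_time_s=10.0),
    Waypoint(120.0, 12.0, -25.0, yaw_deg=270.0, hover_time_s=10.0),
])
```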
Based on the inspection schedule or on demand, we manually (or automatically)
pick some of these POIs. The list of these points represents the scenario for
the specific day. After this list is generated, it is passed to the Mission
Calculation Engine. This engine is the core of the entire system and plays a
key role in its overall success. Here, the flight and measurement plans are
calculated with respect to the mission specifications, the expected drone
dynamics, the weather conditions, the number of involved drones, etc.
(Figure 20 [9]). When the flight plan is calculated, it is sent to the
mission repository, which stores all information related to the specific
mission, to the pilot app (to be accessed in case of manual flight or
interruption during autonomous flight, if needed) and to the drone(s).
Such reports need to be created and approved at least 12 hours before any
flight intentions, for each day when drone operations are planned [52]. This
plan should include:

• Purpose of the flight
• Risk assessment
• Assessment of whether additional equipment or actions are needed
  (placing barriers, use of observers/assistants)
• List of instances that need to be informed about the operations:
  - Operational manager⁹
  - HLO
  - HFIS
  - all instances/vessels equipped with a helicopter (flight) deck within a
    5 NM (9.26 km) radius
• Overview of the areas where the flight will be performed:
  - Flight path
  - Take-off and landing points
  - Altitudes
• Overview of simultaneous activities (e.g. cranes, helicopter traffic)
• Handling (intentions) of the drone(s) in case of an alarm situation

⁹ In Norwegian: Operasjonelt systemansvarlig

Figure 18: Setup of Mission Repository

Figure 19: Vessel structures that are of interest for inspection

Also, given that all inspection actions are pre-planned according to the
requirements from maritime organizations, internal regulations or the
manufacturer, flight plans do not need to be set up before each flight. At
later stages of the automation, these plans may be created and approved
automatically using artificial intelligence. After approval, a warning
message will be distributed automatically to the relevant instances (see
above). All these plans should be stored in the "Mission repository" and be
accessible by the end user. After flights are completed, the repository
should be updated with all collected data, so that all information related to
a specific mission is stored together in one place.
Mission Calculation Engine
This is the core of the whole system and arguably its most complicated
structure, as it has to calculate the desired flight plan based on multiple
input variables. First, the mission specifications (the list of intentions)
are converted into a flight plan with respect to inspection patterns. These
patterns are sets of instructions or rules which define the path of a drone
according to the area or object of interest and the specifications of the
sensing equipment. This flight plan is a set of waypoints relevant for the
specific inspection mission. After that, the plan is divided into a set of
instructions (scripts) that will be executed by the drone(s). They also
define the behavior of the drone that is needed to perform the necessary
measurements, such as hovering at specific coordinates with the required
orientation (e.g. yaw angle) for the required time interval. Further, this
set of instructions is sent to the Trajectory Integration unit, which
calculates the trajectory with respect to the present weather conditions
(WM - wind model) and the drone's aerodynamic and physical model
(APM - aircraft performance model). This simulated trajectory is relevant for
visualization of the drone's intentions. Such a simulation not only eases the
planning of inspections and gives all involved actors a common view, but also
allows visibility/safety tests to be made. Furthermore, it provides the
information needed to fill in the flight plan report.

Figure 20: Mission Calculation Engine [9]

Finally, the flight plan and the calculated trajectory are uploaded to the
drone before the flight.
Pilot App
The Pilot App provides all necessary information and specifications of the
mission that are needed by the pilot to perform a flight and/or take
measurements manually. It also allows the drone's intentions to be
interrupted, manual control to be taken over it, or the mission to be
changed. These missions are stored in and accessible from the "Flight mission
repository". Since significant changes will require a change of the flight
plan¹⁰, it contains a set of rules for allowed interruptions in the scenario.
For example, if the drone is inspecting a specific point but we want to
inspect a spot a few meters away (which could be allowed in this situation),
we can add that specific waypoint to the path. This also means that the path
planning algorithm should be able to recalculate the path on the fly.

¹⁰ This is strongly not recommended, because in the worst case it will lead
to mission abortion.
Taking into account that, while emergency situations can be similar, they can
vary in their "screenplay", automated set-up of the drones' flight path can
be complicated or even impossible in some situations. In a critical situation
every second counts, so reducing the time between a request from the
emergency response team (ERT) and the drone action is important. In a highly
automated system, we can reduce the chain of supervising and executing to a
minimum by allowing the ERT captain (or an assigned team member with relevant
skills) to manually guide the drones via the pilot app (or an ERT
alternative). In a situation where there are no defined waypoints, that
person can manually define the script: choose the drone(s) and "draw" a path
and the desired viewing angle of the cameras. These data can be entered via
an individual tablet or PDA. This feature can be integrated into the general
emergency response system. It will allow fast sharing of relevant information
between participants (response team(s) and their members), observers and
external actors. A similar approach is used for firefighting [104] and as an
infantry combat system in the military [105].
In case of such interruptions and manual control, additional software
solutions have to be implemented. Their role is to automatically control the
drone(s) between manual inputs (when the UAVs are on standby) and to
automatically reassign tasks between them if one of them needs recharging or
loses control. Because of this complexity, we cannot fully rely on simple
"if-else" statements, so decision-making and executing AI capabilities need
to be implemented.
Drone interface system
The third subsystem is the drone interface system. The flight plan's
instructions are passed to the Flight Control System (FCS) and converted to
formats that are understandable by the drone's onboard computer. Based on the
mission requirements, the Measurement Control unit is responsible for taking
measurements at the desired positions. The navigation system should not only
provide a navigation solution, but also ensure time-stamping of the sensing
samples in order to georeference the obtained data.
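As a simple illustration of the time-stamping requirement, each sensing
sample can be stored together with the navigation solution valid at the
moment of acquisition, so the measurement can later be georeferenced. The
record layout below is hypothetical and only sketches the idea.

```python
from dataclasses import dataclass
import time

@dataclass
class GeoreferencedSample:
    """A single measurement tagged with the navigation solution at acquisition time."""
    timestamp_s: float        # common time base shared by the FCS and the sensors
    position_body_m: tuple    # drone position in the FPSO body frame (x, y, z)
    yaw_deg: float            # drone heading when the sample was taken
    sensor_id: str
    value: float              # e.g. measured plate thickness in mm

def tag_sample(position_body_m, yaw_deg, sensor_id, value):
    """Attach the current time and pose to a raw measurement."""
    return GeoreferencedSample(time.time(), tuple(position_body_m), yaw_deg, sensor_id, value)

sample = tag_sample((120.0, 8.0, -25.0), 90.0, "ut_probe_1", 12.4)
```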

4.6 Concept Exploration


In this section we discuss the overall setup and data flow, which is shown in
Figure 21.
The idea is to get a fully autonomous system that acts independently of the
operator. To achieve that, it is necessary to set up a flow chart that
describes the general workflow. The proposed solution can be seen in
Figure 21. It consists of several loops: the initial (preflight or standby)
loop, a flight plan calculation "side loop", scheduled task execution and
emergency.
In the flight plan loop, the mission for a day is calculated as described
earlier. It is based on the list of scheduled, manually entered or remaining
inspections (those interrupted earlier because of weather change, emergency
or another reason). During preflight, the drones are stationed on their
landing pads in standby mode and are ready for takeoff in case of emergency,
or in manual (after a flight plan is uploaded) or automatic mode.
Figure 21: Overall flowchart
In a highly autonomous setup, the manual mode is planned to be used only when
an unforeseen situation occurs while the drones are already airborne. During
an emergency, or if any other path corrections are needed, they will be
entered "graphically" or by choosing from a predefined list. In other words,
pure manual control is reduced to a minimum. The standby loop is divided into
three sub-loops: weather check, emergency awaiting and check of the
inspection schedule. If the weather is bad (e.g. it exceeds the drone's
limitations) there will be no flights anyway, so this is the first check to
be done. When the weather is good, the drones stay alert for a possible
accident. If something happens, the emergency protocols are activated:
"emergency scripts" are uploaded to the drones, all required drones take off,
and further work is done under ERT control. If there is no emergency, we
check the inspection schedule. If inspections are planned, the mission plan
is uploaded to the drones; if there is nothing for a specific day, the UAVs
continue in standby mode. These emergency protocols (lists of intentions) can
also be stored in the drone's internal memory in case of communication
degradation.
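The standby loop described above can be summarized as a simple prioritized
decision sequence: weather first, then emergency, then the schedule. The
sketch below is a highly simplified illustration of that flow under those
assumptions; all function names are placeholders for subsystems that are not
specified here.

```python
import time

def standby_loop(weather_ok, emergency_pending, scheduled_mission, upload, deploy,
                 period_s=5.0):
    """
    Simplified standby loop: weather check first, then emergency handling,
    then scheduled inspections; otherwise the drones stay on their pads.
    The callables passed in are placeholders for the real subsystems.
    """
    while True:
        if not weather_ok():                  # 1) no flights in bad weather
            time.sleep(period_s)
            continue
        if emergency_pending():               # 2) emergency overrides everything else
            upload("emergency_scripts")
            deploy(all_required=True)         # further work under ERT control
        elif (mission := scheduled_mission()) is not None:
            upload(mission)                   # 3) planned inspection for the day
            deploy(all_required=False)
        time.sleep(period_s)                  # otherwise remain in standby
```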
After the plans are uploaded, the UAVs perform the actions specified by the
mission's instructions, while continuing to monitor for critical situations.
During flight, the collected data is transmitted to the "ground station",
where it is processed in real time so that a warning about a detected anomaly
can be given.
There is also a need for a hierarchy of missions, so that it is possible to
automatically switch between missions on the fly when the task in progress
has to be interrupted by one that is more important at that moment. Most
naturally, these will be situations where a "search and rescue" operation
must be initiated while an inspection is in progress. If this occurs
(especially when a person is missing), all ongoing actions are aborted and
the drones are rerouted according to the emergency scripts. To be able to
continue inspections later from a breakpoint, it is desirable to
automatically register every drone action in a flight log. When all planned
intentions are fulfilled, the drones return to their stationing positions and
perform after-flight actions: de-icing, recharging, drying and others.

4.7 Autonomy levels


In simple words, the automation level describes the degree to which the human
is replaced by the computer. Its gradation is ranked from Low, where no
computer decision-making system is involved and everything is performed
manually, to High, where no interaction from the operator's side is needed,
as can be seen in Table 6.
According to the performed literature study, there does not exist any
automated inspection solution for marine vessels today. All actions are done
manually, so we can say that offshore we are at level 1.
Table 6: Autonomy levels gradient [17]

Extremes   Gradation level and description
Low        1.  No assistance from computer; human takes all decisions and actions
           2.  Computer offers several alternatives (no decision making)
           3.  Computer narrows alternatives down to a few
           4.  Computer suggests one recommended alternative
           5.  Computer executes the suggested alternative if the human approves it
           6.  Computer executes the suggested alternative; human can veto
           7.  Computer executes the suggested alternative and informs the human
           8.  Computer executes the alternative and informs the human only if asked
           9.  Computer executes the alternative and informs the human only if it decides to
High       10. Computer decides and acts autonomously, ignoring the human

There are some projects working on inspection of power lines and buildings,
where drones fly along paths defined by manually chosen waypoints. As we can
see, the concept of autonomous inspection is quite new and some of the
required techniques are lacking on the market. So, a good starting point is
step-by-step automation, where we have the possibility to practically check
the effectiveness of the implemented processes and, if needed, fix
defects/bugs and simplify them on the fly. It can also help to define
regulations and guidelines and to outline further recommendations.
When we talk about autonomy, we mean that the drones not only perform flight
along a predefined path, but do it based on the required set of tasks and
scripts. The flight plan is calculated automatically according to a
predefined set of rules and passed to the onboard data handler; the drone
then performs takeoff and proceeds along the desired flight trajectory,
making the relevant sensing. The collected data is then sent to the ground
station and processed there, which gives much faster computations. After
fulfilling the required tasks, the drone returns to the base for charging and
the necessary procedures (preparation for the next flight). All processes
happen without any interruption from the operator's (user's) side. The
operator's role in this case is to monitor the overall status of the whole
system, intervening only if unintended situations occur or if any
extraordinary tasks have to be performed manually. The basic requirements for
the autonomy of the system can be stated as: "Automated:

• scheduling,
• path planning (3D path planning),
• deployment,
• sensing,
• collision avoidance,
• charging."

At the first stages we can start with automation of the processing of
collected data and automate the piloting algorithms later, which will allow
us to fix possible bugs. Otherwise, we can end up in a situation where the
flight logic commands the drone to move to the next waypoint while the
collected data had disturbances or the computer did not get enough time to
transmit/process all the data. When image processing is in place, we can
start to implement navigation along manually predefined waypoints and routes.
This will allow testing of the navigation algorithms and their combination
with image processing. It is preferable to start with automation of outdoor
flights first, due to the difficulties of navigation inside cargo holds and
ballast tanks. Also, even though semi-automatic and automatic flight
solutions exist onshore, there does not exist any autonomous solution for
marine operations, so testing of control laws and equipment is needed.
Automatic inspection of ballast tanks and cargo holds is set to the last
stages, because for now it is not possible to implement automatic deployment
(flight from and back to the landing pad) due to the lack of regulations and
existing technical limitations.
The summarized steps for autonomy implementation (the "autonomy gradient")
can be seen in Table 7, where by "Outdoor" we mean flights performed outside
the FPSO (external inspection) and by "Indoor" the inspection of cargo
holds/ballast tanks.
At the first stages it is better to have a crew of two members: one
responsible for maintenance, technical assistance and piloting, and another
who can interpret the collected inspection data and take care of the sensors.
At high autonomy levels, almost all functions will be done by the computer,
so the role of the operator will be to monitor ongoing processes and act only
if the computer cannot resolve a situation by itself. He will also have the
responsibility for maintenance of the drones.

4.8 Landing pad design


Besides the drones themselves, landing pads will play an important role in
maintaining system performance.
Table 7: Autonomy implementation gradient

Step  Flight autonomy                                   Inspection autonomy
1     All manual                                        Transmission of inspection data to
                                                        computer, with manual processing
2     "                                                 Automated image processing of optically
                                                        collected data, no use of manipulator
3     Outdoor: semi-auto (auto following of manually    "
      set up waypoints/routes)
      Indoor: manual
4     Outdoor: auto scheduling + same as in pkt 3       "
      Indoor: implementation of navigation
      algorithms, manual "flight in"
5     Outdoor: auto deployment                          "
      Indoor: semi-auto (auto flying, drones brought
      inside tanks/holds by hand)
6     Implementation of drone control from ERT tablet   "
7     Outdoor: fully auto flights for inspections       same as pkt 1 plus use of
      Indoor: implementation of automatic deployment    aerial manipulators

At the earliest stages there is no need for a dedicated base, since the
drones are manually operated and the operator can easily carry them by hand.
But at later stages, when automatic operation of several drones is
implemented, the UAVs will need a separate landing and storage solution with
multitasking capabilities. These types of landing pad solutions are called
"drone in a box" (DIB) and are used for stationing autonomous drones. They
take the form of a "box" where the UAVs are stored and where basic drone
maintenance is done. Besides "just storing" and protecting the drones from
external impacts (for example weather), we want it to be able to perform
simple service functions. The required minimum of such services is the
ability to recharge drones. To increase drone maintainability and
significantly reduce charging time, it would be better to use replaceable
batteries. This will certainly raise the problem of implementing such a
mechanism, especially with respect to drone positioning on the pad, not to
forget the ATEX problem, because sparks can easily occur. But once
implemented, it will let a drone be ready for flight within seconds. Another
problem we will face when flying during cold periods is icing of the drone's
blades. This is a significant problem that leads to increased drag, which
results in reduced lift and maximum angle of attack (which are important for
speed control, stability and controllability) [106]. So, an additional system
for ice and moisture removal is also necessary. It can be realized in two
ways: integrated into the drone's structure or as part of the landing pad.
For higher productivity it is better to have both, so we can remove ice
coating that is critical for flight and also remove icing on drone parts
which are not covered by the integrated solution.
For fixed-wing aircraft, several integrated de-icing methods have been
proposed: electrothermal, mechanical (pneumatic) and chemical. The pneumatic
system consists of an inflatable rubber boot on the leading edge of the wings
(Figure 22-a). If icing is detected, the boots are inflated with air, causing
the ice to break. While used on small aircraft (e.g. the Beechcraft King Air
series [107]), they are not practical for a small UAV because of their
complexity and high weight. Another alternative is to apply liquid anti-icing
chemicals prior to flight. A drawback of this method is that the chemicals
can potentially damage sensors or optics. Also, the efficiency of the
chemical agent can be reduced during flight as water dilutes it. Use of
electrothermal elements (Figure 22-b) looks promising, as it is a highly
effective and lightweight solution that can mitigate the risk of icing [11].
However, the energy efficiency of such a thermal system is quite challenging,
as it can consume a large amount of energy. One way to reduce consumption is
to heat the surface periodically (in cycles). This means that we will be able
to implement it on a fixed-wing UAV, but not on multirotors. Instead of
installing an active de-icing system on a small drone, which is challenging,
we can use a combination of sensors and artificial intelligence to detect ice
accretion before it poses a danger. It can be formed by a subsystem that uses
atmospheric sensors (to "read" the current weather conditions¹¹) in
combination with the thermodynamic principle of the surfaces and continuous
monitoring of the drone's aerodynamic behavior. Additionally, using the icing
conditions together with the current weather and the forecast, we can
"predict" accretion and therefore expand the flight plan with periodic
landings for de-icing. All this means that we need an additional de-icing
system integrated into the landing "box". Besides the capability to remove
ice, wet snow and water, it should be powerful enough to do so during short
landings between flights. It will also be important to remove ice quickly
during emergency situations.
If several types of UAVs are used (a combination of multirotors and
fixed-wing), there will be a need for different types of landing pads. Since
fixed-wing drones will not be used for close-up inspections, but rather for
general overview and during emergency situations, there is no need for a
complex storage solution for them. It is also possible to say that these are
not actually landing pads, but take-off pads (i.e. a catapult), which are not
suitable for landing.

¹¹ They can be used as a "stand-alone" icing detection system, but it will
not have a high precision rate.

Figure 22: (a) Inflatable rubber boot [10] and (b) schematic layout of the
heating zones [11]

For simplification and to avoid confusion, the term "landing pad" (LP) will
be used for the "storing boxes" of both types of drones.
To minimize the risk of damaging multirotors during take-offs and landings,
self-balancing landing platforms can be used. Their role is to counteract the
movement of the FPSO and keep the landing zone level (where touchdown and
lift-off are performed, see Figure 23). Such a mechanism will increase the
complexity of the LP, and it can also be difficult to seal the gap between
the touchdown zone and the housing, hence there is a chance for salt seawater
to penetrate inside and damage electronics or servos.
Based on what has been discussed above, we can set up two types of LP, see
Table 8. An example of a possible LP for multirotors can be seen in
Figure 23. It will contain:

• Anti-icing solution - to remove ice and snow from drones
• Firefighting solution
• Battery replacement arm - to change batteries
• End effector cleaning solution - to remove couplant from sensors
• Battery charging station and end-effector storage compartment - the idea
  is that the end effector will be changed automatically according to
  mission needs
• Separate storage/charging compartment cover - to isolate the section from
  snow/water and insulate the rechargeable batteries

Table 8: Landing pad specifications

Drone type   Landing pad description                   LP integrated functions
Multirotor   Formed as a "storing box", capable of     Charging, anti-/de-icing, end-effector
             both take-off and landing                 cleaning and changing, self-stabilizing
                                                       platform
Fixed-wing   Catapult ⇒ launching only                 Pre-flight charging and anti-icing

Figure 23: Example of a multirotor landing pad (LP)
5 Framework

5.1 Fleet configurations


Based on the specifications of the missions and drones, it is now possible to
propose a fleet setup. Using a number of drones of various types allows
different tasks to be done simultaneously, thus ensuring continuous flow and
coverage. Using one or two drones for all tasks is also possible, and it
would reduce costs, but they would not provide a sufficient flow (continuity
and coverage) of inspections; in other words, the effectiveness of such use
would be low. Another factor is energy consumption: a single drone that has
to cover bigger areas results in non-optimal power consumption due to an
increased amount of deadhead (worthless) flights, while a swarm of drones is
able to cover more tasks on the same battery capacity.
To be able to propose an optimal minimum number of drones, we need to look at
the frequencies of the use-case scenarios (Table 9). As we can see, the most
commonly proposed scenarios are related to outdoor flights, which means we
can focus mainly on them. From the drone classifications we can see that for
our needs we can use two types, fixed-wing and multirotors, which gives two
possible main configurations:

• Multirotors only
• A combination of multirotors and fixed-wing

Due to their high efficiency in prolonged flights, fixed-wing drones can be
used for search-and-rescue operations and pollution monitoring. Due to the
low probability of these missions, there is no need to have more than one of
them. If there is a need for additional monitoring drones, multirotors can be
used.
To reduce the complexity of the path and scheduling calculations, we can use
the concept of "one zone - one drone", which means that we divide the vessel
into several zones where only one drone of a specific type operates at a time
(such that we do not have two, e.g., maintenance drones in one zone at the
same time). A natural way for such a division is to use the vessel's sides:
Bow-Stern-Starboard-Port (Figure 24). To assign specific multirotors (scout
or maintenance, see Table 9, "Drone's functionality and specs.") to each
zone, we need to see what kind of tasks will be preferred in these areas. Due
to the lower battery consumption of scout drones, they can be placed at the
Port and Starboard sides, and maintenance drones can be placed at the Bow and
Stern sides (Figure 25).

Table 9: Proposed regularity of tasks and drones that could be used

Regularity   Mission (UC)              Drone's functionality and specs.        Drone types
High         Outdoor: overall survey,  "Scout" drones with optical sensors     Multirotors
             pollution monitoring      and simple grippers to transport
                                       lightweight objects
             Outdoor: close-up survey  "Maintenance" drones with aerial        Multirotor
                                       manipulator
             Indoor: ballast tanks     Small collision-tolerant (w/            Multirotors, possibly
                                       protective cage) scout drones*          underwater "snake" robots
Low          Indoor: cargo holds       Collision tolerant: "Scout" (w/         Multirotors
                                       protective cage) with optical sensors;
                                       "Maintenance" (w/ propeller guards)
                                       with aerial manipulators and optical
                                       sensors
Very rare**  Outdoor: search and       Long flight time                        Fixed-wing, multirotor
             rescue                                                            "Scout" drones

* use of aerial manipulators is not possible if a protective cage is used
** these missions are not predictable, so they fall outside the "Regularity"
classification; they are assumed to be "very rare"

This configuration of four drones in total (+1 eventual fixed-wing) can be
reduced to only three: two scouts (taking the Bow and Stern sides) and one
maintenance drone (covering both the Port and Starboard sides). This
reduction in the number of maintenance drones is possible because the
frequency of tasks related to contact-based sampling will be low, since the
vessel is newly built.
Further reduction is undesirable, because then it would not be possible to
optimally distribute the tasks and we could end up with an unbalanced
cost-performance relation (a drone would use more time flying back and forth
for recharging than for actual inspections).

Figure 24: Bow-Starboard-Port-Stern zoning

Figure 25: Example of drone configuration

Drones that will be used for indoor inspections stand a little apart due to
their characteristics: to be able to fly inside ballast tanks and cargo holds
they need to be small because of the manhole-sized hatches, and flying in
autonomously is impossible due to existing regulatory and technical
restrictions, which means the drones have to be brought there. Additionally,
the frequency of such tasks is very low, so there is no need for more than
two of them (or one, if the use of a protective cage allows the use of
sensing end-effectors).
In view of the foregoing, we can set up the table of the proposed optimal
fleet configuration, Table 10.

5.2 Flight logistics


Flight logistics establish and describe the motion and interaction of all
involved drones. Depending on how well they are organized, the overall
effectiveness of the inspections and monitoring may vary.

Table 10: Proposed fleet configurations

Multirotors' fleet
Type of drone              Operating area and function                      Minimal quantity
Multirotor: "Scout"        Outdoor; visual inspections, pollution           2
                           monitoring, search-and-rescue
Multirotor: "Maintenance"  Outdoor; visual inspections, close-up surveys    1 (2 recommended)
                           (using aerial manipulator), transportation
Multirotor                 Indoor; visual inspection, close-up inspection   1
                           (using propeller guard or protective cage that
                           would allow use of an aerial manipulator)

Combined fleet
Fixed-wing                 Outdoor; search-and-rescue and spill monitoring  1
Multirotor: "Scout"        Outdoor; visual inspections, pollution           2
                           monitoring, search-and-rescue
Multirotor: "Maintenance"  Outdoor; visual inspections, close-up surveys    1 (2 recommended)
                           (using aerial manipulator), transportation
Multirotor                 Indoor; visual inspection, close-up inspection   1
                           (using propeller guard or protective cage that
                           would allow use of an aerial manipulator)

It consists of several parts:

• Automatic scheduling - continuous operation and coverage
• Path planning - calculation of the optimal flight trajectory
• Collision avoidance - safe flight
• Navigation - provides point-to-point guidance information or position data

5.2.1 Automatic scheduling
To ensure a continuous flow of the inspections and monitoring, it is
important to solve the scheduling problem. It can be defined as the
determination of an optimal allocation of inspection and monitoring tasks to
the fleet of drones, while minimizing the overall costs, which consist of
deadheading flights between tasks. Implementation of scheduling is quite
challenging, because the operation of a UAV often deviates from the schedule
due to uncertainties [12].
Scheduling of inspection drones is very similar to scheduling public
transport (PT), where one also has to deal with similar inputs and conditions
[108]:

• a set of vehicle (in our case, drone) revenue trips to be operated,
  characterized by:
  - starting point and time
  - ending point and time
• possible layover arcs between the end of a trip and the start of a later
  trip at the same location
• possible deadhead arcs connecting:
  - the depot (in our case, the landing pads) to a trip starting point, also
    known as "pull-out"
  - a trip ending point to the depot (e.g. landing pad), also known as
    "pull-in"
  - a trip ending point to a trip starting at a different location (point)

In this project we operate with predefined inspection/monitoring paths, so
the simple duty cycle of a single drone can be presented as a chain of
predefined events connected by deadhead and layover arcs (Figure 26).

Figure 26: Simple duty cycle for one drone

Figure 27: Simulation-based scheduling system framework [12]

generated by "Mission Calculation Engine" during mission calculation, which


generates ight plan for mission with respect to physical models. Same ap-
proach for "Simulation-based scheduling framework" being proposed in [12].
Just like the "Mission Calculation Engine", it is based on "scenario gener-
ator  simulation model  visualization" architecture (Figure 27). Solving
the Scheduling problem consist of two steps:

1. Determine events, sequence and assign UAVs

2. Path calculation to guide UAV along sequence and avoid collisions

The sequence of events is calculated by a Genetic Algorithm (GA). First,
randomly picked events form a chromosome, sequenced as integers from 1 to N
(the total number of upcoming events). Then, this chromosome is divided
between drones with respect to their "time windows" (the time a drone can
spend in the air). An example of the division of a chromosome consisting of
10 tasks (events) can be seen in Figure 28. The time windows can be
calculated based on the mean power consumption of the drone and the weather
conditions, where temperature and wind strength are the main values. How many
tasks a drone can carry out on a single battery charge can be found by
summing the time required for each event (which we know, since the
specifications of most planned events are known in advance) and the time
required for the deadhead/layover arcs. After that, the GA generates several
populations and finds the best sequence division for each drone.

Figure 28: Division of randomly generated GA chromosome [12]

If we use the zone division principle, this becomes easier, as we do not need
to assign a sequence of all events to all available drones each time we want
to perform an inspection: the events are already divided by zone among the
assigned drones, so only the sequence of execution has to be calculated.
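The chromosome-splitting step can be illustrated as follows: given a randomly
ordered list of task indices and per-task durations (event time plus
deadhead/layover time), tasks are assigned to consecutive drones until each
drone's time window is filled. This is only a sketch of the division step,
not of the full GA; the durations and the window length are made-up numbers.

```python
import random

def split_chromosome(chromosome, task_duration_min, time_window_min):
    """Divide a task sequence (GA chromosome) between drones by their time windows."""
    assignments, current, used = [], [], 0.0
    for task in chromosome:
        d = task_duration_min[task]          # event time + deadhead/layover time
        if used + d > time_window_min and current:
            assignments.append(current)      # this drone's window is full
            current, used = [], 0.0
        current.append(task)
        used += d
    if current:
        assignments.append(current)
    return assignments                       # one task list per drone

# Example: 10 tasks with made-up durations, drones with a 25-minute time window.
durations = {i: random.uniform(3, 8) for i in range(1, 11)}
chromosome = random.sample(range(1, 11), k=10)   # randomly ordered events 1..N
print(split_chromosome(chromosome, durations, time_window_min=25.0))
```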
When it comes to scheduling of the indoor events, the situation becomes
slightly different, as it is not possible to fly inside (and will not be
feasible in the foreseeable future). Here, before implementation of
autonomous flight in/out, simplified scheduling will be used, generating a
timetable that contains only what has to be inspected and when.
The scheduling algorithm which will be used has to be flexible in a certain
way: if any unplanned situation occurs which causes an action to be aborted,
the following functions are needed:

• Schedule adaptation - in accordance with the occurred situation, so the
  drones continue to perform their tasks at the place where the interruption
  occurred and the schedule is updated according to the delays
• Schedule extension - in a situation when tasks cannot be resumed after the
  unexpected case is over, there should be a possibility to automatically
  update the upcoming schedule with the remaining tasks

5.2.2 Path planning
Path planning has the important function of ensuring an optimal, collision-free path in a complex environment. Since the use-cases require different trajectories, different approaches are needed for the path calculations: in situations where drones fly in complex, obstacle-rich (cluttered) environments (e.g. transporting items within the ship), more advanced 3D path planning algorithms are required. In other situations, such as inspection of the hull or flying at a constant altitude, 2D algorithms can be used.

For inspection missions we will use a set of pre-calculated patterns (waypoint sequences) for each individual inspection task. Basic inspection patterns are usually vertical or horizontal strips (the so-called zig-zag method) for flat surfaces, and a cylindrical spiral or an Archimedes spiral for curved surfaces [19] (Figure 29). In this case, the main task of the path planning algorithm

Figure 29: Basic inspection patterns: (a) strip method (b) Archimedes spiral
(c) spiral

is to optimally connect the takeoff point with the starting point of the first inspection pattern, then navigate the drone along the inspection trajectory, then connect the ending point of the first inspection pattern with the starting point of the next pattern, and so on until the ending point of the last inspection pattern is connected with the landing point (this flow mirrors the scheduling timings in Figure 26). To be able to use simpler 2D path planning algorithms, the 3D pattern has to be split into a sequence of 2D paths, with additional waypoints that connect them together and navigate the drone around obstacles. All these waypoints form a node (waypoint) grid (Figure 30); a toy example of such a shortest-path search is given after the footnote below. This approach makes it possible to use simple heuristic algorithms based on finding the shortest way between waypoints. Results in [109] show that, among the eight most popular algorithms¹², the MILP (Mixed Integer Linear Programming) algorithm gives the most satisfactory results in reasonable computation time compared to traditional A* or Dijkstra's. While the Genetic Algorithm, Potential Field and MSLAP are more effective with respect to computation time, their main disadvantage is that the probability of a non-optimal path rises with the number of nodes (waypoints).

¹² Potential Field, Floyd-Warshall, Genetic Algorithm (GA), Greedy Algorithm, Multi-Step Look-Ahead Policy (MSLAP), A*, Dijkstra's, Approximate Reinforcement Learning (RL), MILP
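To make the waypoint-grid idea concrete, the sketch below runs Dijkstra's algorithm (one of the baseline algorithms compared in [109]) over a tiny, invented node grid; the node names and distances are placeholders and not taken from the actual vessel layout.

# Illustrative shortest-path search over a waypoint (node) grid.
import heapq

def dijkstra(graph, start, goal):
    """graph: dict node -> list of (neighbour, distance) edges."""
    queue, best = [(0.0, start, [start])], {start: 0.0}
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        for neighbour, edge in graph.get(node, []):
            new_dist = dist + edge
            if new_dist < best.get(neighbour, float("inf")):
                best[neighbour] = new_dist
                heapq.heappush(queue, (new_dist, neighbour, path + [neighbour]))
    return float("inf"), []

# Toy grid: take-off pad -> two intermediate waypoints -> pattern start
graph = {
    "pad": [("wp1", 12.0), ("wp2", 20.0)],
    "wp1": [("pattern_start", 15.0)],
    "wp2": [("pattern_start", 4.0)],
}
print(dijkstra(graph, "pad", "pattern_start"))  # (24.0, ['pad', 'wp2', 'pattern_start'])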

Figure 30: Simple waypoint grid [13]

5.2.3 Collision avoidance
During automatic flights along calculated paths, or in manual mode, we can face the problem of possible collisions. The ability to detect obstacles in time and avoid them is probably the most important feature of the drones. It should handle uncontrollable weather impacts and non-constant light intensity, and ensure safe operations under different external conditions.

By a collision we mean not the actual crash, but the situation where the distance between the drone(s) and an object is less than a determined threshold (the collision radius). We can check whether that threshold is violated by calculating the difference between the position vectors of the drone and the object (r_d and r_o respectively) and comparing it to the collision radius R_c [14]:

\|\vec{r}_d - \vec{r}_o\| < R_c \qquad (1)
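Checking Equation (1) numerically is a one-line operation; in the sketch below the 2 m collision radius and the example position vectors are arbitrary illustration values.

# Direct implementation of the threshold test in Equation (1).
import numpy as np

def collision(r_drone, r_obstacle, collision_radius=2.0):
    return np.linalg.norm(np.asarray(r_drone) - np.asarray(r_obstacle)) < collision_radius

print(collision([10.0, 4.0, 3.0], [11.0, 4.5, 3.0]))  # True: separation ~1.12 m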

What simplifies the collision avoidance system is that we operate in a known area, where the locations of the majority of the obstacles are known. We can therefore divide the collision awareness based on the obstacle types, in the following way (Figure 31):

Already during the path calculation phase, we can see whether paths will intersect or pass close to each other. To reduce the chance of collision with stationary obstacles already at the computation stage, we can use the following rule: nodes formed by the intersection of paths, or areas where several paths pass close to each other, as well as a defined "safe" area around such locations, can be occupied by only one drone at a time.

To avoid "idle" hovering while waiting for these nodes to become available, the schedule must be set up such that drones are expected to visit these places in different time windows.

Figure 31: Example of obstacle gradation

If one or several drones are operated manually at the same time as other drones are in automatic flight, we can use the following approach: we define a "no-fly" zone with a certain radius around the manually operated drone, which cannot be entered by drones in automatic flight. This is probably most relevant during the early stages of automation, because "pure" manual control is preferred to be avoided as much as possible, whereas "semi-manual" control will be used.

To avoid dynamic obstacles, additional techniques with more restrictions are needed. The flow of the collision avoidance process consists of several stages (or steps): Sense→Detect→Avoid (Figure 32). At the first stage we use sensors

Figure 32: Structure of Collision avoidance system [14]

that are installed on the UAV to sense a certain surrounding area. Two types of sensors can be used for this: active and passive. The difference between them is that active sensors both emit and detect reflected electromagnetic radiation, while passive sensors only measure reflected radiation. When an obstacle is detected, the computer calculates the probability of a collision. Based on these calculations, the collision avoidance system computes and performs the actions required to avoid the threat (obstacle). After that, the drone should return to normal operation and continue according to its plan.

When it comes to deciding what types of sensing equipment should be used for obstacle detection, an individual assessment needs to be performed. On the one side we have the dependence of passive sensors (video or IR cameras) on proper light conditions and the quality of the optical sensor, but they have low power consumption and do not require installing additional equipment (since IR and video cameras are already used as part of the inspection equipment). Even so, we will need additional sensors installed, because the sensing cameras can only cover the space in front of the drone, not all 360°. On the other side is the high precision of active sensors (e.g. LiDAR), with no weather limitations but higher power consumption and an increase in the drone's total weight, which leads to shorter flight time.

5.2.4 Positioning
Outdoor navigation
Since the FPSO is always in motion, and high precision of the drones' position and attitude is needed to match the characteristics of the sensing equipment, some advanced navigation technologies must be used. Relying on GPS alone is not sufficient: tests show that the positioning precision falls radically due to interference between direct and reflected signals (so-called "multipath propagation") from the surrounding metal surfaces (the vessel) [110]. Even though we want to keep the setup as simple as possible to get a highly efficient system at low cost, there is a probability that several solutions for localization and navigation will be needed. Operating several types of drones above the Arctic Circle also plays a significant role in the setup of such systems.

Because of the difference in operation of fixed-wing and multirotor drones, we can use different navigation approaches for them (Figure 33).
For the fixed-wing drones, which will be used mostly for search and rescue, we can use the standard GPS solution, since it will not be affected by the distortion caused by operating near metal. To be able to safely land this type of drone on a helideck, an additional system is needed, such as a visual-based one. The advantage of using a visual approach for landing on a helideck is that the helideck is designed as a high-contrast object to support helicopter landings 24/7 in different environmental conditions. Its highly visible, high-contrast lighting makes it a natural reference point for landing fixed-wing drones both day and night (Figures 34-a and 34-b). The position of the vessel is always known, so in cases where the drone approaches the ship from a direction where the helideck is hidden behind other structures, it is possible to send the drone along a circling pattern around the ship until the helipad becomes visible.

Figure 33: Proposed setup of the outdoor navigation system

Figure 34: Helideck in different lighting conditions offshore: (a) night [15], (b) daylight [16]

Use of a visual-based localization system for multirotors is possible only during the summer period or when the lighting conditions are satisfactory. One possible method is to use an image matching algorithm [111]. Once features (edges) have been correctly extracted and matched to the reference images, the absolute position of the UAV can be calculated. Unfortunately, besides the light dependence, the reliability of this method depends on the distance from the object, because at short distances only a few (if any) unique edges can be captured in the image. So, this method is more applicable for situations (use-cases) where the drone flies at a certain distance from the ship: overall monitoring, transportation, search, and others.
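As an illustration of the feature-matching step behind such visual localization, the sketch below extracts and matches ORB features between a reference image and a camera frame with OpenCV; the file names are placeholders, and the subsequent pose computation (e.g. via a homography) is only indicated in a comment.

# Hedged sketch of matching camera-frame features against a reference image.
import cv2

reference = cv2.imread("reference_deck_view.png", cv2.IMREAD_GRAYSCALE)   # placeholder file
frame = cv2.imread("current_camera_frame.png", cv2.IMREAD_GRAYSCALE)      # placeholder file

orb = cv2.ORB_create(nfeatures=1000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)

# With enough good matches, a homography between reference and frame gives the
# camera (and hence UAV) pose relative to the known, georeferenced reference view.
print(f"{len(matches)} candidate feature matches")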
During Arctic nights (or under any poor light conditions) visual-based solutions can have greatly reduced performance, due to distortions and flares caused by the vessel's onboard illumination and poor feature (edge) extraction caused by insufficient lighting. To overcome these restrictions, as well as the limitations of GPS, other methods are needed. One possible solution is to use an LPS (Local Positioning System). This approach is based on a triangulation method and uses additional nodes (beacons or transmitters) installed on the vessel. Using ultra-wideband (UWB) radio, it provides centimeter-level precision and does not interfere with conventional signals due to its pulse-based high bandwidth (a bandwidth of 500 MHz, or higher than 20% of its center frequency [112]). Another positive side of a large bandwidth is its multipath resistance, which is highly relevant in our case [113]. To find the distance between a UWB transmitter and the desired object, several methods are used: Time of Arrival (TOA) and Time Difference of Arrival (TDOA) [114]. When using TOA, we calculate the distance d from each beacon (minimum four for 3D space) with formula (2) (simplified) and use (3) to find the coordinates of the object in 3D space. Here c is the speed of light, t_arrival − t_sent is the time difference between the signal being sent from a node and arriving at the drone, and [x_ref, y_ref, z_ref] is the known position of one node.

d = c\,(t_{arrival} - t_{sent}) \qquad (2)

d = \sqrt{(x_{ref} - x)^2 + (y_{ref} - y)^2 + (z_{ref} - z)^2} \qquad (3)

After we obtain a set of four equations of the form (3), one for each of at least four nodes, we can find the exact [x, y, z] coordinates of the drone by calculating their intersection; a numerical sketch is given below.
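The following minimal sketch of that intersection step linearizes the four TOA range equations pairwise and solves for the drone position with a least-squares solve. The beacon coordinates and ranges are invented test values, not a proposed beacon layout for the vessel.

# Illustrative solution of the TOA equations (2)-(3) for the drone position.
import numpy as np

def toa_position(beacons, distances):
    """beacons: (4,3) array of known node positions; distances: (4,) ranges d_i."""
    p, d = np.asarray(beacons, float), np.asarray(distances, float)
    # Subtracting the squared range equation of beacon 0 from the others
    # removes the |x|^2 term and leaves a linear system A x = b.
    A = 2.0 * (p[1:] - p[0])
    b = np.sum(p[1:]**2, axis=1) - np.sum(p[0]**2) - d[1:]**2 + d[0]**2
    return np.linalg.lstsq(A, b, rcond=None)[0]

beacons = [[0, 0, 0], [50, 0, 0], [0, 30, 0], [0, 0, 20]]   # made-up node layout
true_pos = np.array([12.0, 8.0, 5.0])
dists = [np.linalg.norm(true_pos - np.array(b)) for b in beacons]
print(toa_position(beacons, dists))   # ~[12. 8. 5.]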
The TDOA method is similar to TOA, but it is based on calculating distances from a signal sent from the drone to the nodes. The signal sent from the drone is received by two nodes, and the difference between the arrival times at each node can be used to calculate the difference in distances (Δd) between the drone and these nodes, equation (4) and Figure 35. Here [x_1, y_1, z_1] and [x_2, y_2, z_2] are the coordinates of nodes 1 and 2 respectively, and [x, y, z] are the coordinates of the drone. After the equations for all four nodes are found, it is possible to solve the system of equations to find the coordinates of the drone.

\Delta d = \sqrt{(x_2 - x)^2 + (y_2 - y)^2 + (z_2 - z)^2} - \sqrt{(x_1 - x)^2 + (y_1 - y)^2 + (z_1 - z)^2} \qquad (4)

Figure 35: Visualization of TDOA method (2D space)

Because the UWB method depends on line of sight to the beacons, it can be extended with an Extended Kalman Filter to reduce the localization estimation error when the drone is out of line of sight of the UWB beacons [113].
Indoor navigation
Navigation inside cargo holds and ballast tanks presents difficulties, because it is impossible to establish stable communication between the drone and GPS or radio beacons installed on deck. So here additional, most probably visual-based, solutions are needed. According to the research performed, no solution exists today that implements automated navigation inside ships. The metal surroundings can cause reflections and noise in the signals of active sensors, and the lack of unique characteristics of the shell plating makes it difficult to use artificial intelligence.
To solve the problem, we can apply special markings along the plating, which will help to identify the position of the drone. They can be formed as unique QR or ArUco markers (Figure 36), which can easily be interpreted by the drone's visual system even in poor lighting conditions, and give the possibility to calculate the approximate position of the drone (a minimal detection sketch is given at the end of this subsection). To improve the results we can use stereo cameras, calculating the distance and angle to these marks, which gives a precise location of the inspection drone [115]. It is also possible to use the same approach as for navigation outside the vessel: triangulation by UWB modules [116] located inside the tanks/holds.
Figure 36: Example of QR code (a) and ArUco (b)

Unfortunately, having radio-electronic components installed inside ATEX Zone-0 areas can be dangerous, so more research needs to be done in this field. One possible solution is to use (switch on) these nodes only during inspections, when the tanks are approved to be safe.
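The sketch below illustrates the marker-based positioning idea referred to above: an ArUco marker is detected and the camera pose relative to the marker is recovered with solvePnP. It assumes the legacy cv2.aruco API from the opencv-contrib package (newer OpenCV versions use cv2.aruco.ArucoDetector instead), and the marker size, camera matrix and image file are placeholder values.

# Hedged sketch of marker-based indoor positioning with ArUco + solvePnP.
import cv2
import numpy as np

MARKER_SIZE = 0.20                                          # marker edge length [m], placeholder
camera_matrix = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)

gray = cv2.imread("tank_wall_view.png", cv2.IMREAD_GRAYSCALE)            # placeholder image
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)               # legacy aruco API

if ids is not None:
    # 3D corner coordinates of the marker in its own (surveyed) frame
    s = MARKER_SIZE / 2.0
    object_pts = np.array([[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]])
    ok, rvec, tvec = cv2.solvePnP(object_pts, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    # tvec is the marker position in the camera frame; combined with the marker's
    # known location on the plating this yields the drone position in the tank.
    print("marker", ids[0][0], "at", tvec.ravel())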

5.3 UAS - subsystems and supply infrastructure
Besides the landing pads or catapults, additional equipment is needed on the vessel to ensure trouble-free operation of the drones. Basically, the supply infrastructure can be divided into two parts: remote navigation assistance and computing task offloading (Figure 37) [117].

Figure 37: Supply infrastructure – communication architecture

Remote navigation assistance is responsible for everything related to the flight logistics computations and control. It consists of a flight computer and a telecommunication unit for transmission of telemetry data. Telemetry contains information about the status of the drone's subsystems, such as [118] (a hypothetical packet layout is sketched after the list):

• Voltage of the batteries
• Current consumption
• Battery temperature
• Flight controller operating mode
• Total flight time
• Altitude
• Linear velocity
• Inertial measurement data (from gyroscope and accelerometer)
• Motors' speed (rpm)
• Current position
• Status of sensing equipment
• Current mission progress
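A hypothetical telemetry packet covering the fields above could look as follows; the field names, units and JSON serialization are illustrative assumptions, not the format of any particular flight controller or ground-station software.

# Hypothetical telemetry packet, sent in packages at fixed time intervals.
from dataclasses import dataclass, asdict
import json, time

@dataclass
class TelemetryPacket:
    drone_id: str
    timestamp: float          # UNIX time [s]
    battery_voltage: float    # [V]
    current_draw: float       # [A]
    battery_temp: float       # [deg C]
    flight_mode: str          # flight controller operating mode
    flight_time: float        # total flight time this mission [s]
    altitude: float           # [m]
    velocity: tuple           # (vx, vy, vz) [m/s]
    imu: tuple                # (gyro xyz, accel xyz)
    motor_rpm: tuple
    position: tuple           # (lat, lon, alt) or local-frame coordinates
    sensors_ok: bool
    mission_progress: float   # 0.0 - 1.0

packet = TelemetryPacket("uav-01", time.time(), 22.4, 14.8, 31.0, "AUTO",
                         412.0, 35.2, (1.2, -0.4, 0.0),
                         ((0.01, 0.00, 0.02), (0.0, 0.0, 9.81)),
                         (5200, 5180, 5230, 5195), (71.28, 20.35, 35.2),
                         True, 0.43)
print(json.dumps(asdict(packet)))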

Based on this two-way communication, the computer can automatically steer the drones, and the operator can keep track of the progress of the mission and the overall health status of the drones. To reduce power consumption and to minimize possible interference with other onboard radio equipment, telemetry data is sent in packages at time intervals. Even though the flight control computer can provide fast rerouting to avoid collisions, it is preferable to have an independent flight processor onboard the drone. It ensures collision avoidance with minimized latency, as well as accident-free flight in case of communication loss. For double redundancy, the entire path and the algorithms for how to react in possible critical situations (e.g. communication loss or signal disturbances) must be uploaded and stored in the drone's onboard memory.

The computing task offloading subsystem is responsible for all complex calculations that would overload the processing subsystems onboard the drone, or take more time due to processing limitations. As an example, this can be image or contact-sensor data processing during inspection. In this case it is better to transfer the captured data to a stationary computer and process it there to ensure fast computations. This subsystem consists of one-way telecommunication antennas which receive data from the drone, which is further passed to the data processing computer. Processed video data will also be used as visual odometry by the flight computer to correct the trajectory of the drone if needed.

6 Discussion
Even though drones have existed and been used for a long time by the military and by hobbyists, their civil use is relatively young and still growing rapidly. Being able to reach remote places without putting a crew or survey team in danger, they are used for visual inspections in the majority of industry sectors. During the last few years, interest in using UAVs in the maritime sector has also risen. Despite several challenges, their performance seems promising as innovative technologies are developed. The overall complexity of this project is that no autonomous inspection or monitoring solution for the maritime sector exists. There are only a few projects related to the use of aerial vehicles for maritime needs, as described in the Literature review section, and all of them are still at the development stage.

6.1 Technological and Regulatory gaps
Technological and regulatory gaps have been mentioned several times during this project. Unfortunately, at the present level of technology there are few possibilities to implement a fully autonomous inspection system, especially for internal inspections. This is caused not only by the incompleteness of some technologies, but also by impossibilities due to the technical structure of the ship itself. One example is the realization of automated fly-in/fly-out to ballast tanks and cargo holds: there is no possibility to get into these directly from the main deck, where all landing pads are planned to be installed. They are accessible only via manhole-sized hatches from inside the vessel. Another problem is the navigation inside those areas: the high disturbances caused by multipath propagation make it impossible to use radio-navigation methods.

Use of aerial manipulators, especially if they are to be used for NDT contact-based testing, requires additional research that would allow us to use them in more complex weather conditions. Even though drones have already been used for onshore contact surveys, these are normally done under the most optimal weather conditions, while operating offshore in most cases means flying in tough (i.e. inclement) weather.

Any professional drone operations need to be covered by laws and regulations, which bind different actors together and ensure order and a uniform view on different matters. Unfortunately, no regulation exists that covers the aspect of automated offshore drone operations. Those that exist are related to manual piloting with one flight at a time, rather than continuous flights.

6.2 Use-case scenarios
In Section 3 we presented the proposed possible use-case scenarios. Based on the present technological level of drone development, implementation of complex missions, such as the use of aerial manipulators for actual maintenance (i.e. repairs), is not possible. This is not only because it would require the use of much heavier multirotors, but also because these would have another level of power supply requirements, so traditional batteries would not be sufficient and would need to be replaced by tethers, which in turn raises the difficulty of accessing remote or cluttered areas. Another limitation that comes with the increased dimensions of such drones (not related to the tether) is that ships, particularly FPSOs, have quite a compact arrangement, which denies larger drones passage between construction elements or access through manhole-sized hatches. Additionally, all complicated repair work on the hull or other constructions is meant to be done during drydocking. Thus, having complex and heavy drones with a limited field of application is not worth the cost, so it is better to concentrate on finding rather than fixing. This means that the most common tasks will be visual-based and contact-based inspections. Visual inspections using video or IR cameras are quite simple and inexpensive, but at the same time quite effective. Their main disadvantage is that they can suffer from both poor lighting and solar radiation (during daylight). To minimize the possibility of misdetection, they will be supported by drones with contact-based sensing equipment. This kind of division by attached implements is done to reduce the number of sophisticated drones, and thus reduce costs. Another reason is that the frequency of missions where these manipulators or other special equipment will be used is expected to be low, so it is not worthwhile to carry equipment all the time that will not be used for most of the flight time. At the same time, having the ability to inspect hard-to-reach places will significantly reduce the load on the inspection team and minimize the risk of injury.

The reason for including search-and-rescue use-cases is that the FPSO will be located at a significant distance from the nearest rescue base, so in case of an emergency it can take up to eight hours before the first help arrives. In these situations, drones can become handy, as they can be deployed in minutes.

6.3 Implementation sequence
Since it is not possible to implement a fully autonomous UAV-based system right from the outset, we can propose a possible sequence for the implementation process. To achieve this, we base ourselves on the existing autonomy gradient (Tables 6 and 7). The main factor that can affect it and cause changes is the development speed of technologies and dedicated regulations. Given that the European Commission has initiated the launch of a similar project (ROBINS), which shows a high necessity to fill the existing gaps, we can expect movement in that direction in the near future.

7 Conclusion and Future work

7.1 Conclusions
This study has been carried out in order to propose possible use-case scenarios, and as an attempt to set up the general concept and framework for implementing a fully autonomous UAV-based inspection setup that could be used on FPSO vessels, mainly on Johan Castberg. To do this, research was performed on the key components that were seen as important or that could make a significant impact on the general development and implementation. Because no similar system exists, the focus was on finding possible components and algorithms, and on proposing the general flow of the potential arrangement.

The performed research shows that, in spite of the existing regulatory and technological gaps, development and further implementation of such an inspection system is feasible, with some restrictions:
1. Weather conditions in the Barents Sea may periodically not allow 24/7 operation.

2. ATEX regulations put strong restrictions on the equipment, which reduces the choice of approved equipment.

3. Autonomous inspection of the cargo holds and ballast tanks cannot be performed by UAVs if they need to take off from the main deck.

If the implementation is performed in steps as proposed in Chapter 4.7 (i.e. from simple to complex), the results proposed in this project can be used as a basis for a further Detailed Design and Analysis stage. Otherwise, this work has to be extended to cover the existing gaps first.

7.2 Future work
Defining the concept and framework of such a complex inspection system is quite challenging, and normally requires the cooperation of people with knowledge in various fields of science. Thus, this project can be seen as the basis for future development of that system. Before the regulatory side is in place, it is possible to focus on the technical aspects, both practical and theoretical.

7.2.1 Practical aspects
We relate practical aspects to the practical realization and use. Even though they can be seen as part of low-level design, it is important to have them in place to be able to extend the system's capabilities.

• To get proper possibilities for performing contact-based inspections, an additional study on the practical use of aerial manipulators and NDT testing needs to be done. This will also allow the list of use-case scenarios to be expanded, which will increase the total effectiveness.

• Another important point is positioning and navigation in GPS-denied areas, such as cargo holds and ballast tanks.

• There is a need to develop control strategies for the drones, so that contact-based inspections can be performed in rough weather.

7.2.2 Theoretical aspects
Outlining the theoretical aspects is an important part of the concept generation process. Understanding the requirements and placing the emphasis correctly can significantly reduce the probability of failure and increase the total effectiveness of the system in the future.

• In this work we have focused on the use of unmanned aerial vehicles (UAV) only. However, the opportunity to use unmanned underwater vehicles (UUV) and unmanned ground vehicles (UGV) is of no less interest.

• There is a need to perform a feasibility study for the implementation of autonomous fly-in and fly-out into the FPSO's cargo holds and ballast tanks (or to look for possibilities to use other types of unmanned vehicles), because existing regulations and technical limitations do not give any possibility for that today.

References
[1] Olje- og gass energidepartementet: Prop. 80 s (2017-2018) utbygging
og drift av johan castberg-feltet med status for olje- og gassvirk-
somheten. https://www.regjeringen.no/no/dokumenter/prop.-80-s-
20172018/id2596504/?ch=3. Visited: 15.12.2020.

[2] National Weather Service: Definition of twilight. https://www.weather.gov/fsd/twilight. Visited: 20.12.2020.

[3] PETZL, "Classification of ATEX zones". https://www.petzl.com/INT/en/Professional/Classification-of-ATEX-zones?ActivityName=Explosive-atmosphere. Visited: 20.12.2020.

[4] C. Daley, Lecture notes for Engineering 5003 – Ship Structures I, tech. rep., Memorial University, St. John's, Canada.

[5] Flyability: Elios 2 – intuitive indoor inspection. https://www.flyability.com/elios-2. Visited: 15.02.2021.

[6] A. Dehghani-Sanij, S. Dehghani, G. Naterer, and Y. Muzychka, Sea spray icing phenomena on marine vessels and offshore structures: Review and formulation, Ocean Engineering, vol. 132, pp. 25–39, mar 2017.

[7] D. Wyatt, Eagle Eye Pocket Guide. Bell Helicopter – A Textron Company, June 2005.

[8] Carbonix brochure: Next generation drone technology, 2021.

[9] J. A. Besada, L. Bergesio, I. Campaña, D. Vaquero-Melchor, J. López-Araquistain, A. M. Bernardos, and J. R. Casar, Drone mission definition and implementation for automated infrastructure inspection using airborne sensors, Sensors, vol. 18, no. 4, p. 1170, 2018.
[10] B. Alemour, O. Badran, and M. R. Hassan, A review of using con-
ductive composite materials in solving lightening strike and ice accu-
mulation problems in aviation, Journal of Aerospace Technology and
Management, 2019.
[11] R. Hann, A. Enache, M. C. Nielsen, B. N. Stovner, J. van Beeck, T. A.
Johansen, and K. T. Borup, Experimental heat loads for electrother-
mal anti-icing and de-icing on UAVs, Aerospace, vol. 8, p. 83, mar
2021.

[12] I. Sung, K. Danancier, D. Ruvio, A. Guillemet, and P. Nielsen, A design of a scheduling system for an unmanned aerial vehicle (UAV) deployment, IFAC-PapersOnLine, vol. 52, no. 13, pp. 1854–1859, 2019.

[13] B. M. Sathyaraj, L. C. Jain, A. Finn, and S. Drake, Multiple UAVs path planning algorithms: a comparative study, Fuzzy Optimization and Decision Making, vol. 7, pp. 257–267, jun 2008.

[14] J. N. Yasin, S. A. S. Mohamed, M.-H. Haghbayan, J. Heikkonen, H. Tenhunen, and J. Plosila, Unmanned aerial vehicles (UAVs): Collision avoidance systems and approaches, IEEE Access, vol. 8, pp. 105139–105155, 2020.

[15] Q-Aviation, Helipad and helideck lights. https://www.qaviation.nl/helideck-lighting-systems. Visited: 11.05.2021.

[16] ShoreConnection, Helideck monitoring products and electrical systems for offshore helidecks. https://www.shoreconnection.no/helideck-monitoring-products/. Visited: 11.05.2021.

[17] R. Parasuraman, T. Sheridan, and C. Wickens, A model for types and levels of human interaction with automation, IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 30, no. 3, pp. 286–297, 2000.

[18] Done by the author of this thesis, Literature Study – Autonomous Drone Inspection. Preliminary project, University of Tromsø, Dec. 2020.

[19] T. Rakha and A. Gorodetsky, Review of unmanned aerial system


(uas) applications in the built environment: Towards automated build-
ing inspection procedures using drones, Automation in Construction,
vol. 93, pp. 252264, 2018.

[20] C. Eschmann and T. Wundsam, Web-based georeferenced 3d inspec-


tion and monitoring of bridges with unmanned aircraft systems, Jour-
nal of Surveying Engineering, vol. 143, no. 3, p. 04017003, 2017.

[21] R. Steffen and W. Förstner, On visual real time mapping for unmanned aerial vehicles, in 21st Congress of the International Society for Photogrammetry and Remote Sensing (ISPRS), pp. 57–62, Citeseer, 2008.

[22] E. Carrara and A. Grasso, Robotics technology for inspection of ships, in 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), vol. 1, pp. 1526–1533, IEEE, 2020.

[23] European Commission, "What is Horizon 2020?". https://ec.europa.eu/programmes/horizon2020/what-horizon-2020. Visited: 15.12.2020.

[24] ROBINS, Autonomous flight capabilities for inspection of cargo holds. https://www.robins-project.eu/uib-drone/. Visited: 15.12.2020.

[25] Safety4sea,  Drone successfully inspects oil tank on FPSO.


https://safety4sea.com/drone-successfully-inspects-oil-tank-on-fpso/.
Visited: 15.12.2020.

[26] Inside Unmanned Systems, "Cyberhawk deploys UAS for first ABS class survey and inspection of an oil tanker". https://insideunmannedsystems.com/cyberhawk-deploys-uas-for-first-abs-class-survey-and-inspection-of-an-oil-tanker/. Visited: 15.12.2020.

[27] microdrones,  Microdrones MD4-1000 completes in-


spection at the biggest oil rig in the world.
https://www.microdrones.com/en/content/uav-inspection-at-the-
biggest-oil-rig-in-the-world/. Visited: 15.12.2020.

[28] DNV GL, The drone squad for ship surveys.


https://www.dnvgl.com/expert-story/maritime-impact/The-drone-
squad-for-ship-surveys.html. Visited: 15.12.2020.

[29] NDT Services, Drone inspection of offshore oil and gas constructions. https://forcetechnology.com/en/services/drone-inspection-offshore-oil-gas-constructions. Visited: 15.12.2020.

[30] D. Mellinger, Q. Lindsey, M. Shomin, and V. Kumar, Design, modeling, estimation and control for aerial grasping and manipulation, in 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2668–2673, 2011.
[31] P. J. Cruz and R. Fierro, Cable-suspended load lifting by a quadrotor
UAV: hybrid model, trajectory generation, and control, Autonomous
Robots, vol. 41, pp. 16291643, apr 2017.
[32] C. Korpela, M. Orsag, and P. Oh, Towards valve turning using a dual-
arm aerial manipulator, in 2014 IEEE/RSJ International Conference
on Intelligent Robots and Systems, pp. 34113416, IEEE, 2014.
[33] A. Ollero, G. Heredia, A. Franchi, G. Antonelli, K. Kondak, A. San-
feliu, A. Viguria, J. R. Martinez-de Dios, F. Pierri, J. Cortes,
A. Santamaria-Navarro, M. A. Trujillo Soto, R. Balachandran,
J. Andrade-Cetto, and A. Rodriguez, The aeroarms project: Aerial
robots with advanced manipulation capabilities for inspection and
maintenance, IEEE Robotics Automation Magazine, vol. 25, no. 4,
pp. 1223, 2018.

[34] P. J. Sanchez-Cuevas, P. Ramon-Soria, B. Arrue, A. Ollero, and


G. Heredia, Robotic system for inspection by contact of bridge beams
using uavs, Sensors, vol. 19, no. 2, p. 305, 2019.
[35] M. n. Trujillo, J. R. Martínez-de Dios, C. Martín, A. Viguria, and
A. Ollero, Novel aerial manipulator for accurate and robust indus-
trial ndt contact inspection: A new tool for the oil and gas inspection
industry, Sensors, vol. 19, no. 6, 2019.
[36] RobotWorx,  Grippers for Robots.
https://www.robots.com/articles/grippers-for-robots. Visited:
26.01.2021.

[37] Robotics Industrial Association, What is an end effector and how do you use one?. https://www.robotics.org/content-detail.cfm/Industrial-Robotics-News/What-is-an-End-Effector-and-How-Do-You-Use-One/content_id/9134. Visited: 26.01.2021.

[38] M. Fanni and A. Khalif, A new 6-dof quadrotor manipulation system:


Design, kinematics, dynamics, and control., IEEE/ASME Transac-
tions on Mechatronics 22.3, pp. 13151326, 2017.
[39] G. Heredia, A. Jimenez-Cano, I. Sanchez, D. Llorente, V. Vega,
J. Braga, J. Acosta, and A. Ollero, Control of a multirotor outdoor
aerial manipulator, sep 2014.

[40] Equinor, Johan Castberg. https://www.equinor.com/no/what-we-do/new-field-developments/johan-castberg.html. Visited: 15.12.2020.

[41]  ESIMO: Barentsevo more. Osnovnyye rezhimoobrazuyushchiye


faktory [Barents Sea: The main regime-forming factors]
http://esimo.oceanography.ru/esp2/index/index/esp_id/5/section_id/2/menu_id/2946.
Visited: 15.12.2020.

[42] E. Kolstad, A QuikSCAT climatology of ocean surface winds in the Nordic seas: Identification of features and comparison with the NCEP/NCAR reanalysis, Journal of Geophysical Research: Atmospheres, vol. 113, no. D11, 2008.

[43] DTN, Sea conditions guide: The north, norwegian and barents sea,
USA, 2019.

[44] Barents Watch,  Polar Lows Explained.


https://www.barentswatch.no/en/services/polar-lows-explained/.
Visited. 15.12.2020.

[45] C. Dezecot and K. J. Eik, Barents east blocks metocean design basis,
tech. rep., BaSEC, Nov. 2015.

[46] D. González-Aguilera, S. Lagüela, P. Rodríguez-Gonzálvez, and D. Hernández-López, Image-based thermographic modeling for assessing energy efficiency of buildings façades, Energy and Buildings, vol. 65, pp. 29–36, 2013.

[47] IACS, Requirements concerning survey and certification, 2020.

[48] Luftfartstilsynet, Et overblikk over det nye felleseuropeiske


regelverket. https://luftfartstilsynet.no/droner/nytt-eu-regelverk/et-
overblikkhva-skjer/. Visited: 05.02.2021.

[49] Luftfartstilsynet, Åpen kategori (open category). https://luftfartstilsynet.no/droner/nytt-eu-regelverk/apen-kategori/, 2021. Visited: 24.04.2021.

[50] Luftfartstilsynet, C-merking og klassifisering av droner. https://luftfartstilsynet.no/droner/nytt-eu-regelverk/c-merking-av-droner/. Visited: 24.04.2021.

[51] Forskrift om luftfartøy som ikke har fører om bord mv, Lovdata, 2016
(updated 2021).

[52] L. O. Stava, OM105.19 - Sikker bruk av UAS (drone). Equinor, Dec.
2020.

[53] EASA, Easy Access Rules for Standardised European Rules of the Air
(SERA), Dec. 2020.
[54] Annex to the draft commission regulation on "air operations - ops", EASA.
[55] Directive 2014/34/EU of the European Parliament and of the Council of 26 February 2014 on the harmonisation of the laws of the member states relating to equipment and protective systems intended for use in potentially explosive atmospheres (recast), Official Journal of the European Union, 2014.
[56] Directive 1999/92/EC of the European Parliament and of the Council of 16 December 1999 on minimum requirements for improving the safety and health protection of workers potentially at risk from explosive atmospheres (15th individual directive within the meaning of Article 16(1) of Directive 89/391/EEC), Official Journal of the European Communities, 2000.
[57] Forskrift om utstyr og sikkerhetssystem til bruk i eksplosjonsfarlig område, Lovdata, 2017.
[58]  Class 1/ Division 2 and ATEX Zone 2 Explained.
https://www.assured-systems.com/uk/news/article/class-1division-
2-and-atex-zone-2-explained/. Visited: 02.02.2021.

[59] Hazardous area classification and control of ignition sources, Health and Safety Executive, 2004.
[60] Intrinsically safe drone, Intrinsically Safe Store, 2019.
[61] Explosion proof drone - dist 4km - c1d1 c2d1 - nec/cec atex z1 iecex,
ATEX shop.
[62] IMO, Guidelines on the enhanced programme of inspection during sur-
veys of bulk carriers and oil tankers, res. a.744(18) ed., 1993.
[63] B. Vasconcelos de Farias and T. Antoun Netto, Fpso hull structural
integrity evaluation via bayesian updating of inspection data, Ocean
Engineering, vol. 56, pp. 1019, 2012.

[64] J. Goyet, V. Boutillier, and A. Rouhan, Risk based inspection for offshore structures, Ships and Offshore Structures, vol. 8, pp. 303–318, jun 2013.

[65] JDP, How do you measure the size of a manhole cover?.


https://www.jdpipes.co.uk/knowledge/manhole-covers/measure-
manhole-cover-size.html. Visited: 20.12.2020.

[66] J. Palomba and M. MScEcon, Unmanned aerial vehicle inspections and environmental benefits, in 15th Asia Pacific Conference for Non-Destructive Testing, 2017.
[67] M. Fingas and C. E. Brown, A review of oil spill remote sensing,
Sensors, vol. 18, no. 1, 2018.
[68] C. Yuan, Z. Liu, and Y. Zhang, Fire detection using infrared images for UAV-based forest fire surveillance, in 2017 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 567–572, 2017.
[69] T. De Kerf, J. Gladines, S. Sels, and S. Vanlanduit, Oil spill detection
using machine learning and infrared images, Remote Sensing, vol. 12,
no. 24, 2020.

[70] E. Atherton, D. Risk, C. Fougère, M. Lavoie, A. Marshall, J. Werring,


J. P. Williams, and C. Minions, Mobile measurement of methane emis-
sions from natural gas developments in northeastern british columbia,
canada, Atmospheric Chemistry and Physics, vol. 17, pp. 12405
12420, oct 2017.

[71] NAESB, Natural Gas Specs Sheet.


[72] T. R. Bretschneider and K. Shetti, Uav-based gas pipeline leak detec-
tion, in Proc. of ARCS, 2015.
[73] C. D. Rodin, L. N. de Lima, F. A. de Alcantara Andrade, D. B. Had-
dad, T. A. Johansen, and R. Storvold, Object classication in thermal
images using convolutional neural networks for search and rescue mis-
sions with unmanned aerial systems, jul 2018.

[74] M. A. Orgun and J. Thornton, AI 2007: Advances in Artificial Intelligence: 20th Australian Joint Conference on Artificial Intelligence, Gold Coast, Australia, December 2-6, 2007, Proceedings, vol. 4830. Springer, 2007.

[75] J. Lilja, V. Pynttari, T. Kaija, R. Makinen, E. Halonen, H. Sillanpaa,
J. Heikkinen, M. Mantysalo, P. Salonen, and P. de Maagt, Body-worn
antennas making a splash: Lifejacket-integrated antennas for global
search and rescue satellite system,IEEE Antennas and Propagation
Magazine, vol. 55, pp. 324341, apr 2013.
[76] J. Mendoza-Mendoza, V. J. Gonzalez-Villela, C. Aguilar-Ibanez,
S. Suarez-Castanon, and L. Fonseca-Ruiz, Snake aerial manipulators:
A review, IEEE Access, vol. 8, pp. 2822228241, 2020.
[77] X. DING, P. GUO, K. XU, and Y. YU, A review of aerial manipulation
of small-scale rotorcraft unmanned robotic systems, Chinese Journal
of Aeronautics, vol. 32, pp. 200214, jan 2019.
[78] A. Dehghani-Sanij, S. Dehghani, G. Naterer, and Y. Muzychka, Ma-
rine icing phenomena on vessels and oshore structures: Prediction
and analysis, Ocean Engineering, vol. 143, pp. 123, oct 2017.
[79] T. Rashid, H. A. Khawaja, and K. Edvardsen, Review of marine icing
and anti-/de-icing systems, Journal of Marine Engineering & Tech-
nology, vol. 15, pp. 7987, may 2016.
[80] Aerones,  DRONE Solutions. https://www.aerones.com/other/drone/.
Visited: 11.03.2021.

[81] DJI,  AGRAS MG-1. https://www.dji.com/no/mg-1. Visited:


13.03.2021.

[82] DJI, AGRAS MG-1 User Manual. DJI, 1.2 ed.

[83] M. Hassanalian and A. Abdelkefi, Classifications, applications, and design challenges of drones: A review, Progress in Aerospace Sciences, vol. 91, pp. 99–131, may 2017.

[84] Handbook of unmanned aerial vehicles, 2015.

[85] R. PS and M. L. Jeyan, Mini unmanned aerial systems (UAV) - a re-


view of the parameters for classication of a mini UAV, International
Journal of Aviation, Aeronautics, and Aerospace, 2020.
[86] B. Cheng and Z. Guo, Study on small UAVs' deep stall landing pro-
cedure, aug 2017.

[87] K. Klausen, T. I. Fossen, and T. A. Johansen, Autonomous recovery of a fixed-wing UAV using a net suspended by two multirotor UAVs, Journal of Field Robotics, vol. 35, no. 5, pp. 717–731, 2018.
[88] Q. Quan, Introduction to Multicopter Design and Control. Springer
Singapore, 2017.

[89] B. Theys, G. Dimitriadis, P. Hendrick, and J. De Schutter, Influence of propeller configuration on propulsion system efficiency of multi-rotor unmanned aerial vehicles, in 2016 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 195–201, IEEE, 2016.
[90] C. Ampatis and E. Papadopoulos, Parametric design and optimiza-
Applications of Mathematics and
tion of multi-rotor aerial vehicles, in
Informatics in Science and Engineering, pp. 125, Springer, 2014.
[91] V. Lippiello, F. Ruggiero, and D. Serra, Emergency landing for a
quadrotor in case of a propeller failure: A PID based approach, oct
2014.

[92] S. Sun, M. Baert, B. S. van Schijndel, and C. de Visser, Upset recovery


control for quadrotors subjected to a complete rotor failure from large
initial disturbances, may 2020.

[93] A. Lanzon, A. Freddi, and S. Longhi, Flight control of a quadrotor vehicle subsequent to a rotor failure, Journal of Guidance, Control, and Dynamics, vol. 37, pp. 580–591, mar 2014.

[94] S. J. Lee, I. Jang, and H. J. Kim, Fail-safe flight of a fully-actuated quadrotor in a single motor failure, IEEE Robotics and Automation Letters, vol. 5, pp. 6403–6410, oct 2020.
[95] Y. Wu, K. Hu, X.-M. Sun, and Y. Ma, Nonlinear control of quadrotor
for fault tolerance: A total failure of one actuator, IEEE Transactions
on Systems, Man, and Cybernetics: Systems, pp. 111, 2019.
[96] IACS, Z7 – Hull classification surveys, 2020.

[97] T. Sattar, H. L. Rodriguez, J. Shang, and B. Bridge, Automated NDT of floating production storage oil tanks with a swimming and climbing robot, pp. 935–942, 2006.

[98] N.-D. Hoang, Image processing-based pitting corrosion detection using metaheuristic optimized multilevel image thresholding and machine-learning approaches, Mathematical Problems in Engineering, vol. 2020, pp. 1–19, may 2020.

[99] N.-D. Hoang and V.-D. Tran, Image processing-based detection of


pipe corrosion using texture analysis and metaheuristic-optimized
machine learning approach, Computational Intelligence and Neuro-
science, vol. 2019, pp. 113, jul 2019.
[100] Mudge P.J, Tuncbilek K., Haig A.G., Non-invasive monitoring of
ships for corrosion using ultrasonic guided waves, The "ShipInspec-
tor" project, 2012.
[101] T. Nelligan, An introduction to ultrasonic thickness gaging. https://www.olympus-ims.com/en/applications-and-solutions/introductory-ultrasonics/introduction-thickness-gaging/. Visited: 23.02.2021.

[102] T. Perez and T. Fossen, Kinematics of ship motion, pp. 4558.

[103] IACS, IACS Recommendation No. 42: "Guideline for Use of Remote Inspection Techniques for Surveys", rev. 2 ed., 2016.
[104] Tablet Command, "Incident Management Software". https://www.tabletcommand.com, 2021. Visited: 14.04.2021.

[105] R. Motorin, M. Pigulsky, and O. Piskun, Russian soldier of the fu-


ture, 2018.

[106] R. Hann, A. Wenz, K. Gryte, and T. A. Johansen, Impact of atmo-


spheric icing on UAV aerodynamic performance, oct 2017.

[107] Ice Shield, "De-Icing Boots | Wing De-Icers for Aircraft, OEM and Aftermarket Fitments". https://www.iceshield.com/Products/Wing. Visited: 23.04.2021.

[108] N. Wilson, G. Sanchez-Martinez, and N. Nassir, 1.258J Public transportation systems, Massachusetts Institute of Technology: MIT OpenCourseWare, 2017. Visited: 27.04.2021.

[109] M. Radmanesh, M. Kumar, P. H. Guentert, and M. Sarim, Overview


of path-planning and obstacle avoidance algorithms for UAVs: A com-
parative study, Unmanned Systems, vol. 06, pp. 95118, apr 2018.

[110] A. Mohamed, M. Doma, and M. Rabah, Study the effect of surrounding surface material types on the multipath of GPS signal and its impact on the accuracy of positioning determination, American Journal of Geographic Information System, pp. 199–205, 2019.
[111] G. Conte and P. Doherty, An integrated uav navigation system based
on aerial image matching, in 2008 IEEE Aerospace Conference, pp. 1
10, 2008.

[112] I. Glover and R. Atkinson, Overview of wireless techniques, pp. 133,


2017.

[113] T. M. Nguyen, A. H. Zaini, K. Guo, and L. Xie, An ultra-wideband-


based multi-uav localization system in gps-denied environments, in
2016 International Micro Air Vehicles Conference, 2016.
[114] H. S. Hasan, M. Hussein, S. M. Saad, and M. A. M. Dzahir, An
overview of local positioning system: Technologies, techniques and ap-
plications, International Journal of Engineering & Technology, vol. 7,
p. 1, aug 2018.

[115] A. Zaarane, I. Slimani, W. Al Okaishi, I. Atouf, and A. Hamdoun,


Distance measurement system for autonomous vehicles using stereo
camera, Array, vol. 5, p. 100016, 2020.
[116] C. Wang, H. Zhang, T.-M. Nguyen, and L. Xie, Ultra-wideband aided
fast localization and mapping system, in 2017 IEEE/RSJ Interna-
tional Conference on Intelligent Robots and Systems (IROS), pp. 1602
1609, 2017.

[117] S. Baidya and M. Levorato, On the feasibility of infrastructure assistance to autonomous UAV systems, in 2020 16th International Conference on Distributed Computing in Sensor Systems (DCOSS), pp. 296–303, 2020.

[118] M. P. Vasylenko and I. S. Karpyuk,  TELEMETRY SYSTEM OF UN-


MANNED AERIAL VEHICLES, Electronics and Control Systems,
vol. 3, dec 2018.

Appendix A Regulation on aircraft without pilot onboard, selected paragraphs (original text in Norwegian)

§ 51. Sikkerhetsavstander, maksimal flygehøyde

All flyging må skje på en hensynsfull måte som ikke utsetter luftfartøy, personer, fugler, dyr eller eiendom for risiko for skade eller for øvrig er til sjenanse for allmennheten.
Luftfartøyet må til enhver tid være godt synlig for den som fører det.
Ved enhver flyging skal det holdes nødvendige sikkerhetsavstander. Det er ikke tillatt å fly

(a) høyere enn 120 meter over bakken eller vannet

(b) nærmere enn 150 meter fra folkeansamling på mer enn 100 personer

(c) nærmere enn 50 meter fra personer, motorkjøretøy eller bygning som
ikke er under pilotens og fartøysjefens kontroll.

Luftfartøy som har en MTOM på 250 gram eller mindre, kan flys VLOS, EVLOS eller BLOS, men ikke høyere enn 50 meter over bakken eller vannet. Sikkerhetsavstandene i andre ledd bokstav b og c gjelder ikke.
Flyging ut over det som følger av sikkerhetsavstandene i andre og tredje ledd, kan bare utføres av RO 3-operatør i tråd med bestemmelsene i kapittel 9 og for øvrig de vilkår som er gitt i tillatelsen.

§ 56. BLOS

Flyging BLOS er kun tillatt hvis tillatelsen fra Luftfartstilsynet omfatter denne operasjonstypen.

§ 57. BLOS-flyging opp til 120 meter i luftrom klasse G

BLOS-flyging opp til 120 meter i luftrom klasse G eller luftrom klasse G med etablert Radio Mandatory Zone (RMZ), kan kun skje hvis det er utstedt NOTAM for å informere om aktiviteten. NOTAM skal være utstedt minst 12 timer før aktiviteten påbegynnes.
BLOS-flyging i luftrom klasse G med etablert Radio Mandatory Zone (RMZ) kan i særlige tilfeller likevel skje etter tillatelse fra flygeinformasjonstjenesten og på de vilkår som flygeinformasjonstjenesten setter. Flygeinformasjonstjenesten kan kun gi tillatelse til slik flyging hvis det er klart at flygingen kan gjennomføres sikkert og uten å hindre øvrig lufttrafikk.

§ 58. BLOS-flyging opp til 120 meter i kontrollert luftrom

BLOS-flyging opp til 120 meter i kontrollert luftrom kan kun skje i aktive fare- eller restriksjonsområder.
BLOS-flyging kan unntaksvis skje utenfor fare- eller restriksjonsområde, etter klarering fra flygekontrolltjenesten og på de vilkår som flygekontrolltjenesten setter. Klarering skal kun gis hvis det kan etableres tilfredsstillende atskillelse mellom luftfartøyet som ikke har fører om bord og ethvert annet luftfartøy.

§ 59. Påbudt lys

For all flyging BLOS skal luftfartøyet være utrustet med lavintense lys, hvitt med minst 10 candela, hvor blink fremkalles ved roterende lys (strobelys) og med minimum 20 blink i minuttet.

§ 60. Flyging i mørke

Ved flyging i mørke skal luftfartøyet ha belysning i samsvar med kravene i forordning (EU) nr. 923/2012, SERA.3215, gjennomført ved forskrift 14. desember 2016 nr. 1578 om lufttrafikkregler og operative prosedyrer.

