Ship Sensors
R. Glenn Wright
Cover image: © Johnny Haglund/Getty Images
First published 2024
by Routledge
605 Third Avenue, New York, NY 10158
and by Routledge
4 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2024 R. Glenn Wright
The right of R. Glenn Wright to be identified as author of this work has been
asserted in accordance with sections 77 and 78 of the Copyright, Designs and
Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or utilised
in any form or by any electronic, mechanical, or other means, now known or
hereafter invented, including photocopying and recording, or in any information
storage or retrieval system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or registered trademarks,
and are used only for identification and explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
Names: Wright, R. Glenn, author.
Title: Ship sensors : conventional, unmanned and
autonomous / R. Glenn Wright.
Description: New York, NY : Routledge, 2024. |
Includes bibliographical references and index.
Identifiers: LCCN 2023042461 | ISBN 9781032456218 (hbk) |
ISBN 9781032456225 (pbk) | ISBN 9781003377900 (ebk)
Subjects: LCSH: Ships–Electronic equipment. |
Intelligent sensors. | Sensor networks.
Classification: LCC VM480 .W75 2024 |
DDC 623.8/504–dc23/eng/20231108
LC record available at https://lccn.loc.gov/2023042461
ISBN: 978-1-032-45621-8 (hbk)
ISBN: 978-1-032-45622-5 (pbk)
ISBN: 978-1-003-37790-0 (ebk)
DOI: 10.1201/9781003377900
Typeset in Sabon
by Newgen Publishing UK
Contents
2 Engineering Sensors 11
2.1 Ship Function, Health and Performance 11
2.2 Types, Functions and Applications 12
2.3 Performance Parameters 14
2.4 Engineering Sensors 15
2.4.1 Fluid and Bulk Level 15
2.4.2 Fluid and Gas Flow 16
2.4.3 Pressure 17
2.4.4 Temperature 18
2.4.5 Humidity 18
2.4.6 Position 19
2.4.7 Vibration 20
2.4.8 Gases 21
2.4.9 Fire 21
2.4.10 Microphones 22
2.4.11 Video Cameras 22
2.5 Sensor Interfaces, Controls and Panels 23
2.5.1 Cables and Connectivity 23
2.5.2 Power Supplies 24
2.5.3 Signal Conditioning 24
2.5.4 Displays and Computers 26
2.6 Sensor Relationships and Engineering System Automation 27
3 Navigation Sensors 28
3.1 Sensor Use in Navigation 29
3.2 Overview of Navigation Sensor Functionality 30
3.3 Types of Navigation 33
3.3.1 Satellite Navigation 33
3.3.2 Radar Navigation 36
3.3.3 Piloting 41
3.3.4 Radio Navigation 43
3.3.5 Dead Reckoning 45
3.3.6 Celestial Navigation 47
3.3.7 Weather Instrument Use in Navigation 47
3.3.8 Other Sensor Data 48
3.4 The IMO e-Navigation Initiative 49
4 Cargo Sensors 52
4.1 Ships and Cargos 53
4.2 Containerized Cargo 54
4.2.1 Container Hold Sensors 54
4.2.2 Shipping Container Sensors 55
4.2.3 Parametric Rolling 56
4.3 Bulk Cargo 57
4.3.1 Cargo Liquefaction 57
4.3.2 Explosion 58
4.3.3 Unpreparedness to Fight Fires 59
4.4 Liquid Tank Cargo 59
4.4.1 Cargo Tank Overheating 59
Index 236
Preface
Ship sensors are being viewed from a new perspective that considers the
fusion of their data, capabilities and performance through a comprehensive
approach that sees every aspect as a cohesive entity rather than a collection of isolated sensors and systems. Driven by revolutionary technological advances arising from vessel autonomy initiatives, new sensor capabilities can enhance seafarer awareness of vessel status, identify trends in vessel performance and the surrounding environment that improve the potential for voyage success, and detect conditions that hinder or may jeopardize successful voyage completion. Combined with artificial intelligence, advanced computing architectures
and enhanced communications, the management and control of shipboard
systems can achieve broad new capabilities to improve safety for tradition-
ally staffed vessels and implement new strategies to achieve full independence
for autonomous vehicles.
This book on ship sensors logically follows the author’s previous book,
Unmanned and Autonomous Ships, providing a detailed look at the various
organs and appendages comprising a vessel’s architecture that makes possible
partial and even full automation of various onboard functions. However,
sensors are increasingly prevalent throughout all levels and functions of con-
ventionally staffed ships, making this topic vitally relevant for all vessels. The
ultimate goal is to provide a window into the present and future of sensor
architectures and configurations that can enhance vessel performance and safety and further improve the safety of navigation. In an attempt to eliminate
potential confusion, unmanned and autonomous watercraft of all types and
sizes are referred to as “vehicles” to distinguish them from the traditional use
of the word “ships” throughout the chapters of this book. The word “ves-
sels” can apply to both ships and vehicles.
Topics covered include traditional and expanded sensor functions in engin-
eering and navigation as well as new sensor capabilities integrated into occu-
pied ship’s spaces and cargos that can provide greater insight into vessel
behavior and performance and enhance awareness of passenger, crew and
other human activities. Background is provided regarding typical legacy
vessel sensor configurations and current IMO requirements for onboard
sensors and future regulatory trends. Discussed also are modern sensors
and current-generation “smart” sensors that provide enhanced situational
awareness to watchstanders. A vision of next-generation sensors currently
being investigated for shipboard use is provided along with long-term trends
in quantum sensing and computing that promise radical, revolutionary and
game-changing alterations across a wide variety of vessel functions. A topic
unique to this discussion is that of sensor degradation in terms of what it is,
how it may be detected and the potential for overcoming the many limita-
tions that result from its occurrence.
Additional topics include the means by which sensor data communica-
tion is accomplished and how data fusion can do far more to enhance situational awareness and safety of navigation than the sum of
the individual sensors themselves. Insight is provided concerning cybersecu-
rity essential to all sensor systems, the means by which sensor data can be
protected and how confidence can be assured in the information they pro-
vide across an operational environment. Also discussed is how sensor system
design can be verified with respect to the requirements they are intended to
fulfill, validated to ensure they actually accomplish these requirements and
tested to determine their resilience and robustness under a wide variety of
conditions.
Two definitions are offered to help provide context for the discussions in
this book:
Smart Sensor2
Notes
1 www.merriam-webster.com/dictionary/automaton
2 www.pcmag.com/encyclopedia/term/smart-sensor
Chapter 1
Introduction to Ship Sensors
DOI: 10.1201/9781003377900-1
This book describes a ship’s senses of vision, hearing, smell, touch and taste,
which are vital to how a vessel and her crew, human and cyber, interpret and
react to things that are going on within and to the external world outside the
hull, superstructure and on deck. Events taking place may aid or hinder the
goal of planning, undertaking and completing a voyage. Watchstanders on
the bridge ensure the ship can make way and doesn’t bump into things en
route. Crewmembers in the engineering department are entrusted with pro-
pelling the ship through the water as well as operating and maintaining the
various onboard systems that make the ship come to life. Deckhands ensure
the safe and efficient loading, overseeing, and unloading of cargos and any
passengers that may be along for the ride, except when the number of passengers exceeds the size of the crew, in which case hospitality staff are entrusted with the duties of passenger safekeeping and morale.
To illustrate the roles played by human beings and ship sensors, two short
stories depicting different points of view of the same sequence of events are
told. The first describes an observer who assigns their own thoughts and feel-
ings to events as they unfold. The second describes the underlying processes
to interpret these events and make appropriate decisions and take action.
One is very human, the other a matter of fact.
glow appeared in the fog and gradually her sleek lines were revealed.
White, then red shone through the silver apparition.
I knew her name, where she was going and where she was from.
She moved with purpose intent on making her way, knowing I was
there but she would not stay.
Cold and indifferent, no crew is aboard. Emotionlessness without feel-
ings, but nothing untoward.
Neither a ghost from the past, nor anything old. Just an automaton of
today. A ship without a soul.
This, of course, is a story of two ships passing in the night except with a
modern twist where one vessel is unmanned and autonomous, operating
and navigating using her own resources. The ship’s presence and identity
are announced to others using the Automatic Identification System (AIS), with
her every move choreographed by an automated captain viewing the world
through electronic eyes, ears and other senses based on the knowledge of
what has been learned from terabytes of training data. However, none of
this is possible without conventional and unmanned vessels being laden
with sensors to provide full-time situational awareness across all engineering
functions throughout the ship and navigation functions below, on and above
the sea during the voyage. Both vessels in this story have benefitted from
advances in ship automation. It is hoped that the contents of this book help
to bring these concepts into view in a logical manner.
Spectrum | Frequency Range | Frequency Units | Sub-Band | Maritime Navigation Sensors
RADIO | 30–300 kHz | Low Frequency (LF) | Human hearing: 20 Hz–20 kHz | Sonar, Acoustic underwater communication, Engineering, eLoran
RADIO | 300 kHz–3 MHz | Medium Frequency (MF) | | MF SSB, GMDSS, Sonar, RDF, NAVTEX, Engineering
RADIO | 3–30 MHz | High Frequency (HF) | | HF SSB communications, GMDSS, RDF
RADIO | 30–300 MHz | Very-High Frequency (VHF) | | VHF communications, EPIRB, VTS, GMDSS, AIS, RDF, APT
RADIO | | | 200–250 MHz G-band | Few maritime services, possibly Radar
RADIO | | | 250–500 MHz P-band | EPIRB Satellite Signal, SAR
RADIO | 300 MHz–3 GHz | Ultra-High Frequency (UHF) | 500–1,500 MHz L-band | GNSS, Mobile Phone, Inmarsat, Weather, WAN, Cargo, EPIRB
RADIO | | | 2–4 GHz S-band | GNSS, Satellites, Weather, Microwave, Radars, LAN, Bluetooth
RADIO | 3–30 GHz | Super-High Frequency (SHF) | 4–8 GHz C-band | Communication and Weather satellites, Wireless, Cargo
RADIO | | | 8–12 GHz X-band | Meteorological, Earth Observation/Comm. Sats, Radar, SART
RADIO | | | 12–18 GHz Ku-band | Communication and Weather satellites
RADIO | | | 18–27 GHz K-band | Communication satellites, mmRadar
RADIO | 30–300 GHz | Extreme-High Frequency (EHF) | 27–40 GHz Ka-band | Communication and Weather satellites
RADIO | | | 40–75 GHz V-band | Radars, Communication satellites
RADIO | | | 75–111 GHz W-band | mmRadar
OPTICAL | 300 GHz–400 THz | Infrared Light (IR) | | Infrared Sensors and Cameras, Lidar
OPTICAL | 400–750 THz | Human Visible Light (VIS) | | Cameras
Source: ITU 2015. Nomenclature of the frequency and wavelength bands used in telecommunications. Recommendation ITU-R V.431-8 (08/2015). International Telecommunications Union, Geneva.
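Because the designations above are fixed frequency ranges, mapping a signal to its band is a simple lookup. The following sketch is illustrative only and covers just the top-level ITU designations from the table; the function name and the example frequencies are assumptions.

```python
# Minimal sketch (not from the book): map a frequency in Hz to the ITU band
# designations listed in the table above. Only the top-level rows are included.
ITU_BANDS = [
    (30e3, 300e3, "Low Frequency (LF)"),
    (300e3, 3e6, "Medium Frequency (MF)"),
    (3e6, 30e6, "High Frequency (HF)"),
    (30e6, 300e6, "Very-High Frequency (VHF)"),
    (300e6, 3e9, "Ultra-High Frequency (UHF)"),
    (3e9, 30e9, "Super-High Frequency (SHF)"),
    (30e9, 300e9, "Extreme-High Frequency (EHF)"),
]

def itu_band(frequency_hz):
    """Return the ITU designation covering frequency_hz, or None if outside the table."""
    for low, high, name in ITU_BANDS:
        if low <= frequency_hz < high:
            return name
    return None

print(itu_band(161.975e6))  # AIS channel 1 -> Very-High Frequency (VHF)
print(itu_band(9.4e9))      # X-band marine Radar -> Super-High Frequency (SHF)
```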
Chapter 2
Engineering Sensors
DOI: 10.1201/9781003377900-2
oil as well as monitoring exhaust gas and cylinder liners. Significant advances
have also been made using new technologies for level sensing, water ingress,
pressure monitoring, winch torque and load monitoring, and many other
engineering sensing applications.
In addition to the above, other inputs from the engineering environment
include motion, moisture, torque, velocity, acceleration, pressure, audio, and
video. These inputs are often represented as a function of range, intensity,
volume, resistance, conductance, and other characteristics of mechanical and/
or electrical (analog and/or digital) signals created by sensors and sent dir-
ectly to displays or indirectly using one or more data communication buses.
Such functions form the underpinnings of engineering sensors for onboard
systems that indicate system performance characteristics such as vacuum,
quantity, liquid and gas flow, voltage and electrical current flow. The wide
range of measured parameters and characteristics of data obtained from sen-
sors used in engineering applications prohibit detailed discussion short of a
full text dedicated exclusively to this task. However, generalizations can be
made based on sensor function.
Static and dynamic parameters are often a reflection of the quality of the
sensor. Sensors can also be classified as being either analog or digital. Analog
sensors produce an output that is continuous over time in proportion to the
input, whereas digital sensors produce pulses or digital words corresponding
to the input. Analog or digital sensors whose output alternates between two fixed levels are binary sensors that effectively function as on-off switches.
Analog signals are usually converted to digital for transmission and analysis.
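As a rough illustration of the conversion just mentioned, the sketch below quantizes an analog sensor voltage with an assumed 10-bit, 5 V converter and models a binary sensor as a simple threshold; the values are illustrative only and not drawn from the text.

```python
# Illustrative sketch only: quantize an analog sensor voltage with an assumed
# 10-bit, 5 V reference ADC, and model a binary (on-off) sensor as a threshold.
def adc_code(voltage, v_ref=5.0, bits=10):
    """Convert a 0..v_ref analog voltage into an integer ADC code."""
    levels = 2 ** bits
    clamped = min(max(voltage, 0.0), v_ref)
    return min(int(clamped / v_ref * levels), levels - 1)

def binary_sensor(voltage, threshold=2.5):
    """A two-level (on-off) sensor: True above the threshold, False below."""
    return voltage > threshold

print(adc_code(1.37))        # -> 280 of 1,024 possible codes
print(binary_sensor(1.37))   # -> False (below the assumed 2.5 V switching point)
```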
In addition to cost, the key considerations for determining the proper type
of level sensor to be used in any application include the characteristics of
• Laser Doppler: This sensor uses a continuous wave laser combined with
a laser detector to determine the change in wavelength caused by the fluid
or gas being measured as a velocity function.
• Mass: The flow rate of gas or fluid mass is often measured to determine
fuel-to-air ratios for proper combustion.
• Vortex: Flow measurements are based upon the creation rate of eddies
and vortexes in proportion to the velocity of gas or fluid flow passing
around a fixed object.
• Turbine: The gas or liquid flow rate through a propeller or turbine will
cause it to turn or spin at a speed directly proportional to the flow.
Key considerations for determining the proper type of flow sensor include
the characteristics of the gas or liquid to be measured in terms of purity, vis-
cosity, density, liquidity or slurry, the pressures and temperatures at which
they operate, and the properties and characteristics of the path in which the
sensor is placed and operates.
2.4.3 Pressure
Pressure is measured by converting physical energy exerted against a surface
into an electrical signal that is proportional to the force. Various types of
pressure sensors exist using different technologies to make measurements.
Pressure measurements are taken at various locations on a ship in support
of functions such as engine fuel and lubricating oil, manifold vacuum and
coolant; potable water, firefighting equipment, hull strain and fatigue,3,4 shaft
torque, and winch loads. Sensors commonly found on ships that are used to
measure pressure include:
2.4.4 Temperature
Temperature is an expression of how hot or cold something is compared
to a scale. Temperature measurements made throughout a ship include
engine coolant, cargo, exhaust, passenger and crew quarters, and food freez-
ers. Both contact and noncontact measurements may be taken. Examples of
different types of temperature sensors include:
2.4.5 Humidity
Humidity represents the concentration of water vapor that exists within a
gas. Absolute humidity refers to the actual amount of water present in terms
2.4.6 Position
This term is defined within the context of engineering departments on ships
as being the location of something in a particular space or relation to another
object or reference. Position sensors are designed to detect movement and
convert this into signals for transmission, reporting, or control. Examples
include rudder position to port or starboard of the ship’s centerline; throt-
tle being open, closed, or somewhere in between; and the detection of limits
such as high vs. low, on vs. off, up vs. down, open vs. closed, etc. They can be
divided into linear, rotary, and angular categories. Several different types of
position sensors are likely to be installed throughout most ships and vessels.
Common types of position sensors include:
2.4.7 Vibration
Monitoring machinery vibration is an essential element of detecting faults and predicting future component failures. Vibration often stems from mis-
alignment, imbalance, and wear on engines, motors, couplings, pumps, and
other machinery. Measurement is accomplished mostly through the use of
strain gauges and piezoelectric accelerometers that can sense back-and-forth
motion. Vibration can be considered as being of two types:
• Axial: Vibration that is in line with the thrust projected from the
machinery.
• Radial: Vibration that radiates perpendicular to the line of thrust.
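One common way to reduce the accelerometer measurements described above to a single figure for trending and alarming is the root-mean-square of the sampled acceleration. The sketch below is illustrative only, with invented samples and an assumed alarm limit; it is not a method prescribed in the text.

```python
# Illustrative sketch: summarize piezoelectric accelerometer samples as an RMS
# value and compare it against an assumed alarm limit. Values are made up.
import math

def rms(samples):
    """Root-mean-square of a sequence of acceleration samples (m/s^2)."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

bearing_accel = [0.2, -0.4, 0.5, -0.3, 0.6, -0.5, 0.4, -0.2]  # m/s^2, hypothetical
ALARM_LIMIT = 0.7  # m/s^2, illustrative threshold only

level = rms(bearing_accel)
print(f"vibration RMS = {level:.2f} m/s^2, alarm = {level > ALARM_LIMIT}")
```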
2.4.8 Gases
Exposure to hazardous gases and deficiencies in essential gases in confined
spaces onboard ships have gained significant attention in the press and from
regulatory authorities.6,7,8 Hazard sources include leaking gases and liquids,
cleaning fluids, combustion (e.g., welding), dust concentrations and decom-
posing organic matter. Among the most significant gases monitored are:
• Oxygen (O2): Depletion of oxygen, especially in cargo holds, has been the
cause of many deaths.
• Carbon Monoxide (CO): Caused by the accumulation of exhaust gases
from combustion.
• Hydrogen Sulfide (H2S): Usually caused by leakage of blackwater tanks.
• Lower Explosive Limit (LEL): Combustible gases such as methane and
natural gas.
Many permanently installed and portable gas detectors are available for
detecting these and other gas hazards onboard ships. Multiple gases are often
detectable using the same device.
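In practice, a multi-gas detector compares each reading against an alarm setpoint. The sketch below illustrates that logic; the setpoint values are assumptions chosen for illustration and are not taken from this book or from any regulation.

```python
# Illustrative sketch: check multi-gas detector readings against alarm setpoints.
# The setpoint values below are assumptions for illustration only.
SETPOINTS = {
    "O2_pct_min": 19.5,   # alarm if oxygen falls below this concentration
    "CO_ppm_max": 35.0,   # alarm if carbon monoxide exceeds this level
    "H2S_ppm_max": 10.0,  # alarm if hydrogen sulfide exceeds this level
    "LEL_pct_max": 10.0,  # alarm if combustible gas exceeds this % of the LEL
}

def gas_alarms(o2_pct, co_ppm, h2s_ppm, lel_pct):
    """Return the list of alarm conditions raised by one set of readings."""
    alarms = []
    if o2_pct < SETPOINTS["O2_pct_min"]:
        alarms.append("oxygen deficiency")
    if co_ppm > SETPOINTS["CO_ppm_max"]:
        alarms.append("carbon monoxide")
    if h2s_ppm > SETPOINTS["H2S_ppm_max"]:
        alarms.append("hydrogen sulfide")
    if lel_pct > SETPOINTS["LEL_pct_max"]:
        alarms.append("combustible gas (LEL)")
    return alarms

print(gas_alarms(o2_pct=18.9, co_ppm=5.0, h2s_ppm=0.0, lel_pct=12.0))
# -> ['oxygen deficiency', 'combustible gas (LEL)']
```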
2.4.9 Fire
Fires on board ships are one of the most dreaded occurrences, costing many
lives and billions of dollars in damage and losses annually. Many nations have
adopted IMO regulations under the Safety of Life at Sea (SOLAS) conven-
tion that specifically address shipboard fire detection capabilities.9 All ships
are subject to fire, but car-carrying vessels, in particular, have experienced
even greater exposure as a result of carrying vehicles that contain not only
combustible fuel but also lithium-ion batteries that are subject to seemingly
spontaneous combustion.10 Three different types of sensors are available that
are capable of detecting the characteristics of fire. These include:
Smoke is the most likely killer of people as toxic fumes may be emitted
over a period of hours before the presence of flame or heat can be detected.
Smoke detectors use a photo-electric sensor to visually detect particulates
or a radioactive isotope as an ionization source to measure electrical con-
ductivity between particles. Flame detectors use optical sensors to rapidly
detect ultraviolet and/or infrared emissions associated with fire. Heat detec-
tors are temperature sensors calibrated to a threshold that corresponds to a
probability of fire. They are particularly useful in environments laden with
dust and where high moisture levels exist. Many fire detectors contain two
or more combinations of sensors. In addition to spaces where fire detection
equipment is mandatory, car-carrying vessels are also adding networks of fire
detectors throughout their cargo decks to detect fires with greater specificity
as to location to further speed response times and fire suppression efforts.
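Because many detectors combine two or more sensing principles, alarm logic often requires agreement between them before declaring a fire. The following sketch of simple voting logic is illustrative only and does not describe any particular detector or regulation.

```python
# Illustrative sketch: combine smoke, flame and heat indications so that any
# single sensor raises a warning but two or more raise a fire alarm.
def fire_state(smoke_detected, flame_detected, heat_exceeded):
    """Classify the space based on how many sensing principles agree."""
    votes = sum([smoke_detected, flame_detected, heat_exceeded])
    if votes >= 2:
        return "FIRE ALARM"
    if votes == 1:
        return "WARNING - verify locally"
    return "normal"

print(fire_state(smoke_detected=True, flame_detected=False, heat_exceeded=False))
print(fire_state(smoke_detected=True, flame_detected=False, heat_exceeded=True))
```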
2.4.10 Microphones
Microphones installed in the engine room and in other locations that are
usually unoccupied can provide great insight into adverse situations as they
develop. Easily detected using a simple live microphone connected to a remote
speaker are vibrations, bearing squeals, clanking noises and other audible
clues to actual or pending equipment malfunctions with which engineering
staff are already familiar but may otherwise go unnoticed in the absence of
staff within these spaces.
Sound monitoring devices are also available that can continuously monitor
and record vibration, sound and noise levels, and air overpressure. These
devices may be equipped with extended dynamic range microphones that
can detect sounds outside the human range of hearing, are water-resistant
and ruggedized to withstand harsh conditions. Monitoring can be performed
continuously for days and weeks at a time, and thousands of events may
be recorded when equipped with sufficient memory. Some such devices are
standalone units equipped with internal batteries, while others are operated
under the ship’s power with battery backup that is network-connected to
supplement other alarms and system capabilities.
Some cameras have extended bandwidth capabilities to cover both the visible
and infrared spectrums, or two cameras with complementary capabilities can
be installed. This can be advantageous for identifying machinery that is run-
ning hot. Video monitoring can be performed continuously and recorded for
days or weeks at a time over many events, usually at a location remote from
the cameras connected via a network.
• Cables and Connectivity: The channels and conduits through which sen-
sor data is communicated.
• Power Supplies: Delivery of electricity to provide power to sensors and
sensor networks.
• Signal Conditioning: Amplification, filtering, level changing and digitiza-
tion of sensor signals.
• Displays and Computers: Destination for sensor data processing, alarm
generation and user interfaces.
between the two dividers. When the balance between the dividers is the
same there is no voltage difference. However, when the resistance of the
sensor changes, the output voltage changes as a function of the measured
value. In most cases, the magnitude of the voltage is small and requires amp-
lification. This analog voltage needs to be converted into digital format to
be useful. Bridge circuits generally feature low power consumption in the
microamp (µA) to milliamp (mA) range and are often integrated directly
into the sensor.
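The bridge behavior described above reduces to the difference between two voltage-divider outputs. The following minimal sketch assumes a single active sensing element and illustrative component values; it is not drawn from the text.

```python
# Illustrative sketch of the bridge output described above: two voltage dividers
# excited by the same supply, with the sensor forming one arm of one divider.
def bridge_output(v_excitation, r1, r2, r3, r_sensor):
    """Difference between the two divider voltages (volts)."""
    return v_excitation * (r_sensor / (r3 + r_sensor) - r2 / (r1 + r2))

V_EX = 5.0   # excitation voltage, assumed
R = 350.0    # nominal arm resistance, typical of a strain gauge bridge (assumed)

print(bridge_output(V_EX, R, R, R, R))          # balanced bridge -> 0.0 V
print(bridge_output(V_EX, R, R, R, R * 1.002))  # 0.2% sensor change -> ~2.5 mV,
                                                # small enough to need amplification
```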
Another example is capacitance sensors, where capacitance values are con-
verted directly into digital words at the sensor using an integrated circuit
(IC) with correction and compensation for offset, sensitivity and drift made
according to the specific characteristics of the sensor itself. This applies to
single capacitive sensors where both terminals are accessible and differential
capacitive sensors. Capacitance sensors also feature low power consumption
in the microamp (µA) to milliamp (mA) range.
In many cases, the signals generated by the sensor itself and available at its
output terminals are not directly compatible with circuit path characteristics
to transmit sensor data, nor are they at the proper levels or in the correct
format. It is, therefore, necessary to further process sensor signals for them to
be useful. Several more common types of sensor signal processing techniques
are listed in the following paragraphs:11
References
1 AllICData. What Are the Static and Dynamic Characteristics of the Sensor?
allicdata.com. Updated 21 November 2021. www.allicdata.com/news/sensor/
what-are-the-static-and-dynamic-characteristics-of-the-sensor.html
2 Tom Kenny. Sensor Technology Handbook. Chapter 1, Sensor Fundamentals.
Elsevier, Oxford, UK. 2005. Edited by Wilson, Jon S., pp. 2–5.
3 Sudripto Khasnabis. Different Technologies to Measure Hull Stresses in
Ships. Marine Insight, Naval Architecture. 16 September 2019. www.marine
insight.com/naval-architecture/different-technologies-to-measure-hull-stres
ses-in-ships/
4 Y. Takaoka, K. Nihei, P. Vargas, P. Aalberts, and M.L. Kaminski. Application
of Fatigue Damage Sensors in the Monitas System. Offshore Technology
Conference (OTC), Houston, TX, USA. 30 April 2010. https://doi.org/10.4043/
20870-MS
5 ByJu’s.com. Unit of Vibration. https://byjus.com/physics/unit-of-vibration/
#vibration-units
6 Maritime Executive. Two Killed in Gas Leak Aboard Chinese Bulker. 23 April
2018. https://maritime-executive.com/article/two-killed-in-gas-leak-aboard-
chinese-bulker
7 Maritime Executive. Police Inspect Sydney Harbor Cruise Boats After Passenger
Fatality. 7 February 2019. www.maritime-executive.com/article/police-insp
ect-sydney-harbor-cruise-boats-after-passenger-fatality
8 Cargo and Cargo Hold Ventilation. INTERCARGO, the Standard Club and
DNV-GL. January 2021. www.dnv.com/maritime/publications/Cargo-and-
Cargo-Hold-ventilation-guidance-download.html
9 IMO. Part IV. SOLAS Chapter II-2. Construction – Fire Protection, Fire
Detection and Fire Extinction. International Maritime Organization. London.
10 Valdes-Dapena, Peter. Burned Ship Carrying Luxury Cars Has Now Sunk.
CNN Business. 2 March 2022. www.cnn.com/2022/03/02/business/felicity-ace-
car-ship-sunk/index.html
11 David Ashlock and Anjelica Warren. The Engineer’s Guide to Signal
Conditioning. National Instruments, 2015. https://download.ni.com/evaluat
ion/signal_conditioning/20712_Benefits_of_Integrated_SC_WP_HL.pdf
Chapter 3
Navigation Sensors
DOI: 10.1201/9781003377900-3
Many sensors are dedicated to different aspects of the safe and efficient move-
ment of ships through perilous waters and the avoidance of other vessels and
hazards to navigation while en route to their destinations. Biological sensors
provide human vision, hearing, touch, smell and other abilities that have
formed the core ship navigation capabilities for millennia and continue to
do so, with the human brain the ultimate destination for sensor fusion and
interpretation. However, the inventions of mechanical and electronic sensors
have vastly expanded the range, scope, resolution and accuracy of the human
senses to better assist mariners in performing essential pilotage, dead reckon-
ing, celestial and other navigation techniques to locate position, determine
time, steer a proper and safe course and continuously monitor the voyage
through completion. Examples of invention include the ancient astrolabe,
back staff, chip log, leaded line and sextant right up to modern electronic
sensors such as Radar, Sonar, Global Navigation Satellite Systems (GNSS),
autopilots, and weather instruments so vital to planning and executing a voy-
age. This has also led to secondary uses of time-delayed sensor data found in
nautical charts, Electronic Chart Display and Information Systems (ECDIS),
Notices to Mariners and Coast Pilots.1,2,3,4 The data these products contain
have been compiled in the past from many diverse sensor technologies and
multiple sensors of the same kind including single-beam and multi-beam
Sonars, Lidar, aerial and satellite imagery.
Marine navigation has been described as a blend of both science and art,
and that science can be taught but art must be learned from experience.5
Ship sensors illustrate how science can improve situational awareness and
greatly enhance navigation safety. The principles of what ship sensors can
do and how they function have been the basis for many courses of study and
practical demonstrations. They are also the focus of research in machine learn-
ing that attempts to replicate, improve and accelerate human methods for
viewing and interpreting sensor data. However, the art of marine naviga-
tion stems from prudent discernment of sensor limitations considering the
types of vessels on which mariners serve and encounter along their voyages,
and the environment through which they must pass. Such knowledge takes
years to acquire and master. It is this art of navigation that the developers
of deep-learning Artificial Intelligence (AI) technologies are attempting to
master through the fusion of multiple sources of sensor data combined with
vast resources portraying vessel and human behavior.
Sensor technologies continue to adapt to new navigation applications, and
the evolution of advanced sensor capabilities promotes the invention of new
and enhanced methods to navigate. Historical examples include the replace-
ment of the leaded line with Sonar, bells with high-power air and electric
horns, and speed logs and Long-Range Radio Navigation (LORAN) with
GNSS. The use of electronic sensors in vessel navigation is expanding across
all maritime frontiers, whether they advise crewmembers on conventionally
staffed ships or guide and control all navigation functions on autonomous
vehicles.
This chapter proceeds from the perspective of how ship navigation is
achieved with the benefit of sensors rather than focusing solely on the sen-
sors themselves, although details on sensors are indispensable to their proper
understanding. A review of the different types of ship navigation establishes
a historical basis for how these inventions have been and continue to be
adopted. This includes a discussion of methods developed over the centuries
to perform navigation, emphasizing the contributions of sensors toward
extending the human senses to accomplish these tasks. Finally, an overview
is presented of the many sensors commonly found on ships, along with a
preview of future navigation sensors and sensor technologies looming on the
horizon.
Every vessel shall at all times maintain a proper look-out by sight and
hearing as well as by all available means appropriate in the prevailing cir-
cumstances and conditions so as to make a full appraisal of the situation
and of the risk of collision.
In practice, the lookout task has traditionally been performed by one or more
watchstanders posted at all times on the bridge and, weather permitting,
stationed forward in restricted visibility conditions. They are tasked with
reporting on lights, vessels and small craft, landmarks, large floating objects
and marine mammals, indications of shoal waters, hazards to navigation
and fog signals likely to herald the risk of collision. Efforts to further vessel
autonomy find increasing assistance being rendered to bridge watchstanders
by independent electronic bridge aids to monitor sensor data, detect potential
hazards and suggest possible courses of action to be taken. Such aids satisfy
the need for sight and hearing and all appropriate means to fully assess the
situation.
Information obtained from navigation sensors is interconnected using
multiple workstations, providing overlapping and redundant capabilities
through an Integrated Navigation System (INS). Such systems are a con-
venient means for seafarers to prioritize and display navigation informa-
tion in an organized manner and check the validity, consistency, latency and
integrity of the data. The products of these displays are used to assist in
route planning and monitoring, collision avoidance, alarm monitoring and
to enhance overall navigation situational awareness.
Since approximately 1995 navigation sensors have witnessed the devel-
opment and introduction of new instruments and technologies as diverse
as Digital Select Calling (DSC) in Very High Frequency (VHF) and High
Frequency (HF) radio, Navigational Text Messages (NAVTEX), enhanced
Long-Range Radio Navigation (eLORAN), Forward-looking Navigation
Sonar (FLS), the GNSS, Global Maritime Distress Safety System (GMDSS),
AIS and Satellite AIS (S-AIS), digital Radar/ARPA, multibeam Sonar, broad-
band satellite communications, ECDIS and the Internet.7 The rate of improve-
ment in existing sensors, the invention of new sensor technologies and their
augmentation to achieve new functionality through the use of AI continues
to increase at an accelerated pace.
and other military vessels also generally have the same navigation equip-
ment as civilian vessels and often comply with the IMO Safety of Life at Sea
Convention (SOLAS) requirements. It should be noted that military vessels
are not necessarily required to operate in the same manner as civilian vessels
in terms of compliance with COLREGS and SOLAS and often do not regu-
larly utilize equipment such as GMDSS and AIS.
GNSS operates primarily in the L-band (1–2 GHz) and S-band (2–4 GHz)
frequencies. The method used in the operation of GNSS includes the trans-
mission of signals from various satellites in each constellation, with each
signal containing a message with precise satellite location and current time
information. Position is determined through trilateration by measuring the
time it takes for signals to travel from multiple satellites to the ship’s onboard
receiver. Modern GNSS receivers often receive signals independently from
several different GNSS systems to improve reliability in the event problems
exist with one system, and to enhance accuracy.
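To make the pseudorange idea concrete, the following minimal sketch (not from this book) solves for a receiver position and clock bias from four satellite ranges using iterative least squares. The satellite coordinates and measured ranges are invented for illustration, and a real receiver applies many additional corrections for satellite clocks, the atmosphere and multipath.

```python
"""Minimal sketch of GNSS position estimation from pseudoranges (illustrative)."""
import numpy as np

def solve_position(sat_pos, pseudoranges, iterations=10):
    """Iterative least-squares solution for (x, y, z) and receiver clock bias (in meters)."""
    est = np.zeros(4)  # start from the Earth's center with zero clock bias
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_pos - est[:3], axis=1)
        predicted = ranges + est[3]            # pseudorange model: geometric range + bias
        residuals = pseudoranges - predicted
        # Jacobian: unit vectors from receiver toward satellites, plus a clock-bias column
        H = np.hstack([-(sat_pos - est[:3]) / ranges[:, None], np.ones((len(ranges), 1))])
        est += np.linalg.lstsq(H, residuals, rcond=None)[0]
    return est[:3], est[3]

# Hypothetical satellite ECEF positions (m) and simulated pseudoranges (m)
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,    610e3, 18_390e3],
])
truth = np.array([1_113e3, 6_255e3, 1_600e3])  # assumed receiver location
bias_m = 9_000.0                               # assumed clock error, expressed in meters
pr = np.linalg.norm(sats - truth, axis=1) + bias_m

pos, bias = solve_position(sats, pr)
print(pos, bias)  # converges to the assumed position and clock bias
```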
Continuous PNT information is provided directly to the ship’s navigation,
communications and other onboard systems through constant communica-
tion with satellites to enable tracking of vessel movements and determine
velocity and direction. Projection of future positions based on the vessel’s
current trajectory helps to determine adherence to planned routing and to
detect and avoid potential hazards to navigation that may include other ves-
sels, shoals and insufficient depths, and potential conflicts with charted and
sensed objects. GNSS also provides onboard VHF and High Frequency (HF)
radio transmitters and satellite terminals with position data to be communi-
cated in the event of an emergency to help speed rescue efforts, and many sys-
tems offer features to mark waypoints that assist in man-overboard recovery
situations. Capabilities to provide messaging for emergency warning services
are being considered for the Galileo, IRNSS, and QZSS constellations.9 This
includes features for danger notification and subsequent instruction to escape
from disaster situations. Several commercially available communication
devices also locate positions using GNSS and then connect to other satellites
to send and receive messages.10
Many limitations exist in the implementation of GNSS technology that
reduce its usefulness and allow it to be exploited for nefarious purposes by
nation states and individuals. One of its main flaws is that it operates using
relatively weak signals from space that can easily be overridden by spoof-
ing that displays false position information to users and can lead vessels
to stray into shoal waters or directly into the hands of pirates and adver-
saries. These weak signals can also be interfered with through jamming
resulting in denial of service (DOS) attacks that render it useless. GNSS
effectiveness can also be limited due to signal path obstruction between
the satellite and antenna by ship structures, and multipath interference due
to reflected signals from the many metal surfaces on ships that can lead
to errors in position, speed, or velocity calculations. Accuracy can also be
affected by sunspots and space weather events that influence atmospheric
conditions, especially with the ionosphere. It can also be adversely affected
by severe weather conditions such as heavy precipitation, dense cloud cover
and sometimes even fog. Satellite geometry, where satellite configurations
are altered as parts of the constellation go out of service, can also adversely
affect accuracy.
An additional potential limitation of GNSS services has to do with the
ability of national operators to selectively degrade or dither signals to reduce
positioning accuracy to nonmilitary users. This was called “selective avail-
ability” for the United States GPS, where autonomous horizontal position-
ing accuracy was advertised to be no worse than 100 meters 95 percent of
the time. This function was turned off in 2000 and officially discontinued in
2007 after the U.S. DoD demonstrated the ability to selectively deny GPS
signals on a regional basis as needed in a military area of operations when
U.S. national security is threatened.11 Such capabilities are likely retained for
other nation-state controllers of GNSS satellite constellations.
It should be noted that Differential GPS (DGPS) services can be used where
further signal augmentation is necessary to achieve centimeter-level or sub-
meter-level accuracy. With one receiver established at a precisely known loca-
tion as a reference, another roaming receiver calculates its position based
on satellite signals and compares this location to the known location.12 The
difference is applied to the roaming GPS receiver in real-time in the field using
radio signals. Differential techniques may be used with other GNSS systems
as well.
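A minimal sketch of the differential idea described above follows. It applies the reference station's position error directly to a rover fix, whereas operational DGPS corrects individual pseudoranges, so this is a simplified, position-domain illustration with made-up coordinates.

```python
"""Simplified, position-domain illustration of the DGPS correction idea."""

def dgps_correct(reference_known, reference_measured, rover_measured):
    """Apply the reference station's measured position error to the rover's fix."""
    correction = (reference_known[0] - reference_measured[0],
                  reference_known[1] - reference_measured[1])
    return (rover_measured[0] + correction[0],
            rover_measured[1] + correction[1])

# Hypothetical local east/north coordinates in meters
ref_known = (0.0, 0.0)           # surveyed reference antenna location
ref_measured = (1.8, -2.3)       # position the reference receiver computed
rover_measured = (501.5, 248.0)  # uncorrected rover fix

print(dgps_correct(ref_known, ref_measured, rover_measured))
# -> (499.7, 250.3): the error shared by both receivers is largely removed
```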
Over 1,600 lives were at risk, 51 lives were lost, and the Andrea Doria sank.14
Radar system designs have changed over the years making them more
accurate with greater range and resolution, providing more information than
in previous generations, and easier to use and interpret. Also being intro-
duced are tools such as Doppler technology that instantly shows whether a
target is approaching or moving away, and ARPA that can create tracks and
calculate the target course, speed and closest point of approach. New deriva-
tives of Radar technology are also being introduced in the form of Synthetic
Aperture Radar (SAR), millimeter Radar (mmRadar), and Light Imaging
Detection and Ranging (Lidar) that promise new levels of situational aware-
ness for not only traditionally staffed ships but also unmanned and autono-
mous vehicles. The following paragraphs describe some of the key technical
aspects of Radar-based sensors and their use in modern vessel navigation.
Early Radar systems produced high-power continuous wave signals featuring combinations of amplitude modulation (AM), frequency modulation (FM), and pulse-width modulation (PWM) characteristics to detect and locate targets. Distance is measured by determining the time required for the signal to travel from the ship's transmitter to a target and return to the ship, with the amplitude of the reflected signal proportional to distance and variation in frequency used to distinguish between targets. Pulse width, which varies with reflected signal strength, is used to help measure target speed.
The range and resolution of the signal are dependent in part on the trans-
mission frequency, with lower frequencies generally resulting in higher range
and lower resolution than higher frequency transmissions. Radar signals are
highly directional, thereby enabling bearing measurements based on the dir-
ection from which they were transmitted. Signal pulse width helps to deter-
mine resolution, and the higher the frequency at which the pulses repeat,
the better it can discern faster-moving targets. Phase shift and Doppler shift
characteristics detected in the returned signal also help to determine target
speed and direction.
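The basic time-of-flight, resolution and Doppler relationships implied above can be summarized in a few lines of arithmetic. The sketch below is illustrative only; the 9.4 GHz carrier and the sample numbers are assumptions rather than values from the text.

```python
"""Back-of-the-envelope radar relations implied by the preceding paragraphs."""
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(delay_s):
    """Target range: the echo covers the out-and-back path, hence divide by 2."""
    return C * delay_s / 2.0

def radial_speed_from_doppler(doppler_shift_hz, carrier_hz=9.4e9):
    """Radial target speed implied by the Doppler shift of the returned signal."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

def range_resolution(pulse_width_s):
    """Two targets closer than c*tau/2 merge into a single echo."""
    return C * pulse_width_s / 2.0

print(range_from_round_trip(123.4e-6))   # ~18.5 km for a 123.4 microsecond echo delay
print(radial_speed_from_doppler(630.0))  # ~10 m/s (about 19.5 kt) closing speed
print(range_resolution(0.5e-6))          # ~75 m resolution for a 0.5 microsecond pulse
```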
Analog Radar systems were prevalent in early maritime applications and
are generally being phased out in favor of modern digital technology that uses
much less power at reduced radiation levels to achieve greater performance.
The heart of analog and early digital Radar systems is a klystron vacuum
tube used as a signal amplifier. The processing of returning analog signals
reflected off targets is accomplished by converting them into electronic sig-
nals compatible with filters and mixers, whereby various signal characteris-
tics are examined to reveal target distance, bearing and speed. Results are
displayed on an electronic screen using symbols and characteristics appro-
priate for the target.
Digital Radar technology can generate complex signals and transmit mul-
tiple pulse widths where the returned signals are sampled, converted to digital
signals and analyzed using a wide array of digital signal processing techniques
that provide far greater control and manipulation of data than analog tech-
niques. This results in greater accuracy and resolution than analog systems
and new insight not available using analog signal processing to better detect
and classify moving targets and to distinguish them from sea clutter, atmos-
pheric effects and stationary backgrounds. Modern digital Radar systems are
fully solid state, eliminating the need for a klystron vacuum tube amplifier.
They are also less susceptible to interference due to better available filter-
ing techniques. Digital Radar technology can provide features not available
using analog technology, including SAR that can deliver higher spatial target
resolution than conventional digital Radar systems.15,16 Additional function-
ality can also be provided whereby, with each sweep of the Radar antenna,
dual progressive scan transmissions may be transmitted and processed to
simultaneously display two separate Radar ranges that can be manipulated
separately. Additional capability is also available to overlay information
from other systems, such as AIS, onto the Radar display, further enhancing
situational awareness.
Many early marine Radar systems operated in the L-band (500 MHz–1.5
GHz) frequency range that provided a good mix of range and resolution con-
sidering the technological state of the art of the time. Longer wavelengths are
less susceptible to attenuation due to precipitation, and wider beam widths
cover a larger area than progressively shorter wavelengths found at higher
frequencies. As technology progressed, successively higher transmission fre-
quencies were adopted offering greater resolution at the cost of lesser range.
However, the advent of digital Radars helped to extend range as digital signal
processing techniques improved. L-band Radars provided the foundation for
the development of Radar-assisted navigation and collision avoidance tech-
niques currently in existence.
Shorter wavelengths and higher operating frequencies have established
S-band (2–4 GHz) Radars as the workhorse of the maritime industry for
decades. Increasingly lower costs combined with long range on the order of
up to 100 miles, good resolution and a robust ability to penetrate precipita-
tion and other weather events provide a reliable sensor system upon which
regulatory advancements were established to enhance the safety of naviga-
tion worldwide. These are the first systems where automated tools for target
tracking and positioning were introduced.
Today’s state-of-the-art X-band (8–12 GHz) Radar systems operating at
shorter wavelengths and higher frequencies than S-band and L-band systems
provide excellent short-range capability. Combined with very high resolution
and good weather penetration, their use is ideal for waterway and port close-
in maneuvering and for search and rescue. They feature many advanced
capabilities including target tracking, identification and future position pre-
diction made possible through enhanced signal processing and sensor fusion
techniques. The simultaneous use of the combination of S-band and X-band
Radars on the bridge to provide complementary perspectives of short and
long-distance vessel and landmark characteristics can achieve greater situ-
ational awareness than was previously possible. X-band Radar systems can
display Search and Rescue Radio Transponder (SART) positions transmitted
from vessels in distress.
The most advanced short-range Radar systems available today operate
in the K-band (24–27 GHz) and have even higher resolution than S-band
and X-band systems. They are ideal for use in densely populated marine
environments, target identification, port and lock maneuvering and collision
avoidance. Due to the short wavelengths inherent to K-band signals, small
targets, buoys and other AtoN are easily detected and identified.
New inventions that include mmRadar and Lidar are being developed for use
by Unmanned Surface Vehicles (USVs) to enhance precise close-in maneuver-
ing capabilities. These technologies are likely to also be adopted for use in
navigation by conventionally crewed vessels in the future as their utility and
benefits are more fully developed.
Operating primarily in the W-band (75–111 GHz), mmRadar has been
used for a variety of military purposes to provide terminal guidance for
missile targeting as an integral part of aircraft and ground vehicle fire con-
trol systems and to assist in aiming automated guns in ship defenses against
incoming missiles. While recognizing limitations associated with their short
wavelength operating frequencies, there are many opportunities for mmRa-
dar use on civilian ships and vehicles. These include collision avoidance with
other vessels and allision avoidance with objects, especially in port navi-
gation, for real-time tracking of small boats in close proximity, for human
in-the-water search and rescue, and ice navigation. The ELVA-1 mmRadar
system for shipboard use is currently being marketed as capable of elimin-
ating close-in blind spots and having a useful range of between 0 and 600 m.17
Lidar utilizes lasers that operate in the higher range of infrared light fre-
quencies up to 400 THz. Its uses mirror those of mmRadar except with
higher resolution due to the shorter wavelengths involved. Experiments have
shown that ship position and heading angle were estimated using Lidar with
higher accuracy than GPS, QZSS and inertial navigation, with a position estimation accuracy of approximately 1.2 m.18 The Ouster OS2 Lidar system operates at a
frequency of 345 THz (865 nm) and detects 10% reflective targets at 200 m
with a maximum range beyond 400 m.19
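The quoted operating frequency and wavelength are related by lambda = c/f, which is easy to verify; the short sketch below simply checks the figures cited above.

```python
"""Quick check of the frequency/wavelength relationship (lambda = c / f)."""
C = 299_792_458.0  # speed of light, m/s

def wavelength_nm(frequency_hz):
    """Free-space wavelength in nanometers for a given frequency in hertz."""
    return C / frequency_hz * 1e9

print(wavelength_nm(345e12))  # ~869 nm, matching the ~865 nm figure quoted above
print(wavelength_nm(400e12))  # ~750 nm, the edge of visible light in the band table
```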
This is a requirement that Radar shall be used under all visibility conditions,
day and night. The rule continues with,
This part of the rule specifically acknowledges the limitations of Radar that
make it unable to provide a complete or even an adequate representation
of events. All decision-making should consider Radar as only one source of
information that must be verified through other means such as visual contact,
AIS, and bridge-to-bridge exchange of intentions.
The use of a Radar beacon (Racon) to enhance the presence of physical AtoN
is accomplished by adding a transmitter that, when triggered by a Radar
signal, will transmit a coded reply consisting of a series of dots and dashes to
the interrogating X-band Radar. This reply appears on the Radar display in a
line emanating radially from just beyond the echo of the AtoN. Racons may
be used on both laterally significant and non-laterally significant aids alike;
the Racon signal itself is for identification purposes only and therefore carries
no lateral significance. They may also be used as bridge marks to mark the
best point of passage.
3.3.3 Piloting
The term “piloting” refers to the navigation of a vessel through restricted
waters. This involves one or more properly credentialled crewmembers, and
sometimes a pilot knowledgeable of local conditions, present on the bridge
performing the required tasks to achieve a successful transit. Integral to this
process is proficient knowledge of sensor operation, interpreting and applying
sensor data, and the real-time ability to combine and compare clues obtained
through visual monitoring of the physical environment with multiple sensor
readings to achieve and maintain comprehensive situational awareness.
The vast majority of sensors available on the bridge can assist in pilot-
ing navigation. Radar as a significant contributor to effective piloting has
already been discussed in the previous section. Other major contributing
sensor sources include time-delayed data in nautical charts and ECDIS, and
real-time data sources, including different types of Sonars, compass heading
and bearing indicators, two-way audio hailers and even weather instruments.
Of course, the human eyes, ears and other senses combined with seafarer
training, experience and judgment are indispensable to successful piloting
navigation. Unmanned and autonomous vehicles have additional sensors,
supplemented by powerful computers containing machine learning and deep-
learning AI-based firmware and software that attempt to replicate seafarer
presence on board. Despite industry assertions to the contrary, as of 2024,
the practical realization of such technologies for general use remains years away, likely into the next decade, and they are suitable now and in the immediate future only for demonstration projects.
Once underway, the human senses supplemented with binoculars and other
physical and electronic aids such as infrared (IR) and low-light cameras
remain the standard for visual piloting navigation of conventionally staffed
ships. Of specific interest to help accomplish this task are lighthouses, water
towers, piers, terrain and any prominent objects or features displayed on
navigation charts and contained within ECDIS to be used as AtoN that sea-
farers may correlate with visual sightings by taking compass bearings while
en route. This includes critical buoys and lights whose characteristics may
be verified as corresponding to their Light List entries, especially at night.
The positioning of all visual AtoN should be cross-checked using Radar and
GNSS whenever possible.
In the case of unmanned and autonomous vehicles, visible, low-light and
infrared cameras are combined with microphones to help achieve situational
awareness by sight and hearing through 360 degrees. However, seeing and
hearing alone is insufficient without proper knowledge of what is being seen
and heard. Appropriately trained and credentialed mariners remotely con-
trolling unmanned vessels are constrained by and must rely entirely upon
the perspective provided by shipboard sensors regarding the field of view,
image and audio resolution and fidelity, signal path latency and other factors
for decision-making. For example, a mariner having a lifetime of experience
viewing the world through two eyes must adjust to a world where six eyes
are the norm, assuming six cameras with a 60-degree field of view evenly dis-
tributed through 360 degrees, where each camera approximates a mariner’s
normal field of view. Fewer cameras could be used, but not without some
loss of directionality and resolution, along with a distorted view to which the
mariner is unaccustomed.
The Omega, Decca, and Loran C radio navigation systems began in the
1940s and 1950s and were based on using hyperbolic navigation principles
that measured phase differences between different signals to generate lines
of position from which a navigator could determine the position on a chart.
Each system used different frequencies, with Omega operating on the Very
Low Frequency (VLF) portion of the radio spectrum at 10–14 kHz, Decca
operating on Low Frequency (LF) at 70 kHz and Loran-C operating on LF at approximately 100 kHz. Each band of fre-
quencies has its own unique signal propagation characteristics and limita-
tions, with lower frequencies generally better able to cover longer distances.
Omega accuracy was about 1,000 m at ranges of up to 10,000 nautical miles,
Decca accuracy was about 100 meters at ranges of up to 400 nautical miles,
and Loran C had an accuracy of about 1 nautical mile at ranges of up to
1,000 nautical miles (1,800 km).24
eLoran has been considered as a possible backup system for GNSS. eLoran
operates similarly to DGPS in providing differential correction to GNSS in local
areas. It is currently not widely used except for research and testing in the
United States and the United Kingdom, and it is in active use in South Korea.
Testing performed by the General Lighthouse Authorities of the United
Kingdom and Ireland (GLA) with eLoran experienced accuracies of between
8 and 10 m (95%) to seven ports on the east coast of the United Kingdom.25
AIS is used to provide vessel identification and other information for use in
collision avoidance and general maritime domain awareness. Data is commu-
nicated in real time between ships using VHF radio channel 87B (Simplex),
also known as AIS channel 1 (161.975 MHz), and ship to shore for VTS
using VHF channel 88B (Duplex) as AIS channel 2 (162.025 MHz).26,27
Other channels may also be used depending on national and regional regu-
lations. Class A AIS generates output power of 1, 12.5, or 25 watts and is
primarily intended for use by larger vessels of greater than 300 gross tons
(GT) and all passenger vessels. Class B AIS, capable of generating an output
power of 2, 5, or 12 watts, is intended for smaller vessels and recreational
craft use. AIS messages, composed of digital data packets encoded with vessel name, Maritime Mobile Service Identity (MMSI) number, position, speed, course, type of vessel, ports of origin and destination, and other navigational information, are continuously broadcast at intervals of 2–15 seconds depending on vessel speed. AIS messages can also be received by
AIS can be used to augment the presence of a physical AtoN, such as a buoy,
by providing a corresponding electronic presence at the same location on
Radar and ECDIS broadcasting both laterally (e.g., Port Hand Mark) and
non-laterally significant marine safety information (e.g., environmental data,
tidal information and navigation warnings).28 This is accomplished in one of three ways: by fitting an AIS transmitter, broadcasting over VHF radio on AIS channels 1 and 2, directly to the physical AtoN itself; by installing one remotely within line of sight of the physical AtoN (referred to as a synthetic AtoN); or by installing one within line of sight of a location where a physical AtoN should exist but cannot be placed due to inaccessibility, consistently rough weather conditions or other environmental factors (referred to as a virtual AtoN). AIS AtoN can broadcast autonomously and at fixed intervals,
providing the name, position, dimensions, type, characteristics and status
from or concerning an AtoN.
Another type of AIS VHF radio-based AtoN is the Mobile Aid to Navigation
(MAtoN), which is defined as a non-fixed or un-moored Aid to Navigation
(AtoN) and does not include a fixed or moored buoy that is adrift from sta-
tion, temporarily or otherwise.29 MAtoN may exist in physical or virtual
form and can be used for Ocean Data Acquisition System (ODAS) (e.g., to
gather data on currents and weather), wreckage (e.g., containers, debris),
water quality and pollution monitoring equipment, dynamic guard zones and
convoys, underwater operations, enhancing navigational safety during mili-
tary operations (e.g., no-sail zones during minesweeping, target and exercise areas), towed and deployed applications (e.g., cable laying), search & rescue
applications and special events (e.g., swimming competitions). It is specific-
ally noted that MAtoN should not be used for unmanned vessel applications.
Today’s Currents
Speed: 0.34 kts
Direction: 35° (toward NE)
Currents measured at: 22.0 ft below the surface
Plotted data in both standard and metric units over a daily period from 0000 to 2359 is also available as desired.
and is intended to meet present and future user needs through harmonization
of marine navigation systems and supporting shore services. The development
and implementation of the concept is coordinated by the IMO as a global
collaborative effort to enhance maritime safety and efficiency by using elec-
tronic navigation. The strategy for e-Navigation implementation involves the
integration and use of information from various sources, including MF, HF
and VHF radio, GMDSS, Inmarsat, AIS, DGPS, Long-Range Identification
and Tracking (LRIT) and Enhanced Maritime Mobile Service (EMMS).35
Many technologies and services comprising e-Navigation have been dis-
cussed elsewhere in this chapter and in Chapter 7, Shore and Intership
Communications.
References
1 Chip log. Bright Hub Engineering. Albany, NY, USA. www.brighthubengineer
ing.com/seafaring/60582-what-is-a-chip-log/
2 Mariners Astrolabe. Mariners’ Museum & Park. Newport News, VA, USA.
https://exploration.marinersmuseum.org/object/astrolabe/
3 Backstaff. Mariners’ Museum & Park. Newport News, VA, USA. https://expl
oration.marinersmuseum.org/object/back-staff/
4 Using Lead Lines to Collect Hydrographic Data. National Oceanographic and
Atmospheric Administration (NOAA), USA. https://celebrating200years.noaa.
gov/transformations/hydrography/side.html
5 Bowditch. The American Practical Navigator. Chapter 1, The Art and Science
of Navigation. Pg. 1. Defense Mapping Agency Hydrographic/Topographic
Center. Bethesda, MD, USA. Pub. No. 9. 1995 Edition.
6 International Rules for the Prevention of Collisions at Sea, 1972 (COLREGS)
as Codified in the United States by Coast Guard Commandant Instruction
M16672.2D Navigation Rules –International and Inland. Rule 5, Look-out.
25 March 1995.
7 R. Glenn Wright. Innovations in Electronic Communications and Navigation.
Unmanned and Autonomous Ships. Routledge, Taylor & Francis Group.
Table 1.1, p. 7. 2020. ISBN: 978-1-138-12488-6.
8 NIST GNSS. Time and Frequency from A to Z. National Institute of Standards
and Technology (NIST), Physical Measurement Laboratory, Time and
Chapter 4
Cargo Sensors
DOI: 10.1201/9781003377900-4
decks and can carry heavy rolling and non-containerized cargo, breakbulk
and cars.4
General cargo ships or multipurpose vessels are flexible in their use and
can transport a wider variety of goods and cargos including containers,
bulk cargo and Ro-Ro vehicles. There are many ships in this category, and
the size leader by individual ship changes frequently. However, in 2019,
the ten largest operators by deadweight of multipurpose-project-heavy-lift
tonnage deployed a combined fleet of 476 ships with a total deadweight of 8.22 million tons and an aggregate lifting capability of 155,000 tons.5
Other vessels that transport specialized cargo include ships that transport
other smaller ships, refrigerated ships for transporting fish cargo, livestock
carriers, ships that transport large aircraft components such as fuselage sec-
tions, crane ships that operate at sea, open deck ships that transport giant
cranes for port use, ships that recover space rockets after launch and transport them to port, and just about anything else that can be moved from one place to another.
to detect differences in air pressure between the outside and inside of the container hold, where equal pressures may indicate an open hatch and a pressure difference indicates a closed one. Magnetic sensors may also be distributed along the hatch and frame to detect gaps and determine if the hatch is open or ajar.
Ultrasonic leak detectors may also detect changes in airflow or pressure that
may represent air or gas leaks around the hatch covers.
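A minimal sketch of how readings from these hatch sensors might be combined into a single state estimate is given below. The class, thresholds and voting logic are illustrative assumptions rather than the design of any particular hatch monitoring product.

```python
from dataclasses import dataclass

@dataclass
class HatchReadings:
    pressure_outside_kpa: float      # ambient pressure at deck level
    pressure_hold_kpa: float         # pressure inside the container hold
    magnetic_gaps: list[bool]        # True where a magnetic sensor detects a gap
    ultrasonic_leak: bool            # True if an ultrasonic detector senses airflow or leakage

def infer_hatch_state(r: HatchReadings, dp_threshold_kpa: float = 0.05) -> str:
    """Combine pressure, magnetic and ultrasonic indications into a single hatch state."""
    pressure_equalised = abs(r.pressure_outside_kpa - r.pressure_hold_kpa) < dp_threshold_kpa
    any_gap = any(r.magnetic_gaps)
    if pressure_equalised and any_gap:
        return "OPEN"
    if any_gap or r.ultrasonic_leak:
        return "AJAR_OR_LEAKING"     # a closed hatch should show neither gaps nor leaks
    return "CLOSED"

print(infer_hatch_state(HatchReadings(101.3, 101.29, [False, False, True], False)))
```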
Additional sensors are also required in container holds to detect and sup-
press fire.9 These include temperature, smoke, and heat sensors to detect
indications that a fire may be present. Also, sensors are required to monitor
oxygen concentration within the space as part of a carbon dioxide (CO2) fire
suppression system to verify and maintain an inert atmosphere during and
after a system discharge.
The containers collect and transmit data about their location, condition and
cargo, which can be used to optimize and improve the efficiency of container
routing. These data can also enhance security through sensors that detect tampering, unauthorized access, theft of contents and cargo damage due to shock, and promote sustainability of the shipping supply chain by tracking the progress of the shipment and identifying potential delays. This is especially needed given the increasing number of containers lost at sea, the resulting pollution and their widespread dispersal across the seas as large hazards to navigation. Smart shipping containers equipped with sensors, GNSS tracking
and other Internet of Things (IoT) devices are now replacing the simple steel
containers and discrete sensors of yesterday.
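As a rough illustration of the telemetry such a smart container might report over an IoT link, the sketch below defines a hypothetical reading and a simple tamper and shock check. The field names, units and thresholds are assumptions for illustration and do not describe any specific container tracking system.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ContainerTelemetry:
    container_id: str
    timestamp_utc: str
    lat: float
    lon: float
    temperature_c: float
    humidity_pct: float
    door_open: bool
    shock_g: float              # peak acceleration since the last report

def alerts(t: ContainerTelemetry, shock_limit_g: float = 5.0) -> list[str]:
    """Return a list of alert strings for conditions worth flagging ashore."""
    found = []
    if t.door_open:
        found.append("possible unauthorized access: door open in transit")
    if t.shock_g > shock_limit_g:
        found.append(f"possible cargo damage: {t.shock_g:.1f} g shock recorded")
    return found

# Hypothetical reading; the container number is invented for the example
reading = ContainerTelemetry("MSCU1234567", "2024-01-15T06:30:00Z",
                             36.14, -5.35, 18.2, 61.0, False, 7.4)
print(json.dumps(asdict(reading)))   # payload as it might be sent over an IoT link
print(alerts(reading))
```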
4.3.2 Explosion
An explosion took place, and four crew members were injured on board the
120,600-dwt bulk carrier CSSC Cape Town as it entered British Gibraltar
Territorial Waters in the Bay of Gibraltar on 19 February 2021.28 The ship,
built in 2020, was loaded with 112,365 metric tons of coal at the Curtis Bay
Coal Terminal in Baltimore, Maryland, USA. According to the report, the
explosion appeared to be in the area of the vessel’s forecastle, and the cause
of the explosion is not known. No fire was reported.
Coal is classed as a non-dangerous cargo under the Convention on
Facilitation of International Maritime Traffic. However, coal dust suspended
in the air can be highly explosive and very susceptible to spontaneous com-
bustion because it has much more surface area per unit weight than lump
coal. For an explosion to occur, five simultaneous elements must be pre-
sent: fuel, heat, oxygen, suspension and confinement.29 A fire may occur
should fuel, heat, oxygen and confinement conditions exist in proper quan-
tities. However, an explosion could occur with the suspension of burning fuel
as may result from the introduction of a sudden blast of air.
The primary sensors for determining coal dust concentrations are cumber-
some and involve frequent maintenance and replacement of filters. The dust
can easily block and pollute photodiodes of light-scattering dust concentra-
tion optical sensors. New research in electrostatic induction coal dust con-
centration sensors has yielded high detection accuracy for coal dust and may
be suitable for future use in bulk cargo ships.30
The rupture of the styrene monomer tank resulted from a runaway poly-
merisation that was initiated by elevated temperatures caused by heat
transfer from other chemical cargoes. The elevated temperatures caused
the inhibitor, added to prevent the chemical’s polymerisation during the
voyage, to deplete more rapidly than expected. Although the styrene
monomer had not been stowed directly adjacent to heated cargo, the
potential for heat transfer through intermediate tanks was not fully appre-
ciated or assessed. Critical temperature limits had been reached before the
vessel berthed under the road bridge in Ulsan. The tanker’s crew did not
monitor the temperature of the styrene monomer during the voyage, and
therefore were not aware of the increasingly dangerous situation.
This accident indicates a need for one or more smart sensors capable of performing the task the crew was supposed to accomplish: monitoring the temperature of the styrene monomer and the environment in its immediate vicinity, and promptly analyzing and reporting the findings to crewmembers. Several environmental characteristics, such as temperature, trace gases and other phenomena appropriate to styrene monomer, could be monitored simultaneously using multi-sensor elements within one instrument enclosure. In addition to monitoring specific environmental characteristics, integrating machine learning and deep-learning artificial intelligence (AI) technologies into the sensors can discern interrelationships between the characteristics that are not readily detectable through human senses or understanding, by examining data continuously over time to detect evidence of trends and potential pending events. Such a capability would have been appropriate for preventing this accident, since the cargo paperwork called for maintaining the tank temperature below 30°C. The tank reached 50°C three days before the event, 65°C the day before, then rapidly passed 90°C shortly before the explosion.
The report identifies a similar spontaneous heating incident in August 2019
affecting a styrene cargo aboard the tanker Stolt Focus. In this instance, the
crew added seawater to the cargo, then diluted it with benzene and added an
inhibitor to bring the situation under control.
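A minimal sketch of the trend-based alerting such a smart sensor could apply to a cargo-tank temperature log is given below, using the 30°C limit cited in the cargo paperwork and the reported readings. The rate-of-rise threshold is an illustrative assumption rather than a value from any published cargo-monitoring standard.

```python
def tank_temperature_alerts(log: list[tuple[int, float]],
                            limit_c: float = 30.0,
                            max_rise_c_per_day: float = 5.0) -> list[str]:
    """Scan a (day, temperature) log and flag limit breaches and abnormal rates of rise."""
    alerts = []
    for (day, temp) in log:
        if temp > limit_c:
            alerts.append(f"day {day}: {temp:.0f}°C exceeds {limit_c:.0f}°C cargo limit")
    for (d0, t0), (d1, t1) in zip(log, log[1:]):
        rate = (t1 - t0) / max(d1 - d0, 1)
        if rate > max_rise_c_per_day:
            alerts.append(f"day {d1}: temperature rising {rate:.1f}°C/day, possible runaway reaction")
    return alerts

# Temperatures reported for the styrene monomer tank in the days before the explosion
history = [(-3, 50.0), (-1, 65.0), (0, 90.0)]
for a in tank_temperature_alerts(history):
    print(a)
```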
The Marshall Islands-registered company that owns the ship does not own any
other ships, and no trace of insurance could be found. This highlights the
risk of a growing dark fleet of aging vessels transporting sanctioned oil as the
result of purchases of hundreds of old tankers by undisclosed buyers.
The cause of the fire is unknown, but since the ship appears to have been nearly empty, it is quite possible that vapors from the remains of the oil cargo played a part in this accident. However, the conditions leading up to this accident likely relate directly to the use of old ships subject to the most pervasive types of structural problems, corrosion and fatigue cracking, known to be major threats to the structural integrity of aging vessels, especially tanker structures and bulk carriers that operate beyond their design service life.36 While modern sensor technology might have detected concentrations of explosive vapors before the explosion, its installation and use on an old crude oil tanker is neither very probable nor likely to make such ships safer.
• Grande Costa d’Avorio, aboard which, in July 2023, while used vehicle loading operations were being completed in Newark, New Jersey, USA, a fire started on deck number ten and burned for more than six days.41
• Höegh Xiamen, which burned for eight days in June 2020 while in the harbor in Jacksonville, Florida, USA, after a fire caused by an electrical fault from an improperly disconnected battery in a used vehicle.42
• Grande America, which in March 2019 burned and sank in the Bay of Biscay off France after a container that was part of the cargo caught fire.43
The list of names continues with Euroferry Olympia (2022), Felicity Ace
(2022), Eurocargo Trieste (2019), Grande Europa (2019), Golden Ray
(2019: capsized) and several others.
Conventional smoke and heat fire detectors require high threshold lev-
els to trigger a warning and are not sensitive or accurate enough to detect
slight thermal changes. Smoke may be rerouted away from smoke detectors
by onboard extractor fans, making detection harder.42 At present, cargo holds are routinely inspected by crewmembers using predefined walkways between the vehicles to supplement fire detection systems. This is a lengthy manual process and not always effective, especially since fire risks are not always visible to the human eye. Using infrared cameras to detect heat
signatures may help in this endeavor.
One effort currently underway to help resolve these problems is the
Swedish Legislative Assessment for Safety Hazards of Fire and Innovations
in Ro-Ro Ship Environment (Lash Fire). This initiative is intended to inves-
tigate these problems and provide a recognized technical basis for revising
international IMO regulations to greatly enhance fire prevention and ensure
independent management of fires on Ro-Ro ships to meet current and future fire safety challenges.44
• The smoke detectors, although operational, failed to alert the crew of the
existence of a fire immediately as they probably had been silenced for a
short period of time while the vessel was in port.
• Combustible materials in the form of leaked fuel, leaked oil, braided PVC
pipes (to direct the leaks), plastic containers to collect drained oils, oil in
the bilges and the vicinity contributed to the propagation of the fire.
• It is highly likely that the leak in the CO2 system compromised its
effectiveness.
• Evidence indicated that the doors to the fuel oil modules and separator
rooms were open.
• The delay in stopping the port main engine is likely to have contributed
to the fire taking hold.
The decision-making process of the master would have been very complex,
involving at least cues (possibly conflicting), technological data, informa-
tion from fellow crew members, interpretation of that data and a decision
to act, either in one way or another.
This finding supports an argument for using automation and smart sensors
to aid decision-making.
This led to his incorrect determination of the vessel’s stability and resulted
in Golden Ray having an insufficient righting arm to counteract the forces
developed during a turn. The ship caught fire and was ultimately scrapped
in place at the accident site. The NTSB also found that two watertight doors
had been left open for almost 2 hours before the accident and that no one on
the bridge ensured the doors were closed before departure. Nothing appeared
amiss until the ship began to heel rapidly to port during a 68-degree turn to
starboard.
Although human error was the cause of this accident, there were no backup procedures in place to identify and correct the error. Smart sensors trained in the behavior patterns of the vessel may have detected subtle differences between the actual vessel weight, mass, momentum and acceleration forces and their nominal values before the turn was initiated, but maybe not. Indeed, had simple sensors been in place to detect the open doors, the vessel may not have flooded nearly as rapidly.
References
1 Zahra Ahmed, Top 20 World’s Largest Container Ships in 2023. Marine
Insight. 11 April 2023. www.marineinsight.com/know-more/top-10-worlds-
largest-container-ships-in-2019/
2 Rishab Joshi. Top 5 Biggest Bulk Carriers in the World. Marine Insight. 6 June
2022. www.marineinsight.com/types-of-ships/biggest-bulk-carriers/
3 TMP Staff. Top 10 Largest Tanker Ships in the World. The Maritime Post.
3 February 2022. https://themaritimepost.com/2022/02/top-10-largest-tan
ker-ships-in-the-world/
4 Saurabh Sinha. Top 10 Biggest Ro-Ro Ships in the World. 14 March 2022. www.
marineinsight.com/types-of-ships/top-10-biggest-roro-ships-in-the-world/
5 Breakbulk’s Top Ten. The Maritime Executive. 10 April 2019. www.maritime-
executive.com/article/breakbulk-s-top-ten
6 ECIB. ONE Apus Container Loss: Then and Now. Expeditors Cargo Insurance
Brokers 28 January 2021. https://insider.ecibglobal.com/blogs/one-apus-contai
ner-loss-then-and-now
22 Paul Bartlett. Cargo Liquefaction Greatest Cause of Deaths in Dry Bulk Sector.
31 July 2023. www.seatrade-maritime.com/dry-bulk/cargo-liquefaction-great
est-cause-deaths-dry-bulk-sector
23 Susan Gourvenec. The Cargo Ships That ‘Liquefy’. BBC. 16 September 2018.
www.bbc.com/future/article/20180905-the-cargo-ships-that-liquefy
24 North P&I Club. The Dangers of Cargo Liquefaction in a Nutshell. As
described in ShipNerd. 30 May 2022. www.shipnerdnews.com/the-dangers-of-
cargo-liquefaction-in-a-nutshell/
25 IMO: Bauxite Liquefaction Sank Bulk Jupiter. Offshore Energy Today. 21
September 2015. www.offshore-energy.biz/imo-bauxite-liquefaction-sank-
bulk-jupiter/
26 Gard P&I Club. Cargo Liquefaction. 27 September 2023. www.gard.no/web/
topics/article/20651747/cargo-liquefaction
27 Mike Schuler. Gard Warns of Liquefaction Risks with Unlisted Cargoes. gCap-
tain. 13 July 2023. https://gcaptain.com/gard-warns-of-liquefaction-risks-of-
unlisted-cargoes/
28 TMP Staff. Explosion Aboard Chinese Bulker CSSC Cape Town in Gibraltar
Bay Leaves Four Injured. The Maritime Post. 22 February 2021. https://them
aritimepost.com/2021/02/explosion-aboard-chinese-bulker-cssc-cape-town-in-
gibraltar-bay-leaves-four-injured/
29 Clete R. Stephan, P.E. Coal Dust Explosion Hazards. Mine Safety and Health
Administration. Pittsburgh, PA, USA. https://ncsp.tamu.edu/reports/MSHA/
coaldust.pdf
30 Jiange Chen, Dewen Li, Guoqing Liu, Yanzhu Li, Anran Zhang, Siyuan Lu,
and Mi Zhou. Development of a Coal Dust Concentration Sensor Based on
the Electrostatic Induction Method. American Chemical Society. ACS Omega
2023, 8, 14, 13059–13067. 28 March 2023. https://doi.org/10.1021/acsom
ega.3c00319
31 Navy Releases Extensive Bonhomme Richard Fire Report, Major Fires Review.
Vice Chief of Naval Operations Public Affairs. 20 October 2021. www.navy.
mil/Press-Office/News-Stories/Article/2816283/navy-releases-extensive-bonho
mme-richard-fire-report-major-fires-review/
32 USNI. Navy Investigation into USS Bonhomme Richard Fire, Major Fires
Review. US Naval Institute. 20 October 2021. https://news.usni.org/2021/10/
20/navy-investigation-into-uss-bonhomme-richard-fire-major-fires-review
33 UK Marine Accident Investigation Branch (MAIB) inquiry into the 2019 blast
aboard the chemical tanker Stolt Groenland. Report 9/2021. Southampton.
Synopsis. July 2021. https://assets.publishing.service.gov.uk/media/60f93e2cd
3bf7f044c51590b/2021-09-StoltGroenland-Report.pdf
34 Heather Chen, Irene Nasser and Teele Rebane. Oil Tanker Catches Fire Off
Malaysian Coast, Three Crew Missing. CNN. Mon, 1 May 2023. www.cnn.
com/2023/05/01/asia/malaysia-coast-oil-tanker-fire-rescue-intl-hnk/index.html
35 Alex Longley and Yongchang Chin. An Oil Tanker Ablaze in the South China
Sea Is a Global Problem. Bloomberg. 6 May 2023. www.bloomberg.com/news/
articles/2023-05-07/an-oil-tanker-ablaze-in-the-south-china-sea-is-a-global-
problem
36 Unyime O. Akpan, T.S. Koko, B. Ayyub and T.E. Dunbar. (2002) Risk assess-
ment of aging ship hull structures in the presence of corrosion and fatigue.
Marine Structures, 15(3), 211–231. ISSN 0951-8339. https://doi.org/10.1016/
S0951-8339(01)00030-2
37 TBS. Explosion on Oil Tanker in Jhalakathi Leaves Five Burnt, Four Missing.
The Business Standard. 1 July 2023. www.tbsnews.net/bangladesh/oil-tanker-
catches-fire-after-explosion-jhalakathi-5-burnt-4-missing-658470
38 Marine Insight. Oil Tanker Explodes in Thai Waters. Marine Insight News
Network. 18 January 2023. www.marineinsight.com/videos/video-oil-tanker-
explodes-in-thai-waters-blows-up-workers-leg-500-m-away/
39 Franz Evegren. Fire in Vehicles Onboard Ships. RISE Fire Safe Transport.
Research Institutes of Sweden. www.ri.se/sites/default/files/2020-12/FIVE_Fire
in vehicles onboard ships_Evegren_WEB_201215.pdf
40 Safetytech. Startup Deploys Wireless Sensors across Ship’s Cargo Hold to
Predict Fire. Safetytech Accelerator, London, UK. 3 March 2020. https://safety
techaccelerator.org/case-studies/safetytech-startup-deploys-wireless-sensors-acr
oss-ships-cargo-hold-to-predict-fire/
41 Mikhail Voytenko. Grimaldi’s Ro-Ro Major Fire at Newark, 2 Firefighters
Died. FleetMon. 6 July 2023. www.fleetmon.com/maritime-news/2023/42341/
grimaldis-ro-ro-major-fire-newark-2-firefighters-d/
42 NTSB. Failure to Properly Disconnect and Secure Vehicle Batteries Led to Fire
Aboard Vehicle Carrier Höegh Xiamen. US National Transportation Safety
Board. 16 December 2021. www.ntsb.gov/news/press-releases/Pages/mr20211
216.aspx
43 FreightWaves Staff. Grimaldi Confirm Grande America Fire Started in
Container Cargo. FreightWaves. www.freightwaves.com/news/maritime/grima
ldi-confirms-grande-america-fire-started-in-cargo-container-ezkx7-g33nr-
gn3cf
44 Legislative Assessment for Safety Hazards of Fire and Innovations in Ro-Ro
Ship Environment (Lash Fire). www.lashfire.eu
45 Chris Teague. How Much Should You Worry About EV Fires? Autoweek
News. October 2022. www.autoweek.com/news/a38225037/how-much-you-
should-worry-about-ev-fires/
46 Traffic Safety Facts Annual Report Tables. National Highway Traffic Safety
Administration. U.S. Department of Transportation. https://cdan.dot.gov/
tsftables/tsfar.htm#
47 NTSB. Safety Risks to Emergency Responders from Lithium-Ion Battery Fires
in Electric Vehicles. U.S. National Transportation Safety Board. www.ntsb.gov/
safety/safety-studies/Pages/HWY19SP002.aspx
48 National Transportation Safety Board. Fire aboard Tank Vessel S-Trust, Report
MIR-23-23. 25 October 2023. www.ntsb.gov/investigations/AccidentReports/
Reports/MIR2323.pdf
49 J. Wikman, F. Evegren, M. Rahm, J. Leroux, A. Bruillard., M. Kjellberg, L.
Gustin, and F. Efraimsson. FIRESAFE: Study Investigating Cost Effective
Measures for Reducing the Risk from Fires on Ro- Ro Passenger Ships.
Lisbon: European Maritime Safety Agency. 2016.
50 Safety Investigation into the Engine-Room Fire on Board the Maltese Registered
Ro-Ro Cargo Vessel Eurocargo Trieste. Transport Malta. Marine Safety
Investigation Unit. Marine Safety Investigation Report No. 21/2020. Final.
www.iims.org.uk/wp-content/uploads/2020/11/Transport-Malta-MV-Euroca
rgo-Trieste-Safety-Investigation-Report.pdf
51 NTSB. NTSB Determines Inaccurate Stability Calculations Caused Capsizing
of Vehicle Carrier Golden Ray. U.S. National Transportation Safety Board. 14
September 2021. www.ntsb.gov/news/press-releases/Pages/NR20210914b.aspx
Chapter 5
Crewmember and Passenger Sensors
Probably one of the most significant areas in which advances in sensor technology have taken place results from increasing interest by the shipping industry in instrumenting crew and passenger spaces with sensors that not only identify personnel but also track and monitor their movements and well-being while in port and throughout their voyages. This is being
accomplished due in part to digitalization in maritime shipping through Port
Community Systems (PCSs) and other means to connect different govern-
ment, public and private organizations for the secure and intelligent exchange
of information.
DOI: 10.1201/9781003377900-5
of its most important uses is the monitoring of crowded areas like swimming
pool decks, dining halls, theaters, or entertainment venues to observe crowd
density, detect potential issues before they unfold or help stem their effects,
and ensure the smooth flow of passengers, especially during emergencies or
evacuation. Many such cameras also have embedded microphones capable of
recording human conversations and other sounds in the immediate vicinity
and allowing two-way conversations to be carried on through the camera.
Introducing smart video cameras into the ship environment with motion
detection and embedded intelligence through machine learning and deep-
learning AI provides even more capabilities than traditional video cameras.4
These capabilities include processing images using facial and object recog-
nition techniques to identify specific people, objects and human activities
such as walking, talking and running. Video cameras incorporating thermal
imaging capabilities can also distinguish healthy crewmembers and passen-
gers from those exhibiting fevers and possibly suffering from communicable
diseases. Video footage may also be stored in the Cloud, allowing company
officials and other users to access it from anywhere.
immediate alarm can be raised identifying the intruder and location at which
the intrusion took place so that corrective action may be taken. Further use
can be made to track movement and activity around a ship to identify loca-
tions visited and the times visits occurred, which is especially useful for deter-
mining the popularity of cruise ship venues and for tracking an individual’s
movements while investigating missing person reports on ships.
The accuracy of facial recognition has improved dramatically in recent
years. According to data from the U.S. National Institute of Standards and
Technology (NIST), the top 150 algorithms are over 99% accurate across
black male, white male, black female and white female demographics.7 For
the top 20 algorithms, the accuracy of the highest-performing demographic
versus the lowest varies between 99.7% and 99.8%. Unexpectedly, white
male is the lowest-performing of the four demographic groups for the top 20
algorithms. For 17 of these algorithms, accuracies for the white female, black male and black female demographics are nearly identical at 99.8%, while they are least accurate for the white male demographic at 99.7%. Detailed statistics for
other demographics were not easily available. Facial recognition may be sty-
mied by face masks and plastic surgery. However, this technology appears
resilient and can still discern unique patterns sufficient to positively identify
people using partial patterns.
the information they contain, but this vulnerability can be reduced through
data encryption. They can also be more expensive than traditional ID cards.
RFID operating frequencies cover three different bands: Low Frequency (LF), ranging from 30 to 300 kHz; High Frequency (HF), ranging from 3 to 30
MHz, with 13.56 MHz as the dominant frequency; and Ultrahigh Frequency
(UHF), ranging from 300 MHz to 3 GHz.9 The transmission range for pas-
sive devices is typically one to five meters or more. Actual ranges will vary
depending on the specific tag, reader and environment.
The cost of passive RFID technology continues to decrease, and their
utility is increasing with greater memory capacity and two-way read/write
capabilities, tending more toward multisurface UHF systems. One study of the proper application of this technology presented test results for various multisurface UHF tags from different manufacturers, assessing their readability under varying conditions such as orientation with respect to the reader, distance from the reader and the materials used for embedding them.10 Of ten different products tested, none scored 90% or higher for either readability or precision, so there is ample opportunity to continue enhancing this technology. Antenna orientation and distance to the reader appear to be the greatest contributors to error and inaccuracy. However, some tags placed well on every metric given.
During the period between 2010 and 2020, many enhancements to RFID technology took place to improve its performance and accuracy.11 These include increased reader sensitivity and read rate, increased transmit power to enhance range, improved processing power with some readers having built-in microprocessors, better antenna designs and additional connectivity options for transferring data.
5.7 Fingerprints
Fingerprints are also used to verify identity because they are unique to each
individual, and this method is one of the earliest forms of biometric sens-
ing for identifying people. No two people have the same fingerprints, not
even identical twins. This makes fingerprints a very reliable way to identify
someone.
Multiple types of sensors may be used for fingerprint identification,
including optical, capacitive and ultrasonic scanners. Once scanned, a finger-
print is converted into a digital mathematical model representing a unique
identifier for that individual and then stored in a database. Ultrasonic sensors
create a 3D fingerprint image that is more secure and accurate than trad-
itional capacitive fingerprint readers. Fingerprint identification uses pattern
recognition techniques similar to facial recognition and eye scanning.
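A highly simplified sketch of the matching step, comparing a scanned print’s feature vector against enrolled templates in a database, is given below. Real systems extract minutiae-based features and use far more sophisticated matchers; the feature vectors, similarity measure and threshold shown are illustrative assumptions only.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two fingerprint feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify(scan: list[float], database: dict[str, list[float]],
             threshold: float = 0.95) -> str | None:
    """Return the enrolled identity whose template best matches the scan, if any."""
    best_id, best_score = None, 0.0
    for person_id, template in database.items():
        score = cosine_similarity(scan, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# Hypothetical enrolled templates and a new scan
db = {"crew-0042": [0.12, 0.80, 0.35, 0.44], "crew-0077": [0.90, 0.10, 0.60, 0.05]}
print(identify([0.11, 0.82, 0.33, 0.45], db))   # expected: crew-0042
```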
The U.S. National Institute of Standards and Technology (NIST) has
evaluated several commercially available contactless fingerprint scanning
technologies, allowing users to compare their performance to conventional
devices that require physical contact between a person’s fingers and the scan-
ner.16,17 The study results show devices requiring physical contact remain
superior to contactless technology at matching scanned prints to images in a
database. However, when contactless devices scan multiple fingers on a hand, their performance improves. Contactless devices that scanned multiple fingers also seldom made “false positive” errors that incorrectly matched one person’s print with another’s record.
Other factors that may make fingerprint matching less reliable include dam-
age to the finger caused by cuts, burns, or other injuries that have occurred
since the baseline database model was established, which may reduce or
invalidate potential matches. This may also occur should dirt, grease, or other materials obscure the fingerprint; if it is smudged, blurry, or distorted in any way; or if it is combined with or overlaps fingerprints from other individuals.
increase in pressure that occurs when a person falls. Also included are inertial
measurement units (IMUs) that use sensor fusion by combining measure-
ments made by accelerometers, gyroscopes and other sensors to determine
the overall motion of a moving object with behavioral characteristics that
may indicate a slip or fall has occurred.
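A minimal sketch of the accelerometer portion of such fall detection is given below: a fall typically appears as a brief near-free-fall interval followed by a large impact spike in total acceleration. The thresholds and look-ahead window are illustrative assumptions, not values taken from any particular wearable or IMU product.

```python
import math

def total_g(sample: tuple[float, float, float]) -> float:
    """Magnitude of a 3-axis accelerometer sample, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples: list[tuple[float, float, float]],
                free_fall_g: float = 0.4, impact_g: float = 2.5) -> bool:
    """Flag a fall when a near-free-fall reading is followed shortly by an impact spike."""
    mags = [total_g(s) for s in samples]
    for i, m in enumerate(mags):
        if m < free_fall_g:                              # body briefly in free fall
            window = mags[i + 1:i + 20]                  # look ahead a short window
            if any(v > impact_g for v in window):        # followed by an impact
                return True
    return False

# Standing still (about 1 g), a brief drop (about 0.1 g), then an impact (about 3.2 g)
trace = [(0.0, 0.0, 1.0)] * 5 + [(0.0, 0.1, 0.05)] * 3 + [(0.5, 3.0, 1.0)] + [(0.0, 0.0, 1.0)] * 5
print(detect_fall(trace))   # expected: True
```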
References
1 Molly Bohannon. The ‘Icon of the Seas’ Will Soon Be the World’s Largest
Cruise Ship. Forbes Magazine. 12 July 2023. www.forbes.com/sites/mollyb
ohannon/2023/07/12/the-icon-of-the-seas-will-soon-be-the-worlds-largest-cru
ise-ship---heres-how-much-it-costs-to-get-aboard/
2 The use of CCTV Cameras on Ships. Monarch Group. 27 November 2019.
www.monarchglobal.net/post/the-use-of-cctv-cameras-on-ships
3 Staff. More Shipowners Use Video Cameras for Security, Safety and Operations.
Professional Mariner. 27 May 2016. https://professionalmariner.com/more-shi
powners-use-video-cameras-for-security-safety-and-operations/
4 How Do Smart Cameras Work? Technology Org. 18 July 2019. www.technol
ogy.org/2019/07/18/how-do-smart-cameras-work/
5 What Is Facial Recognition? How Facial Recognition Works. Internet of
Things. Norton. 20 August 2021. https://us.norton.com/blog/iot/how-facial-
recognition-software-works
6 What Is Facial Recognition –The 2023 Ultimate Guide for Facial Recognition
Technology Guides & Tips. FaceMe. 14 March 2023. www.cyberlink.com/fac
eme/insights/articles/204/Facial-Recognition-at-the-Edge-The-Ultimate-Guide
7 Patrick Grother, Mei Ngan, Kayee Hanaoka, Joyce C. Yang, and Austin Hom.
Ongoing Face Recognition Vendor Test (FRVT). Information Technology
Laboratory, National Institute of Standards and Technology (NIST), US
Department of Commerce. 16 June 2023. www.nist.gov/programs-projects/
face-recognition-vendor-test-frvt-ongoing
8 The Use of RFID for Human Identification. Draft Report. U.S. Department
of Homeland Security (DHS), Emerging Applications and Technology
Subcommittee. Version 1.0. www.dhs.gov/xlibrary/assets/privacy/privacy_ad
vcom_rpt_rfid_draft.pdf
9 ACD. Understanding RFID and RFID Operating Ranges. Advanced Controls
and Distribution. 27 March 2017. https://blog.acdist.com/understanding-rfid-
and-rfid-operating-ranges
10 Aldo Minardo, Joshua Bolton, Erick Jones, Raghavendra Kumar Punugu,
Ankan Addy, and Samuel Okate. Performance and benchmarking of multisur-
face UHF RFID tags for readability and reliability. Journal of Sensors. Hindawi.
05 September 2017. https://doi.org/10.1155/2017/3467593
11 Suzanne Smiley. RFID Failed You in the Past? It May Have Improved
More Than You Think. 19 May 2020. www.atlasrfidstore.com/rfid-insider/
why-try-rfid-again/
12 Alexa Saul. Retinal Security Scans: How Accurate Are They? Arizona Retina
Project. 31 October 2018. https://azretina.sites.arizona.edu/index.php/
node/379
13 Mary Clark. Iris Recognition Scanners vs. Fingerprint Scanners: Compare and
Contrast. Bayometric. www.bayometric.com/iris-recognition-scanners-vs-fing
erprint-scanners/
14 What Are Iris and Retina Scanners, and How Do They Work? RecFaces. https://
recfaces.com/articles/iris-scanner
15 David Turbert. What Parts of the Eye Can Be Transplanted? 13 January 2022.
American Academy of Ophthalmology. www.aao.org/eye-health/treatme
nts/transplantation-eye
16 NIST Study Measures Performance Accuracy of Contactless Fingerprinting
Tech. 19 May 2020. www.nist.gov/news-events/news/2020/05/nist-study-
measures-performance-accuracy-contactless-fingerprinting-tech
17 John Libert, John Grantham, Bruce Bandini, Kenneth Ko, Shahram Orandi,
and Craig Watson. Interoperability Assessment 2019: Contactless-to-Contact
Fingerprint Capture. NISTIR 8307. National Institute of Standards and
Technology. https://doi.org/10.6028/NIST.IR.8307
18 Maritime Slip and Fall/Trip and Fall Injuries. Maritime Injury Guide. www.
maritimeinjuryguide.org/maritime-accidents-injuries/maritime-bodily-injuries/
slip-fall-trip-fall/
19 Facts About Falls. U.S. Centers for Disease Control. www.cdc.gov/falls/
facts.html
20 Fox Morgan. Best Personal Locator Beacons and AIS Units. Yachting World
Magazine. 18 July 2023. www.yachtingworld.com/yachts-and-gear/best-perso
nal-locator-beacons-and-ais-units-top-options-for-boating-137237
21 U.S. Cruise Vessel Security and Safety Act of 2010. Public Law 111–207. 27
July 2010, 124 Stat. 2243. 111th Congress. www.congress.gov/111/plaws/publ
207/PLAW-111publ207.pdf
22 ISO 21195:2020. Ships and Marine Technology –Systems for the Detection
of Persons While Going Overboard from Ships (Man Overboard Detection).
www.iso.org/standard/76051.html
23 Elliot Gardner. Setting International Standards for Man Overboard Systems.
Ship Technology. 27 May 2018. www.ship-technology.com/analysis/setting-
international-standards-man-overboard-systems/
Chapter 6
Artificial Intelligence in Sensor Systems
DOI: 10.1201/9781003377900-6
Throughout the preceding chapters the use of artificial intelligence (AI) has featured prominently in discussions on smart sensors and sensor data analytics to detect objects, trends and events of human interest with respect to the roles in which seafarers are engaged. This includes the jobs performed by
members of the engineering and deck departments in operating and guiding
the ship to ensure cargo is loaded, transported and delivered to its destination
safely and efficiently; and the hospitality department to safeguard the com-
fort and well-being of passengers. Critical to this concept is how the various
systems and sensors of the ship itself can aid and enhance seafarer situational
awareness to improve job and ship safety, performance and effectiveness. AI
can provide significant advantages towards achieving this goal, but the tech-
nology is not without controversy.
Some argue that AI is the panacea to overcoming human frailties by even-
tually replacing highly trained and experienced seafarers with automatons
that perform their jobs flawlessly, never get bored or need to take breaks or
sleep, can work around the clock and do not need to be paid.1,2,3 Supporting
this position are statistics that human error is estimated to be responsible for
between 76% and 94% of marine casualties.4,5 Others think that AI brings
the risk of humans losing control to the automaton within the machinery it
operates; that external forces can use AI to infiltrate, manipulate, disable and
even hijack shipboard systems; and that AI can become superintelligent and
capable of malicious behavior to cause harm to those onboard and imperil
the ship.6,7 The benefits of using AI to alert engineering and navigation
watchstanders to the development of significant events, conditions, trends
and situations can and have been repeatedly demonstrated in enhancing effi-
cient ship routing and fuel savings, collision avoidance, reducing workload,
predicting failures and maintenance actions, and detecting objects and people
in the water.8,9 However, the lack of a clear understanding as to exactly how
AI systems (especially deep-learning AI) achieve the conclusions they reach,
inherent biases in the methods and datasets used to train them, the lack of
metrics to verify their development processes and validate their performance,
and the uncanny ability of hackers to penetrate all levels of security precautions lend credence to doomsayer theories.
Many definitions of the term “artificial intelligence” refer to the ability of
a machine to simulate, replicate, or even improve upon intelligent behavior
exhibited by various forms of life that range from fruit flies to human beings.
Since we often fail miserably in discerning how human beings or even insects
think and reach conclusions, the best we can do is to compare the results
achieved by an AI-based solution against human results for a given, spe-
cific and well-defined task. Apparently, AI can be very good at what it does.
Spectacular results have been achieved for many years now by AI gaming
applications in outwitting human opponents. ChatGPT has successfully
passed academic and qualification examinations for a variety of occupations
that include medicine, law, accounting and sommelier.10,11 Internet searches
thus far have not yet revealed published records on the success of AI in pass-
ing merchant mariner examinations.
Despite the successes of AI cited in the previous paragraph, there is no assur-
ance that such achievements can be translated from an academic environment
directly into practical, reliable and widespread use in the real maritime world
in much the same manner that a fresh maritime academy graduate is not
yet qualified nor sufficiently experienced to stand watch let alone command
a ship. However, much research and experimentation have advanced ship
autonomy and navigating unmanned voyages along predetermined routes.
As applied to ship sensors, there is no well-defined general methodology to
ensure AI-based software has been developed correctly or will properly and
adequately perform its intended function. Each situation is evaluated on its
own merits based on outcomes rather than general and repeatable scientific
methods. The question is, is that enough?
This chapter sheds light on how AI may be utilized in sensor-based ship-
board systems, what constitutes AI and how these technologies are developed.
Benefits of their use and precautions that should be taken when developing,
installing and implementing these systems are also discussed. Finally, asser-
tions are made on how these systems may affect seafarers in helping to per-
form their jobs and enhance safety, how crewmember jobs may change or
be eliminated and how seafarer training practices may better adapt to future
situations.
are not necessarily perceived nor detected by humans which can help pre-
dict equipment failures. A traditional sensor samples the physical environ-
ment and then passes the measurement to a system or operator for analysis
and action. However, a smart sensor samples the physical environment and,
upon detecting specific inputs, uses built-in computer resources to perform
predefined functions and then process and analyze the data before passing
the data and analytical results to a user or an automated process.12 The
results may alert appropriate crewmembers in sufficient time for them to
initiate recommended and appropriate actions, and some can even perform
diagnostics and restorative actions on their own. The aggregation of smart
sensor data from multiple systems can greatly contribute toward improving
overall vessel efficiency and performance. Such capabilities have changed
the face of engineering departments such that most engine rooms and many
related functions are completely automated. In-person attendance is gen-
erally required only on a sporadic basis and when alarms and exceptional
situations occur.
New generations of smart sensors can make measurements with greater
precision and at higher resolutions. They are fortified with embedded signal
processing capabilities that greatly reduce noise and signal loss, with micro-
processors that analyze data characteristics at the sensor itself and then
communicate their results and enable adjustments to be made to improve
the efficiency and performance of the monitored systems. Sensors used in
AI-based navigation systems can analyze historical and real-time data that
include weather conditions, sea currents and traffic patterns to help optimize
ship routes. The fusion of multiple sensor data such as Radar, Sonar, visible
light and infrared cameras can help detect, identify and track close-by objects
in the ship’s vicinity that pose hazards, including other vessels, off-station
buoys or floating debris for early detection and identification of potential
collisions or obstacles allowing the ship’s crew to take appropriate actions.
Shipboard sensors and sensors contained within shipping containers can iden-
tify sea states that increase their chances of being lost overboard, and sensors
contained within bulk cargo can help determine whether cargo liquefaction
is likely to occur that may destabilize ships. On the nefarious side, smart watches containing smart sensors have been delivered, unsolicited and unexpectedly, to many U.S. sailors and servicemen (and possibly merchant mariners); when turned on, they auto-connect to Wi-Fi and cell phones and, without being prompted, gain access to user data, cameras, microphones and tracking information while simultaneously inserting malware into these devices.13
Benefits of smart sensors include the encapsulation, entirely within the sensor itself, of infrastructure components that previously had to be installed as discrete items throughout the ship, reducing sensor data transmission path distances from meters across the ship to mere millimeters. Shorter signal path distances result in faster data propagation rates, decreased data processing times and reduced chances of injecting unwanted noise and interference into the data, especially when the entire process occurs within the shielded sensor enclosure. Accompanying the sensor are all required signal processing elements, combined with the computer-based sensor analytics and communications hardware needed to transfer sensor-derived information and analytical results, not just the raw sensor data, throughout the vessel and even worldwide to remote monitoring and control facilities. Further enhancing smart sensor systems are capabilities for self-calibration and integrated diagnostics that lower maintenance costs and improve system reliability by reducing cabling and connector requirements and simplifying remove-and-repair actions to the replacement of individual modules. Additional efficiencies are achieved by taking advantage
of distributed computing at the sensor level, enabling rapid adjustments to
processes and adapting to changing conditions as they occur. Sensor mini-
aturization also provides opportunities for including multi-sensor elements
within one enclosure, enabling the measurement of several phenomena with
one sensor assembly rather than multiple assemblies.
The greatest benefit of smart sensors is probably achieved by taking full
advantage of modern microprocessor computing power to examine sen-
sor data in new ways that provide a greater understanding of the phenom-
enon being measured. This includes the integration of machine learning and
deep-learning AI technologies into the sensors themselves that can discern
interrelationships between flows, levels, temperatures, and other meas-
ured characteristics that are not readily detectable through human senses
or understanding. This can be achieved using simultaneous sensor measure-
ments across different sensors, by examining data from one sensor over peri-
ods of time to detect evidence of trends and events contained in the data, and
with measurements from multiple sensors over time to discern even broader
relationships. Having computing power distributed among multiple sensors
can effectively achieve at minimal expense what in previous times could only
be accomplished using supercomputers at an extraordinary cost. The fol-
lowing paragraphs provide a discussion of what machine learning and deep-
learning AI technologies measure and how AI is integrated into sensors and
sensor systems.
Figure 6.1 Radar Image in Pixels, with Associated Radar Chirp Waveform Variation over Time.
Source: R. Glenn Wright. Intelligent Autonomous Ship Navigation using Multi-Sensor Modalities. 12th International Conference on Marine Navigation and Safety of Sea Transportation. June 2017. Gdynia, Poland.
that exists in various forms of imagery), frequency (data exist within and
across the frequency spectrum) and time (data that change over time) domains
defy description in natural language and where numerical and/or algorithmic
analysis is burdensome and falls short of comprehensive and complete por-
trayal of sensor target behavior and characteristics.
Many, even hundreds, of viable neural network architectures may be used
for sensor data processing and data analytics, each dedicated to solving spe-
cific types of problems and for use in specific applications. One of the more
commonly used forms is the Convolutional Neural Network (CNN), capable
of detecting numerical data, signal, and imagery features and characteristics
needed to identify particular classes, types and variations that may reflect
individual objects, trends and events. When using the supervised learning
method of training, the relevant features present within the data or imagery
are manually extracted and applied in the CNN training process. The advan-
tage of directed learning lies in the reduced time and number of datasets and
images needed for training.
Deep-learning AI is also used to train a CNN to detect sensor signal fea-
tures and characteristics to identify trends, objects and events. In contrast to
selecting specific signal and image features and characteristics of interest as is
accomplished through machine learning, with deep learning, the raw images
are fed directly into a deep neural network where the features are learned
automatically. This method generally requires hundreds of thousands to mil-
lions of images to achieve the best results, depending on the complexity of the
imagery. The advantage of deep learning is in the ability of a CNN to detect
hidden relationships present within imagery that may not be readily discern-
ible to someone and prevent their use of the directed learning method. Both
methods of neural network training can complement statistical methods for
analyzing analog, digital and complex sensor signals and detecting, identi-
fying and categorizing their features, characteristics and attributes.
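As a concrete illustration of the kind of CNN that might classify sensor imagery, for example small radar image patches, into object classes, a minimal PyTorch sketch follows. The layer sizes, image dimensions and class count are arbitrary assumptions for illustration and do not reproduce any model from the cited research.

```python
import torch
import torch.nn as nn

class SensorImageCNN(nn.Module):
    """Tiny CNN that classifies single-channel 64x64 sensor images into 4 classes."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = SensorImageCNN()
batch = torch.randn(8, 1, 64, 64)           # eight hypothetical radar image patches
logits = model(batch)
print(logits.shape)                         # torch.Size([8, 4]): one score per class
```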
Another type of deep-learning neural network featured prominently in the
news is Generative Adversarial Networks (GAN), where one neural network
trained to create imagery (GANT) is checked by another trained in the proper
boundaries of the imagery (GANC). This approach has been used successfully
in creating many anatomically correct, “deep fake” human facial images that
are highly accurate in their features yet do not represent any actual human
beings.20 Such networks can be used to supplement a variety of actual sensor
data for training purposes where the characteristics and constraints of real
sensor data are known but difficult to obtain in large volumes.
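A compact sketch of this generator and discriminator arrangement applied to 1-D sensor traces is given below. The network sizes, the sine-like stand-in for “real” sensor data and the single training step are illustrative assumptions; a useful GAN requires large volumes of data and many training iterations.

```python
import torch
import torch.nn as nn

signal_len, noise_dim = 64, 16

# Generator (the "GANT" role): noise in, synthetic sensor trace out
G = nn.Sequential(nn.Linear(noise_dim, 64), nn.ReLU(), nn.Linear(64, signal_len), nn.Tanh())
# Discriminator (the "GANC" role): trace in, probability of being real out
D = nn.Sequential(nn.Linear(signal_len, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

real = torch.sin(torch.linspace(0, 6.28, signal_len)).repeat(32, 1)   # stand-in "real" sensor data
noise = torch.randn(32, noise_dim)

# One discriminator update: learn to separate real traces from generated ones
fake = G(noise).detach()
d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# One generator update: learn to fool the discriminator
g_loss = bce(D(G(noise)), torch.ones(32, 1))
opt_g.zero_grad()
g_loss.backward()
opt_g.step()

print(f"d_loss={d_loss.item():.3f}  g_loss={g_loss.item():.3f}")
```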
and have been correctly implemented in accordance with the overall system
design, and that interoperability of their individual components when inte-
grated into the entire system is demonstrated. Such a process will identify any missing requirements that should be considered, flag unnecessary requirements and confirm that the design process effectively implemented the requirements. This
topic becomes difficult with machine learning and deep-learning AI, where
unsupervised learning is used when developing the system to define its own
requirements as to what is and is not important based on the data presented
to it during training. This issue is somewhat less significant when dealing
with individual sensors and sensor systems, but becomes vitally important
when the concepts of sensor and smart sensor fusion, sensor degradation,
and cybersecurity are considered.
Validation of sensor system performance must also be accomplished to
determine that the entire system can correctly implement the requirements.
The validation process determines whether the requirements contained and
presented within training datasets were reasonable in terms of their ability to
reflect the actual functions the system must perform, and whether the design
was adequate to achieve the desired functions. Problems are likely to arise when smart sensors act upon biased training datasets, or datasets of insufficient scope, to reach conclusions or solve problems in ways that are correct, or in novel, unanticipated ways that may or may not be correct.
Testing of sensor system performance must also be performed to cover the
breadth and depth of system functionality to determine whether it is suffi-
ciently robust and capable of performing across all required conditions and
operational settings. This will include exercising all system functions with a
wide variety of test cases to demonstrate its capability to perform its required
functions and to ensure that, in the event of failure to perform properly, it
can do so in a manner that is not destructive and provides traceability of the
cause(s) of failure. The capability of the system to achieve interoperability
with other systems must also continue to be demonstrated throughout this
process.
A key element of testing smart sensors is to consider the possibility of achieving verification and validation of machine learning solutions using statistical processes. However, deep-learning AI solutions are not amenable to this approach due to their inherent lack of visibility into the functioning of their internal processing elements. Merely implementing black box testing techniques based upon system requirements and design is insufficient, as decisions are being made in real time and it may not be possible to effectively reproduce the same outcome by replicating the exact stimulus recorded by the sensors, a task that is essentially impossible to achieve in an operational environment. This is an area that is still in its infancy, and attempting to achieve a definitive solution to the verification and validation of deep-learning AI is beyond the scope of the present-day state of the art.
approaching at fast speed without running lights. What has actually hap-
pened here is the fusion of heterogeneous data from different sensors to com-
pare, discriminate and comprehensively analyze the acquired information
with memory or experience to obtain a more accurate, complete, reliable and consistent interpretation and description of the measured phenomenon than is possible using a single sensor.25
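A minimal numerical sketch of this idea follows: two independent range estimates of the same contact, one from Radar and one from a camera-based estimator, are combined by inverse-variance weighting so that the more certain sensor contributes more to the fused value. The figures are illustrative assumptions rather than measurements from any real installation.

```python
def fuse(estimates: list[tuple[float, float]]) -> tuple[float, float]:
    """Inverse-variance weighted fusion of (value, variance) estimates of one quantity."""
    weights = [1.0 / var for (_, var) in estimates]
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Range to the same contact: radar says 1850 m (tight), camera says 1900 m (loose)
radar = (1850.0, 25.0 ** 2)     # standard deviation about 25 m
camera = (1900.0, 80.0 ** 2)    # standard deviation about 80 m
value, variance = fuse([radar, camera])
print(f"fused range = {value:.0f} m, std = {variance ** 0.5:.0f} m")
```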
Sensor data fusion is a general term that can refer to any method of com-
bining data from multiple sensors. This can include traditional sensors such
as cameras, Radar, Sonar, and Lidar; and also smart sensors that can pro-
cess data locally and make decisions on their own. Much research in this
area has been performed using traditional sensors focusing on specific vessel
applications related to equipment test, diagnosis and prognosis;26,27 gaining
fuel efficiencies by examining the ship’s speed, fuel consumption and weather
parameters;28 and improving object recognition capabilities for autonomous
vessel navigation and collision avoidance.29 The results thus far have been
promising and are providing measurable improvements in sensor system
designs and gains in efficiency and situational awareness.
However, unlike traditional sensors, smart sensors can communicate
with each other and share data, metadata and analytical results in real time,
allowing them to collaborate and make better-informed decisions that can
aid mariners and directly influence autonomous agents operating unmanned
vehicles. Further gains are needed in the design of smart sensors and sensor
systems to achieve greater interoperability and for their use to become more
commonplace. The fusion of measurement analytical results from multiple
smart sensors, not just the measurements themselves, is likely to yield game-
changing improvements over traditional sensor fusion through improve-
ments in accuracy and reliability, greater efficiency, enhanced adaptability to
challenging situations, increased situational awareness, enhanced decision-
making, reduced costs and greater environmental compliance.
perform more and more. However, crew members shouldn’t worry. While
AI and machine learning can streamline and assist with operations, they
can only automate so much. Individuals offer unique skills and capabil-
ities that machines cannot replicate, making a human presence on ships
invaluable.
This is not to say that seafarers will be unaffected by AI, and this chapter
illustrates how AI can help to improve safety by enhancing situational aware-
ness across all seagoing departments. Improvements in seafarer education methods, techniques and facilities that enhance realism and broaden the scope of training can better prepare crew members for a future of working with AI, enhancing their skills and employment opportunities.
References
1 Eric Niiler. The Robot Ships Are Coming … Eventually. Wired Magazine,
Business. 30 October 2020. www.wired.com/story/mayflower-autonom
ous-ships/
2 Andrew Tunnicliffe. Using AI to Navigate the Tricky Topic of Ship Navigation.
Ship Technology. 18 April 2019. www.ship-technology.com/features/ship-nav
igation-system/
3 Annie Brown. Utilizing AI and Big Data to Reduce Costs and Increase Profits
in Departments across an Organization. 13 April 2021. Forbes Magazine.
www.forbes.com/sites/sethmatlins/2023/06/22/the-forbes-worlds-most-influent
ial-cmos-list-2023/
4 Allianz. 2012. Safety and Shipping 1912– 2012: From Titanic to Costa
Concordia. Allianz Global Corporate & Specialty. 3 March 2012. www.agcs.allianz.com/PDFs/Reports/
5 CBI. 2018. Massive Cargo Ships Are Going Autonomous. CBInsights. 28
August 2018. https://app.cbinsights.com/research/autonomous-shipping-
trends/
6 Staff. AI: The Worst-Case Scenario. The Week. https://theweek.com/artificial-
intelligence/1024341/ai-the-worst-case-scenario
7 Matt Egan. Exclusive: 42% of CEOs Say AI Could Destroy Humanity in Five
to Ten Years. CNN Business. 14 June 2023. www.cnn.com/2023/06/14/busin
ess/artificial-intelligence-ceos-warning/index.html
8 Yarden Gross. Why AI Is a First Responder for Overloaded Crews in Shipping’s
Toughest Era Yet in International Shipping News. 19 August 2021. www.helle
nicshippingnews.com/why-ai-is-a-first-responder-for-overloaded-crews-in-
shippings-toughest-era-yet/
9 Alexander Love. AI in Shipping: Areas to Watch in 2020. Ship Technology. 7
January 2020. www.ship-technology.com/features/ai-in-shipping/
10 Jessica Miley. 11 Times AI Beat Humans at Games, Art, Law and Everything
in Between. 12 March 2018. Interesting Engineering, Inc. https://interestingengi
neering.com/innovation/11-times-ai-beat-humans-at-games-art-law-and-eve
rything-in-between
11 Lakshmi Varanasi. AI Models Like ChatGPT and GPT-4 Are Acing Everything
from the Bar Exam to AP Biology. 25 June 2023. www.businessinsider.com/
list-here-are-the-exams-chatgpt-has-passed-so-far-2023-1?op=1
12 Brien Posey. What Is a Smart Sensor? Techtarget.com. 29 June 2023. www.tec
htarget.com/iotagenda/definition/smart-sensor
13 NCIS. Free Smartwatches for Sailors Might Be Covert Cyberattack. Maritime
Executive. 27 June 2023. www.maritime-executive.com/article/ncis-free-smart
watches-for-sailors-might-be-covert-cyberattack
14 FarSounder 2018. FarSounder Joins NOAA as a Trusted Node. FarSounder,
Inc. Press Release. 17 October 2018. www.farsounder.com/about/press_releases
15 R. Glenn Wright. Scientific Data Acquisition Using Navigation Sonar. IEEE/
MTS Oceans Conference. Anchorage Alaska, USA. September 2017.
16 IBM. What Is Machine Learning? IBM, New York. www.ibm.com/topics/mach
ine-learning
17 Nic Gardner. Current and Future Uses of Artificial Intelligence in the Maritime
Industry. Thetius I.Q. 2021. https://thetius.com/current-and-future-uses-of-art
ificial-intelligence-in-the-maritime-industry/
18 John Nash. Checkmate Humanity: In Four Hours, a Robot Taught Itself Chess,
then Beat a Grandmaster with Moves Never Devised in the Game’s 1,500-year
History. The Daily Mail. 21 December 2017. www.dailymail.co.uk/sciencet
ech/article-5204513/Robot-taught-never-seen-chess-moves-hours.html
19 Coursera. 10 Machine Learning Algorithms to Know in 2023. Coursera,
Inc. Updated 16 June 2023. www.coursera.org/articles/machine-learning-alg
orithms
20 Tero Karras, Samuli Laine, and Timo Aila. A Style-Based Generator Architecture
for Generative Adversarial Networks. NVIDIA. arXiv:1812.04948v3 [cs.NE],
29 March 2019.
21 R. Glenn Wright. In-Stride Detection of Sensor Degradation. GMATEK, Inc.
Final Report. Revision: A, 28 April 2022. Contract: N6833520G2005. Naval
Sea System Command, Washington, DC.
22 Emily A. Vogels. A Majority of Americans Have Heard of ChatGPT, but Few
Have Tried It Themselves. Pew Research Center. 24 May 2023. www.pewresea
rch.org/short-reads/2023/05/24/a-majority-of-americans-have-heard-of-chat
gpt-but-few-have-tried-it-themselves/
23 ChatGPT Fails: 13 Common Errors and Mistakes You Need to Know. Search
Engine Land. Accessed 5 July 2023. https://searchengineland.com/chatgpt-fails-
errors-mistakes-400153
24 Google Bard AI Chatbot Advantages, Disadvantages, Review, Features & Bard
vs ChatGPT. Online Sciences. Accessed 5 July 2023. www.online-sciences.com/
technology/google-bard-ai-chatbot-advantages-disadvantages-review-features-
bard-vs-chatgpt/
25 Guo Chen, Zhigui Liu, Guang Yu, and Jianhong Liang. (2021) “A new view
of multisensor data fusion: research on generalized fusion,” Mathematical
Problems in Engineering, 2021, Article ID 5471242, p. 21. https://doi.org/
10.1155/2021/5471242
26 Wen Jiang, Weiwei Hu, and Chunhe Xie. (2017). A new engine fault diagnosis
method based on multi-sensor data fusion. Applied Sciences, 7(3), 280. https://
doi.org/10.3390/app7030280
27 Youyu Zeng and Qiang Xie. Real-Time Ship Fault Diagnosis Algorithm based
on the fusion of CNN and RNN. Proc. 2020 the 10th International Workshop
on Computer Science and Engineering (WCSE 2020). Shanghai 19–21 June
2020, pp. 70–75. doi: 10.18178/wcse.2020.06.012. ISBN 978-981-14-4787-7
28 Yuquan Du, Yanyu Chen, Xiaohe Li, Alessandro Schönborn, and Zhuo
Sun. (2022) Data Fusion and Machine Learning for Ship Fuel Efficiency
Modeling: Part III –Sensor Data and Meteorological Data. Communications
in Transportation Research, 2, p. 100072, ISSN 2772-4247, https://doi.org/
10.1016/j.commtr.2022.100072
29 DNV- Maritime. New Research Project to Investigate Sensor Fusion and
Collision Avoidance for Advanced Ships. DNV. Accessed 29 June 2023. www.
dnv.com/news/new-research-project-to-investigate-sensor-fusion-and-collision-
avoidance-for-advanced-ships-26127
30 P. Roberts. Effects of Artificial Intelligence on Maritime Training. 31 March
2023. Shoreside Maritime Training Magazine. https://shoresidetraining.co.uk/
news/the-effects-of-artificial-intelligence-on-maritime-training/
31 Raymond Antoni Kaspersen (PM), et al. Insights into Seafarer Training and
Skills Needed to Support a Decarbonized Shipping Industry. DNV. Report
no. 2022-0814, rev. 0. 2022-11-04. General Conclusions 3 and 4, p. 8.
32 Marcus Hand. Attracting Future Seafarers and Engine Room Ergonomics.
Seatrade Maritime News. 29 June 2023. www.seatrade-maritime.com/crewing/
attracting-future-seafarers-and-engine-room-ergonomics
33 Sophia Bernazzani, 10 Jobs Artificial Intelligence Will Replace (and 10 That
Are Safe). HubSpot Blog. Updated: 21 February 2020. https://blog.hubspot.
com/marketing/jobs-artificial-intelligence-will-replace
34 The Staunton Standard: Evolution of the Modern Chess Set. World Chess Hall
of Fame. St. Louis, MO, USA. https://worldchesshof.org/exhibit/staunton-stand
ard-evolution-modern-chess-set
35 Ariana Bindman. Incompetent Driverless Cars Are Wreaking Havoc on San
Francisco. SFGATE, 6 June 2023. www.sfgate.com/tech/article/cruise-waymo-
driverless-cars-san-francisco-18132953.php
36 Nearly 400 Car Crashes in 11 Months Involved Automated Tech, Companies
Tell Regulators. The Associated Press. 15 June 2022. www.npr.org/2022/06/15/
1105252793/nearly-400-car-crashes-in-11-months-involved-automated-tech-
companies-tell-regul
37 Ed Leefeldt and Amy Danise. Injured in a Car Accident? AI Is Your Legal
Adversary. Forbes Magazine. 9 June 2023. www.forbes.com/advisor/car-insura
nce/car-accident-injury-artificial-intelligence/
38 Tracey Mayhew. Uncertainty for Maritime Workforce of the Future. MARAD
Symposium, Achieving Critical MASS. 23 July 2019. www.maritime.dot.gov/
sites/marad.dot.gov/files/docs/about-us/foia/11696/seafarers-international-
union-uncertainty-maritime-workforce-future.pdf
39 Michael Grey. The Case for Fewer Eyes on the Bridge at Night. Seatrade
Maritime News. 20 June 2023. www.seatrade-maritime.com/crewing/case-
fewer-eyes-bridge-night
40 The Maritime Executive. Will Owners of Autonomous Ships Be Liable for
Autonomous Mistakes? 19 May 2023. https://maritime-executive.com/article/
will-owners-of-autonomous-ships-be-liable-for-autonomous-mistakes
41 MITAGS. How AI Is Changing the Maritime Industry. Maritime Institute of
Technology and Graduate Studies. Linthicum, MD, USA. Blog, News. 10 April
2023. www.mitags.org/ai-impact-maritime-industry/
Chapter 7
Shore and Intership Communications
DOI: 10.1201/9781003377900-7
Channel   Notes    Ship Transmit (MHz)   Coast Transmit (MHz)   Remarks
60        m        156.025               160.625
1         m        156.050               160.650
1001               156.050               156.050
61        m        156.075               160.675
2         m        156.100               160.700
62        m        156.125               160.725
3         m        156.150               160.750
63        m        156.175               160.775
1063               156.175               156.175
4         m        156.200               160.800
64        m        156.225               160.825
5         m        156.250               160.850
1005               156.250               156.250
65        m        156.275               160.875
1065               156.275               156.275
6         f        156.300               —
2006      r        160.900               160.900
66        m        156.325               160.925
1066               156.325               156.325
7         m        156.350               160.950
1007               156.350               156.350
67        h        156.375               156.375
8                  156.400               —
68                 156.425               156.425
9         i        156.450               156.450
69                 156.475               156.475
10        h,q      156.500               156.500
70        f,j      156.525               156.525               Digital selective calling for distress
11        q        156.550               156.550
71                 156.575               156.575
12                 156.600               156.600
72        i        156.625               —
13        k        156.650               156.650
73        h,i      156.675               156.675
14                 156.700               156.700
74                 156.725               156.725
15        g        156.750               156.750
75        n,s      156.775               156.775
16        f        156.800               156.800               Distress, safety and calling
76        n,s      156.825               156.825
17        g        156.850               156.850
77                 156.875               —
18        m        156.900               161.500
1018               156.900               156.900
78        m        156.925               161.525
1078               156.925               156.925
2078      mm       161.525               161.525
19        m        156.950               161.550
1019               156.950               156.950
2019      mm       161.550               161.550
79        m        156.975               161.575
1079               156.975               156.975
2079      mm       161.575               161.575
20        m        157.000               161.600
1020               157.000               157.000
2020      mm       161.600               161.600
80                 157.025               161.625
1080               157.025               157.025
21        y,wa     157.050               161.650
1021               157.050               157.050
81        y,wa     157.075               161.675
1081               157.075               157.075
22        y,wa     157.100               161.700
1022               157.100               157.100
82        x,y,wa   157.125               161.725
1082               157.125               157.125
23        x,y,wa   157.150               161.750
1023               157.150               157.150
83        x,y,wa   157.175               161.775
1083               157.175               157.175
24        w,wx,x   157.200               161.800
1024      w,wx,x   157.200               —
2024      w,wx,x   161.800               161.800               Digital
84        w,wx,x   157.225               161.825
1084      w,wx,x   157.225               —                     Digital
2084      w,wx,x   161.825               161.825
25        w,wx,x   157.250               161.850
1025      w,wx,x   157.250               —                     Digital
2025      w,wx,x   161.850               161.850
85        w,wx,x   157.275               161.875
1085      w,wx,x   157.275               —
2085      w,wx,x   161.875               161.875
26        w,ww     157.300               161.900
1026      w,ww     157.300               —
2026      w,ww     161.900               —
86        w,ww     157.325               161.925
1086      w,ww     157.325               —
2086      w,ww     161.925               —
27        z,zx     157.350               161.950
1027      z,zz     157.350               157.350
ASM 1     z        161.950               161.950
87        z,zz     157.375               157.375
28        z,zx     157.400               162.000
1028      z,zz     157.400               157.400
ASM 2     z        162.000               162.000
88        z,zz     157.425               157.425
AIS 1     f,l,p    161.975               161.975
AIS 2     f,l,p    162.025               162.025
Table 7.3 (Continued) ITU MF/HF channels (coast transmit / ship transmit)

2 MHz Simplex Channels: 2,003; 2,082.5; 2,086; 2,093; 2,142; 2,182; 2,203; 2,214; 2,635; 2,638; 2,738; 2,782; 2,830 kHz.
4 MHz Duplex Channels 401–427: coast transmit 4,357–4,435 kHz; ship transmit 4,068–4,143 kHz; 3 kHz spacing.
4 MHz Duplex Channel 428: coast transmit 4,351 kHz; ship transmit varies.
4 MHz Duplex Channel 429: coast transmit 4,354 kHz; ship transmit varies.
4 MHz Simplex Channels: 4,146–4,149 kHz; 3 kHz spacing.
4 MHz Shared Channels 1 through 21: 4,000–4,060 kHz; 3 kHz spacing.
4 MHz (Calling) 421: coast transmit 4,147 kHz; ship transmit 4,125 kHz; distress and safety working 4,125 kHz simplex.
4 MHz (USCG Calling) 424: coast transmit 4,435 kHz; ship transmit 4,134 kHz.
6 MHz Duplex Channels 601–608: coast transmit 6,501–6,522 kHz; ship transmit 6,200–6,212 kHz; 3 kHz spacing.
6 MHz Simplex Channel: 6,224–6,230 kHz; 3 kHz spacing.
8 MHz Duplex Channels 801–832: coast transmit 8,719–8,812 kHz; ship transmit 8,195–8,291 kHz; 3 kHz spacing.
Distress 833: 8,291 kHz; distress and safety working 8,291 kHz simplex.
8 MHz Duplex Channels 834–837: coast transmit 8,707–8,716 kHz; ship transmit varies.
8 MHz Simplex Channel: 8,294–8,297 kHz; 3 kHz spacing.
8 MHz (USCG Calling) 816: coast transmit 8,764 kHz; ship transmit 8,240 kHz.
8 MHz (Calling) 821: coast transmit 8,779 kHz; ship transmit 8,255 kHz.
12 MHz Duplex Channels 1201–1241: coast transmit 13,077–13,197 kHz; ship transmit 12,230–12,350 kHz; 3 kHz spacing.
12 MHz Simplex Channels: 12,353–12,365 kHz; 3 kHz spacing.
12 MHz (USCG Calling) 1205: coast transmit 13,089 kHz; ship transmit 12,242 kHz.
12 MHz (Calling) 1221: coast transmit 13,137 kHz; ship transmit 12,290 kHz; distress working 12,290 kHz simplex.
18/19 MHz Duplex Channels 1801–1815: coast transmit 19,755–19,797 kHz; ship transmit 18,780–18,822 kHz; 3 kHz spacing.
18/19 MHz Simplex Channels: 18,825–18,843 kHz; 3 kHz spacing.
18/19 MHz (Calling) 1806: coast transmit 19,770 kHz; ship transmit 18,795 kHz.
22 MHz Duplex Channels 2201–2253: coast transmit 22,696–22,852 kHz; ship transmit 22,000–22,156 kHz; 3 kHz spacing.
22 MHz Simplex Channels: 22,159–22,177 kHz; 3 kHz spacing.
22 MHz (Calling) 2221: coast transmit 22,756 kHz; ship transmit 22,060 kHz.
25/26 MHz Duplex Channels 2501–2510: coast transmit 26,145–26,172 kHz; ship transmit 25,070–25,097 kHz; 3 kHz spacing.
25/26 MHz Simplex Channels: 25,100–25,118 kHz; 3 kHz spacing.
25/26 MHz (Calling) 2510: coast transmit 26,172 kHz; ship transmit 25,097 kHz.
Communication range at these frequencies depends upon the time of day and atmospheric conditions and can vary from only short distances to several thousand miles.
The primary method of broadcasting Marine Safety Information (MSI)
is NAVTEX, which can provide coverage up to around 400 nautical miles
out to sea although this depends on many reception issues. All NAVTEX
broadcasts are made on 518 kHz, using narrow-band direct printing 7-unit
forward error correcting (FEC or Mode B) transmission. This service is com-
plemented by radio telephony with networks of remote radio sites around
coasts providing MF coverage of 150 miles.
HF radiofacsimile broadcasts originating from various worldwide loca-
tions are shown below.16 All emissions are amplitude modulation, single side-
band, suppressed carrier, analog (J3C) and frequencies are subject to change.
Australia: 2,628 kHz, 5,100 kHz, 5,755 kHz, 7,535 kHz, 10,555
kHz, 11,030 kHz, 13,920 kHz, 15,615 kHz, 20,469 kHz,
18,060 kHz.
Canada: 3,253.0 kHz, 4,271 kHz, 4,416 kHz, 4,292.0 kHz,
6,915.1 kHz, 6,496.4 kHz, 7,710.0 kHz, 8,456.0 kHz,
10,536 kHz, 13,510 kHz.
China: 4,199.75 kHz, 8,412.5 kHz, 12,629.25 kHz,
16,826.25 kHz.
Chile: 4,228.0 kHz, 4,322.0 kHz, 8,677.0 kHz, 8,696.0 kHz,
17,146.4 kHz.
Greece: 4,481 kHz, 8,105 kHz.
Germany: 3,855 kHz, 7,880 kHz, 13,882.5 kHz.
Japan: 148 kHz, 3,622.5 kHz, 7,795 kHz, 13,988.5 kHz.
Japan/Singapore: 4,316 kHz, 8,467.5 kHz, 12,745.5 kHz, 16,035 kHz,
16,971 kHz, 17,069.6 kHz, 17,430 kHz, 22,542 kHz.
Korea Republic: 3,585 kHz, 5,857.5 kHz, 7,433.5 kHz, 9,165 kHz,
13,570 kHz.
New Zealand: 3,247.4 kHz, 5,807 kHz, 9,459 kHz, 13,550.5 kHz,
16,340.1 kHz.
Russia: 4,481 kHz, 5,336 kHz, 6,446 kHz, 7,907 kHz, 7,908.8
kHz, 8,105 kHz, 8,444 kHz, 10,130 kHz.
Thailand: 7,395 kHz.
United Kingdom: 2,618.5 kHz, 4,610.0 kHz, 8,040.0 kHz, 11,086.5 kHz.
United States: 2,054 kHz, 4,298 kHz, 4,317.9 kHz, 4,346 kHz, 8,459
kHz, 8,503.9 kHz, 8,682 kHz, 12,412.5 kHz, 12,786 kHz,
12,789.9 kHz, 17,146.4 kHz, 17,151.2 kHz, 22,527 kHz.
be used for navigation purposes. A list of many of these signals and the fre-
quencies on which they are broadcast appears below:
weather conditions, and enable distress calls with pertinent location and iden-
tification information with the push of a button. GMDSS comprises the safety procedures, types of equipment and communication protocols used for safety and rescue operations of distressed ships, boats and aircraft. It also allows ships
to be located quickly and efficiently in the event of a distress situation. A major
component is the COSPAS-SARSAT satellite search and rescue (SAR) system,
where instruments are flown on board LEO, MEO and GEO satellites provided
by the United States, Russian Federation, India and the European Union.27
These satellites are capable of detecting signals coming from the Earth’s surface transmitted by an emergency distress beacon that, when activated, broadcasts on a frequency of 406 MHz to indicate the position of a distressed vessel.
All GMDSS ships must carry a 406 MHz Emergency Position Indicating
Radiobeacon (EPIRB), a VHF radio capable of transmitting and receiving
DSC and radiotelephony, a Navigational Telex (NAVTEX) receiver, a SART,
backup power systems and two-way VHF portable radios. In GMDSS Sea Area A2, defined by the radiotelephone coverage of at least one MF coast station in which continuous DSC (2,187.5 kHz) alerting and radiotelephony services are available, ships must also carry a DSC-equipped MF radiotelephone.28 For all voyages conducted in GMDSS Sea Areas A3 and A4, vessels must carry either an Inmarsat F77, B or C ship earth station, or a DSC-equipped HF radiotelephone/telex.28
Companies such as Inmarsat/Viasat using radionavigation transponders
can enable Satellite-Based Augmentation System (SBAS) services around the
world for coast guard and other organizations to enhance standard GPS/
Galileo accuracy of 5–10 meters to as little as 10 cm using satellite connect-
ivity, land-based infrastructure and software.30 Precise tracking such as this
could enable pinpoint safety navigation and help emergency services reach
vessels in distress more quickly.
a (Top) Pilot vessel – COLREGS Rule 29. (Middle) Sailing vessel underway, or single light with three quadrants: white, red and green – Rule 25. (Bottom) Vessel at anchor – Rule 30, or power-driven vessel of less than 7 meters in length whose maximum speed does not exceed 7 knots – Rule 23.
b Vessel not under command or restricted in her ability to maneuver – Rule 27.
c Vessel constrained by her draft – Rule 28. Vessel aground – Rule 30.
d Vessel engaged in mine clearance operations (<50 m) – Rule 27.
e Vessel proceeding under sail when also being propelled by machinery, except that a vessel of less than 12 meters in length is not required to exhibit the dayshape – Rule 25.
f (Top) Vessel engaged in trawling. (Bottom) Vessel engaged in fishing other than trawling – Rule 26.
g Same as f, except additional daymark indicating gear extended beyond 150 meters horizontally from the vessel – Rule 26.
These specific visual signals, when combined with the navigation lights displayed while making way that reflect various vessel configurations, must be distinguished from deck lighting and other background lights likely to be encountered. This requires knowledge and ability that is relatively easily taught by educators and learned by mariners as part of their normal training and experience.
However, in addition to the basic steering and sailing rules of COLREGS Part B, these Lights and Shapes requirements of COLREGS Part C must be included in all training regimens, both for crewed vessels and for unmanned and autonomous vehicles using visible light cameras, with the development process properly verified and the results validated and tested. This includes training encompassing all ship viewing perspectives at sea through 360° in varying
visibility and weather conditions before such machines should be released
onto the high seas and inland waterways. Some research in this regard has
been published in the literature, but its scope is limited with further investi-
gation still needed.31
thereby position. These characteristics include green and red lights that mark
the sides of channels and locations of wrecks or obstructions that are to be
passed by keeping these lights on the port (left) hand of a vessel. Green and
red lights are also used on Preferred Channel Marks, where the topmost band
is green and red, respectively. Yellow or white lights having no lateral signifi-
cance can also be displayed on AtoN, whose purpose may be determined by
their shape, color, letters or numbers, and the light rhythm employed.
Light rhythms vary greatly, displaying regularly flashing or regularly occulting characteristics. Flashing lights (frequency not exceeding 30 flashes per minute) may be used in many situations, while for lights requiring a distinct cautionary significance, such as at sharp turns, sudden channel constrictions, wrecks, or obstructions, a quick flashing light rhythm (60 flashes or more per minute) may be used. Composite
group flashing light rhythms consisting of groups of two flashes followed by
one flash may be found on some AtoN in addition to Safe Water Marks
that display a white Morse Code “A” rhythm (short-long flash) and Isolated
Danger Marks that display a white group flashing two flashes. Other lights
include those used on Special Marks that display yellow lights with fixed
or slow flashing rhythms, mooring buoys and information and regulatory
marks displaying white lights of various rhythms.
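These rhythm definitions lend themselves to simple automated classification once individual flashes have been detected. The following minimal sketch assumes a hypothetical upstream detector that reports the number of flashes observed over a one-minute window; the thresholds simply mirror the flash rates described above.

    def classify_light_rhythm(flashes_per_minute: float) -> str:
        """Rough classification of an AtoN light rhythm from its flash rate.

        Thresholds follow the conventions described in the text: flashing
        lights do not exceed 30 flashes per minute, while quick flashing
        lights show 60 or more flashes per minute.
        """
        if flashes_per_minute <= 0:
            return "fixed or occulting (no discrete flashes detected)"
        if flashes_per_minute <= 30:
            return "flashing"
        if flashes_per_minute >= 60:
            return "quick flashing"
        return "indeterminate"

    # Example: a (hypothetical) detector reports 12 flashes in the last minute.
    print(classify_light_rhythm(12))   # flashing
    print(classify_light_rhythm(75))   # quick flashing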
Lights containing sector guidance showing one color from most directions
and a different color or colors over definite arcs of the horizon as indicated
on the appropriate nautical chart are sometimes used to mark shoals or warn
mariners of other dangers. Approximate bearing information is provided
where the observer can note a change of color as the boundary between the
sectors is crossed.
Some research in the area of computer-based recognition of AtoN lights
and colors has appeared in the literature.33 However, there still exists much
work to be done before unmanned and autonomous vehicles will be capable
of dealing with visual AtoN recognition at night. Further information on the
IALA A and B system of buoyage and aids to navigation is available from
IALA.34
D (Delta) Keep clear of me; I am maneuvering with difficulty.
E (Echo) I am altering my course to starboard.
F (Foxtrot) I am disabled; communicate with me.
G (Golf) I want a pilot.
H (Hotel) I have a pilot on board.
I (India) I am altering my course to port.
J (Juliet) Vessel on fire; keep clear.
K (Kilo) I want to communicate with you.
L (Lima) Stop your vessel instantly; I have something important to communicate.
M (Mike) My vessel is stopped and making no way through the water.
N (November) No (negative).
O (Oscar) Man overboard.
P (Papa) In port: all personnel return to ship; vessel is about to sail. At sea: may be used by fishing vessels to mean "My nets have come fast upon an obstruction".
Q (Quebec) I request free pratique.
R (Romeo) Reverse course.
S (Sierra) Engines are going astern.
T (Tango) Keep clear; engaged in trawling.
U (Uniform) You are heading into danger.
V (Victor) I require assistance.
W (Whiskey) I require medical assistance.
X (X-ray) Stop carrying out your intentions and watch for my signals.
Y (Yankee) I am dragging anchor.
Z (Zulu) I require a tug.
Visual and audible signals within the line of sight of potential rescuers may provide the most expeditious response, but they may also be subject to misunderstanding by untrained observers and automated assistants, resulting in a significant lag before the significance of distress signals is made clear, if it is recognized at all. In addition to these local signals, distress signals should
be sent to coast guards and regional search and rescue authorities using the
more reliable and noticeable long-range emergency notification capabilities
provided through VHF and HF radio communications, Inmarsat, and other
satellite service providers.
Despite their effectiveness, the reliability of detection is reduced since
monitoring the horizon to detect visual and audible distress signals is mainly
accomplished by bridge watchstanders and observers who happen to gaze in
the right direction at the right time. Smart visible light and infrared cameras
with 360° coverage of the horizon and trained in flare, fire, and smoke detec-
tion using AI techniques can greatly assist in detecting such events through
continuous monitoring and the ability to immediately notify watchstanders
of such an occurrence. Camera coverage is becoming more widespread as
onboard security interests increase, and integration of visual signal compre-
hension technology can be accomplished relatively easily. Likewise, moni-
toring of audio to detect and locate gunfire is already possible and in use
throughout many of the larger cities of the world. Similar methods can be
used to train smart microphones and audio systems to detect ship horns emit-
ting warning, danger and distress signals.
• One short blast (and light flash of at least one second’s duration, if
equipped) to mean, “I am altering my course to starboard”.
• Two short blasts (and light flashes, if equipped) to mean, “I am altering
my course to port”.
• Three short blasts (and light flashes, if equipped) to mean, “I am oper-
ating astern propulsion”.
A yellow light may be used in lieu of a white light in some world regions
depending on national regulations.
When vessels are in sight of one another in a narrow channel or fairway, a
vessel intending to overtake another shall indicate intentions by the follow-
ing signals:
• Two prolonged blasts followed by one short blast (and light flashes, if
equipped) to mean, “I intend to overtake you on your starboard side”.
• Two prolonged blasts followed by two short blasts (and light flashes, if
equipped) to mean, “I intend to overtake you on your port side”.
The vessel about to be overtaken shall indicate her agreement by the fol-
lowing signal on her whistle:
• One prolonged, one short, one prolonged and one short blast (and light
flashes, if equipped), in that order.
When vessels are in sight of one another and approaching each other
where, from any cause, either vessel fails to understand the intentions or
actions of the other, or is in doubt whether sufficient action is being taken
by the other to avoid collision, the vessel in doubt shall immediately indicate
such doubt by giving at least five short and rapid blasts (and light flashes, if
equipped).
Also, a vessel, when nearing a bend or an area of a channel or fairway
where other vessels may be obscured by an intervening obstruction shall
sound one prolonged blast. Such signal shall be answered with a prolonged
blast by any approaching vessel that may be within hearing around the bend
or behind the intervening obstruction.
These signals presuppose proper training of seafarers in the detection of
horns and lights that includes the ability to discern their presence in the midst
of background clutter noise. Bridge aids for watchstanders along with view-
ing and listening technology on unmanned and autonomous vehicles using
directional microphones can narrow the relative bearing of the signal, and
one or more cameras can be used to positively identify the signaling vessel.
Training of AI-based smart microphones using the specific horn blast pat-
terns and frequencies designated for use by horns, bells and gongs combined
with smart camera training using these same patterns represented by white
lights can today make possible the automated detection and identification of
these signals using existing technology.
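To illustrate the kind of pattern matching involved, the sketch below assumes a hypothetical acoustic front end has already segmented whistle sound into individual blasts and measured their durations in seconds; a short blast of about one second and a prolonged blast of four to six seconds are then mapped onto the maneuvering and warning signals listed above.

    # Minimal sketch: map measured whistle blast durations (seconds) onto the
    # COLREGS maneuvering and warning signals described in the text. The blast
    # durations are assumed to come from a hypothetical acoustic detector.
    SHORT, PROLONGED = "S", "P"

    SIGNALS = {
        ("S",): "I am altering my course to starboard",
        ("S", "S"): "I am altering my course to port",
        ("S", "S", "S"): "I am operating astern propulsion",
        ("P", "P", "S"): "I intend to overtake you on your starboard side",
        ("P", "P", "S", "S"): "I intend to overtake you on your port side",
        ("P", "S", "P", "S"): "Agreement by the vessel about to be overtaken",
        ("P",): "Vessel nearing a bend or an obscured area of a channel",
    }

    def classify_blast(duration_s: float) -> str:
        # Short blast: about one second; prolonged blast: four to six seconds.
        return SHORT if duration_s < 2.0 else PROLONGED

    def interpret(durations):
        pattern = tuple(classify_blast(d) for d in durations)
        if len(pattern) >= 5 and all(b == SHORT for b in pattern):
            return "Doubt/danger signal (at least five short and rapid blasts)"
        return SIGNALS.get(pattern, "Unrecognized signal")

    # Example: two prolonged blasts followed by one short blast.
    print(interpret([5.1, 4.8, 0.9]))   # overtaking on your starboard side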
AIS sensor data. Training of smart microphones on the specific patterns and
frequencies mandated for horn, bell and gong use under these circumstances
can result in speedy detection and positive identification of nearby vessels.
7.8 Summary
This chapter has presented many different modes of communication as well
as visual and audible messaging and content that can take years for trained
mariners to master. Research has been performed, but few case studies exist where attempts to automate individual tasks have been overwhelmingly successful, and such efforts continue. However, these technologies are at best still experimental and as yet unproven.
Also notable is the fact that, with the introduction of unmanned and
autonomous technologies into the maritime mix, it appears that exceptions
are being made at the national levels to eliminate the need to possess full
knowledge of the requirements, capabilities, applications and means of use
for all communication methods expected of seagoing vessels. The full significance of granting such exceptions for overall maritime safety and safety of navigation is yet to be determined.
References
1 Gavin Wright. Metropolitan Area Network (MAN), definition. TechTarget.
February 2021. www.techtarget.com/searchnetworking/definition/metropoli
tan-area-network-MAN
2 Mike Wall, 2022. SpaceX Gets Permission to Deploy 7,500 Next-Generation
Starlink Satellites. Space.com. 2 December 2022. www.space.com/spacex-fcc-
approval-7500-starlink-satellites
3 Mike Schuler, 2023. Viasat Completes Acquisition of Inmarsat. gCaptain. 31
May 2023. https://gcaptain.com/viasat-completes-acquisition-of-inmarsat/
4 Marcus Hand. Are Hybrid VSAT and LEO Networks the Next Maritime
Communications Solution? Seatrade Maritime News. 02 June 2023. www.seatr
ade-maritime.com/technology/are-hybrid-vsat-and-leo-networks-next-marit
ime-communications-solution?
5 US Coast Guard, 2008. Coast Guard Announces Successful Launch of
Nationwide AIS Satellite. Coast Guard News. 23 January 2002, https://coa
stguardnews.com/coast-guard-announces-successful-launch-of-nationwide-ais-
satellite/
6 Daniel O’Donohue. How Cargo Ships and Marine Vessels Are Tracked.
Mapscaping. 9 September 2022. https://mapscaping.com/tracking-ships-and-
vessels/
7 Rolls-Royce, 2017. Rolls Royce Demonstrates Worlds First Remotely Operated
Commercial Vessel. Press Release. 20 June 2017. www.rolls-royce.com/media/
press-releases/2017/20-06-2017-rr-demonstrates-worlds-first-remotely-opera
ted-commercial-vessel.aspx
8 Wärtsilä, 2017. Successful Test of Remote Control Ship Operating Capability.
Wärtsilä Marine Solutions. Wärtsilä Electrical and Automation. September
2017. www.wartsila.com/docs/default-source/marine-documents/business-
white-papers/white-paper-o-ea-remote-control-ship.pdf?sfvrsn=16669e45_6
9 Mayflower 2022. It’s Time for the Mayflower Autonomous Ship. www.mas
400.com/technology
10 Iridium, 2021. The Mayflower Autonomous Ship Transforms Ocean Science
with the Help of Iridium. Iridium Communications Inc, Press release, 27 May
2021. https://investor.iridium.com/2021-05-27-The-Mayflower-Autonomous-
Ship-Transforms-Ocean-Science-with-the-Help-of-Iridium-R
11 US Coast Guard, 2023. VHF International Transmitting Frequencies. US Coast
Guard Navigation Center. Department of Homeland Security. www.navcen.
uscg.gov/international-vhf-marine-radio-channels-freq
12 DSC, 2023. United States Coast Guard, Navigation Center. U.S. Department of
Homeland Security. Digital Selective Calling. https://navcen.uscg.gov/digital-selective-calling
13 Rescue 21. U.S. Coast Guard, Acquisitions Directorate. Homeland Security.
www.dcms.uscg.mil/Our-Organization/Assistant-Commandant-for-Acquisiti
ons-CG-9/Programs/C4ISR-Programs/Rescue-21/
14 EMSA, 2023. Enhanced Maritime Picture via Integrated Maritime Services.
European Maritime Safety Agency. www.emsa.europa.eu/we-do/digitalisation/
maritime-monitoring.html
15 Electronics Desk, 2023. Duct Propagation. https://electronicsdesk.com/duct-
propagation.html
16 Worldwide Radiofacsimile Broadcast Schedules, National Oceanic and Atmospheric Administration (NOAA), Bethesda, MD, USA. 21 October 2022.
17 CHU. National Research Council. https://nrc.canada.ca/en/certifications-eval
uations-standards/canadas-official-time/nrc-shortwave-station-broadcasts-chu
18 BPM. Chinese Academy of Sciences. www.sigidwiki.com/wiki/BPM
19 DCF. Physikalisch-Technische Bundesanstalt (PTB), www.ptb.de/cms/en/ptb/
fachabteilungen/abt4/fb-44/ag-442/dissemination-of-legal-time/dcf77/
Chapter 8
Intraship and Internal Communications
The previous chapter discussed how sensor and other data may be com-
municated beyond the confines of the vessel itself to other ships and shore
facilities using satellite, radio, visual and other methods. However, internal
communication of sensor data representing functions associated with engin-
eering, navigation, cargo, and human activity comprise a very large subset
of overall vessel communications. Security and crowd control technologies
found worldwide at military installations, financial institutions, government
facilities and even amusement parks have also been adapted to these same applications on ships. This is accomplished using a combination of
dedicated circuits and common data bus architectures utilizing both wired
and wireless transmission modes tailored to the specific needs of the sensors
communicating their measurements and observations.
The independent and self-contained environment found onboard surface
ships and vehicles provides a unique combination of data communication
requirements not found elsewhere. These requirements are supported using
protocols, including Ethernet, developed to support high-speed data com-
munications between computers and peripheral devices such as disk drives
and printers. These protocols are general in nature and have been adapted
to accommodate the needs of sensors and sensor systems. Others, such as
the Controller Area Network (CAN) and their derivatives that include the
National Marine Electronics Association (NMEA) 0183 and 2000 data bus
architectures, are tailored to support the maritime-specific needs of engin-
eering and navigation sensors as they evolved to accommodate expanding
requirements associated with vessel automation and digitalization.
Data bus technologies developed for the operation and automation of industrial control systems, including RS-485, form the basis for monitoring many different types of bulk, liquid and gas cargo found onboard vessels of all sizes, types and kinds. USB (Universal Serial Bus) and Bluetooth
wireless technology provide access to individual network system devices that
can be a gateway to the entire ship’s network infrastructure. Wireless Fidelity
(Wi-Fi) technology enables wireless access to the Internet and local network
resources by enabling devices to connect to a local area network (LAN).
DOI: 10.1201/9781003377900-8
• Data Quality
• Data Rate
• Latency
• Bandwidth
• Security
• Flexibility
In addition, the physical characteristics of the data bus can also affect its
performance. For example, the length of the data bus and the distance data
must travel can affect latency, and the type of cable used can affect bandwidth
characteristics. Specific factors most important for a particular application
depend on the inherent requirements associated with the sensors attached
to the bus, the nature of sensor data communicated therein, the length of
the communications channel and other aspects of the application. As a rule, a sensor data bus featuring a low error rate, high data rate, low latency, high bandwidth and strong security will outperform a data bus with lesser characteristics. Note that these factors are associated with the communication channel itself and are considered independent of the sensor performance characteristics identified in Chapter 2 (2.3), including linearity, range, accuracy, sensitivity, etc.
Video sensor data, for example, is often updated 20 or more times per second, with each update producing
large amounts of data that must be communicated at a high speed to main-
tain flow and continuity to the viewer or real-time image analysis processors
when capturing fast-changing phenomena. Each individual sensor has speci-
fied resolution and accuracy requirements that must be supported by avail-
able bandwidth and network capacity to avoid data congestion and potential
data loss. Another aspect of this issue is the capability of the network to store
and process the incoming data, where higher data rates may require larger
storage capacities and efficient data management strategies. These issues
must also be considered within the framework of existing regulatory require-
ments and standards that dictate specific data rates to ensure interoperability
exists between instruments of different manufacturers.
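For a sense of scale, the back-of-the-envelope calculation below estimates the raw, uncompressed data rate of a single video sensor; the resolution and pixel depth chosen are illustrative assumptions rather than values for any particular shipboard camera.

    # Illustrative estimate of the raw (uncompressed) data rate of one video sensor.
    width, height = 1920, 1080       # assumed frame resolution in pixels
    frames_per_second = 20           # update rate noted in the text
    bits_per_pixel = 24              # assumed 8-bit red, green and blue channels

    bits_per_second = width * height * bits_per_pixel * frames_per_second
    print(f"Raw data rate: {bits_per_second / 1e6:.0f} Mbps")   # roughly 995 Mbps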
8.2.3 Bandwidth
Whereas data rate is the speed at which a sensor produces data to be sent
across a communication channel, bandwidth is the maximum amount of data
that can be transferred over the data bus at a given time. Bandwidth is gen-
erally measured in terms of kilobits per second (kbps), megabits per second
(Mbps) or gigabits per second (Gbps). A higher bandwidth allows more data
to be transferred simultaneously, which can improve system scalability. It is critical that data bus bandwidth comfortably exceeds the maximum amount of data that may be produced by the sensors integrated into the system, so the bus does not become overwhelmed with traffic that degrades sensor system performance.
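A simple capacity check follows from the same reasoning: sum the peak data rates of every sensor sharing the bus and confirm the total remains well below the available bandwidth. The sensor mix and the 50 percent headroom margin in this sketch are assumptions chosen purely for illustration.

    # Minimal sketch: confirm aggregate peak sensor traffic leaves headroom on the bus.
    sensor_peak_rates_kbps = {        # illustrative sensor loads
        "GNSS receiver": 5,
        "engine monitoring node": 50,
        "tank level gauges": 10,
        "compressed IP camera": 4000,
    }
    bus_bandwidth_kbps = 100_000      # e.g., a 100 Mbps Ethernet segment
    headroom_factor = 0.5             # use no more than half the bandwidth

    total_kbps = sum(sensor_peak_rates_kbps.values())
    within_margin = total_kbps <= bus_bandwidth_kbps * headroom_factor
    print(f"Aggregate peak load: {total_kbps} kbps; within margin: {within_margin}")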
8.2.4 Latency
Latency is the delay in time between when sensor data is generated and when
it is delivered to the end user, be it a human being or another electronic
system whose purpose is to process the data. Lower latency allows data to
be transferred more quickly, which can improve the responsiveness of the
system. Adequate bandwidth capacity to handle anticipated sensor data rate
requirements is necessary to ensure network latency is minimized.
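Latency can be monitored directly if each sensor message carries the time at which it was generated; the sketch below compares that timestamp with the arrival time at the consumer and assumes both ends share a synchronized clock (for example via NTP).

    import time

    def message_latency_ms(generated_at: float) -> float:
        """Latency of one sensor message given its generation timestamp
        (seconds since the epoch), assuming synchronized clocks."""
        return (time.time() - generated_at) * 1000.0

    # Example: a reading stamped 15 milliseconds before it arrived.
    stamp = time.time() - 0.015
    print(f"Observed latency: {message_latency_ms(stamp):.1f} ms")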
8.2.5 Security
Data bus security is paramount to protect sensor data from unauthorized
access and to prevent sensor system hacking and hijacking. A secure data
bus uses encryption, authentication, access control, firewalls, intrusion detec-
tion and prevention systems, secure protocols, and other security measures
to protect the data and the sensors that generate the data. Robust security
measures require a comprehensive approach that considers all aspects of the sensor data bus application, from physical security to network protocols and data handling practices.
8.2.6 Flexibility
The flexibility of a data bus is a measure of how readily it may be adapted to different applications and configured to meet their specific requirements. Some data bus architectures have been designed and optimized for specific types of sensor data. For example, the CAN bus formed the
basis for the creation of the NMEA-2000 bus, which is designed specifically
for maritime use by electronics associated with navigation and engineering
sensors. A similar adaptation known as J1939 was also created by the Society
of Automotive Engineers (SAE) for commercial vehicle use.6 CAN proved to
be quite flexible in its design but didn’t meet the specific needs of either the
maritime or automotive communities and was adapted to each application.
• Ethernet
• CAN
• NMEA-0183
• NMEA-2000
• RS-485
• USB Port
• Bluetooth
• Wi-Fi
The characteristics of each standard are summarized below:

Ethernet (100BaseTX / 1000BaseT / 10GBaseT) – Data rate: 100 Mbps / 1000 Mbps / 10 Gbps; Data format: Binary; Protocol: IEEE 802.3; Compatibility: Widespread; Topology: Several; Fault tolerance: Moderate; Security: High.
CAN – Data rate: 1 Mbps; Data format: Binary; Protocol: Message; Compatibility: Moderate; Topology: Bus; Fault tolerance: High; Security: Moderate.
NMEA 0183 – Data rate: 9.8 kbps; Data format: ASCII; Protocol: Sentence; Compatibility: Moderate; Topology: Point-to-point; Fault tolerance: Low; Security: Low.
NMEA 2000 – Data rate: 250 kbps; Data format: Binary; Protocol: Message; Compatibility: Widespread; Topology: Star; Fault tolerance: Low; Security: High.
RS-485 – Data rate: up to 10 Mbps; Data format: Various; Protocol: Various; Compatibility: Widespread; Topology: Several; Fault tolerance: High; Security: None.
USB 3.1 – Data rate: 10 Gbps; Data format: Serial; Protocol: Master/Slave; Compatibility: Widespread; Topology: Low; Fault tolerance: Moderate; Security: Low.
Bluetooth 5.x – Data rate: 2 Mbps; Data format: Packet; Protocol: Master/Slave; Compatibility: Widespread; Topology: Star; Fault tolerance: Moderate; Security: Moderate.
Wi-Fi – Data rate: 11–3,000 Mbps; Data format: Binary packet; Protocol: IEEE 802.11; Compatibility: Widespread; Topology: Star; Fault tolerance: Moderate; Security: Moderate.
The USB port provides a standardized wired connection between peripherals, storage media and computers or other host devices.7 On board ships it is generally used for updating navigation charts and maintenance manuals, and for servicing many types of electronic equipment.
Bluetooth is a wireless communication technology that allows electronic
devices to exchange data over short distances without the need for physical
connections. It is used in much the same manner as USB, except that it does
not have to be physically connected for access to be achieved to a Bluetooth
device.
Wireless Fidelity (Wi-Fi) is a wireless communication technology that
allows devices to connect to a LAN and the Internet without the need for
wired connections. It is based on the IEEE 802.11 family of standards, which
defines the specifications for wireless networking.
Many standards also fall under the purview of the International
Electrotechnical Commission (IEC), which develops and publishes inter-
national standards for electrical, electronic and related technologies,
including:
• IEC 60945: General requirements and methods of testing for maritime navigation and radiocommunication equipment.
• IEC 61158: A series of related standards defining fieldbus protocols for digital communication in industrial automation, several of which use RS-485 as their physical layer.
• IEC 61162 (Maritime Navigation and Radiocommunication Equipment
and Systems): A series of standards that specify the data communication
protocols used for shipboard sensors and systems.
• IEC 61162- 1: General requirements for data acquisition systems for
marine applications covering the physical layer, the data link layer and
the application layer for NMEA-0183.
• IEC 61162-2: Requirements similar to IEC 61162-1 but for high-speed (38.4 kbaud) transmission; Ethernet-based shipboard interconnection is addressed by IEC 61162-450.
• IEC 61162- 3: Requirements for data acquisition systems for marine
applications that use the CAN bus protocol and the NMEA-2000 imple-
mentation of this standard.
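To make the NMEA-0183 sentence format referenced by IEC 61162-1 concrete, the sketch below builds and validates the checksum of a typical ASCII sentence; the sample depth-sounder sentence and its field values are illustrative only.

    def nmea0183_checksum(body: str) -> str:
        """XOR of every character between '$' and '*', as two hex digits."""
        value = 0
        for ch in body:
            value ^= ord(ch)
        return f"{value:02X}"

    def checksum_ok(sentence: str) -> bool:
        body, _, given = sentence.lstrip("$").partition("*")
        return nmea0183_checksum(body) == given.strip().upper()

    # Illustrative depth-below-transducer sentence (values are made up).
    body = "SDDBT,15.2,f,4.6,M,2.5,F"
    sentence = f"${body}*{nmea0183_checksum(body)}"
    print(sentence, checksum_ok(sentence))   # checksum validates as True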
8.5 Ethernet
Ethernet is a family of wired computer networking technology and protocols
commonly used on ships for connecting devices on a LAN. First introduced
in 1980 and standardized in 1983 as IEEE 802.3 and now also referred to
as ISO/IEC 8802-3, it uses physical media that includes twisted-pair copper
wiring, coaxial cable and fiber optic cable. The data rate of Ethernet varies
depending on the type of physical media and the version of the standard.
Older versions of the standard that are no longer used in modern networks
include 10Base-T and 10Base-F, which are limited to speeds up to 10 Mbps.
Relevant contemporary standards include 100Base-TX, 1000Base-T and 10GBase-T.8
Ethernet hubs were used to connect multiple Ethernet devices in a LAN, but
these have been largely replaced by Ethernet switches due to their superior
performance and features. Ethernet hubs operate at the physical layer (Layer
1) of seven layers described in the Open Systems Interconnection (OSI) model
computer systems use to communicate over a network.9 Ethernet switches
operate at the physical layer (Layer 1) and the data link layer (Layer 2) to pro-
vide intelligent data forwarding and dedicated bandwidth to each connected
device, making them far more efficient and scalable than older Ethernet hubs.
Routers are also used to connect multiple networks and direct data traffic between them. A router operates at the network layer (Layer 3) of the
OSI model and plays a critical role in facilitating communication between
devices and networks in complex network environments. Some Ethernet
switches offer Power over Ethernet (PoE) capabilities, providing power to
connected devices and sensors such as IP cameras and wireless access points,
eliminating the need for separate power supplies.
The OSI and Transmission Control Protocol/Internet Protocols (TCP/IP)
are two different but related networking models associated with the Ethernet
protocol. The OSI model is a conceptual framework that standardizes how
different networking protocols should interact and communicate with each
other, while TCP/IP is a specific suite of protocols used to interconnect network devices on the Internet.10 While serving different purposes, neither the OSI nor the TCP/IP model has entirely replaced the other. TCP/IP has become the dominant suite of protocols used for Internet communication, while the
OSI model remains a basic reference model for understanding networking
concepts and protocol interactions.
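As a small illustration of TCP/IP carrying sensor data across a shipboard Ethernet LAN, the sketch below opens a TCP connection and sends one line-oriented reading to a logging host; the host address, port and message format are assumptions made for illustration.

    import socket

    def send_reading(host: str, port: int, reading: str) -> None:
        """Open a TCP connection, send one newline-terminated sensor reading
        and close the connection. Host and port are illustrative assumptions."""
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.sendall((reading + "\n").encode("ascii"))

    # Example (hypothetical logging host on the ship's LAN):
    # send_reading("192.168.1.50", 5050, "ME1_COOLANT_TEMP_C=78.4")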
The Controller Area Network (CAN) bus is a serial communication protocol that supports distributed real-time control and multiplexing for use
within vehicles and other applications. The standard defines the data format
and protocol that are used to transmit data over the bus and specifies the
Classical CAN frame format and the newly introduced CAN Flexible Data
Rate Frame format. The Classical CAN frame format allows bit rates up
to 1 Mbit/s and payloads up to 8 bytes per frame. The Flexible Data Rate
frame format allows bit rates higher than 1 Mbit/s and payloads longer than
8 bytes per frame. Also described is the general architecture of CAN in terms of hierarchical layers following the ISO reference model for open systems interconnection (OSI) defined in ISO/IEC 7498-1.11
In the restructured ISO 11898 series most applicable to maritime
applications:
• Part 1 defines the data link layer, including the logical link control (LLC)
sub-layer and the medium access control (MAC) sub-layer, as well as the
physical signaling (PHS) sub-layer.
• Part 2 defines the high-speed physical medium attachment (HS-PMA).
• Part 3 defines the low-speed fault-tolerant PMA.
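To make the Classical CAN framing constraints concrete, the sketch below packs two engineering readings into a single 8-byte payload; the scaling and field layout are illustrative assumptions rather than any published parameter group definition.

    import struct

    MAX_CLASSICAL_CAN_PAYLOAD = 8    # bytes per Classical CAN frame

    def pack_engine_frame(rpm: float, coolant_temp_c: float) -> bytes:
        """Pack engine speed and coolant temperature into one 8-byte payload.

        Illustrative layout: rpm as an unsigned 16-bit value in 0.25 rpm steps,
        temperature as an unsigned 8-bit value with a -40 degC offset, and five
        reserved (padding) bytes.
        """
        raw_rpm = int(rpm / 0.25)
        raw_temp = int(coolant_temp_c + 40)
        payload = struct.pack(">HB5x", raw_rpm, raw_temp)
        assert len(payload) <= MAX_CLASSICAL_CAN_PAYLOAD
        return payload

    frame_data = pack_engine_frame(rpm=1450.0, coolant_temp_c=82.0)
    print(frame_data.hex())   # eight bytes: 16a87a0000000000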
8.9 RS-485
RS-485 is an industrial specification that defines the electrical interface and physical layer for multipoint communication of electrical devices over long cabling distances in electrically noisy environments.16 It is used onboard ships as a means to connect many different types of sensors for cargo and ship infrastructure monitoring. A serial communication standard that defines the electrical characteristics of a balanced differential voltage interface, it can support up to 32 devices connected to the same ISO 8482-compliant twisted-pair cable segment.
8.11 Bluetooth
Bluetooth is a wireless technology that allows devices to communicate with
each other over short distances.18 It performs many of the same functions as a USB port and also provides the capability to connect sensors on ships. It is also susceptible to the same risks as USB in providing an avenue for attacks on ship computer systems and networks through hacking and cyberattacks. It operates in the unlicensed 2.4 GHz radio frequency band allocated for industrial, scientific and medical (ISM) applications.
References
1 MSC 67/22/Add.1, ANNEX 17. Resolution MSC.64(67), 4 December 1996.
Add 1. ANNEX 1. Recommendation on Performance Standards for Integrated
Bridge Systems (IBS).
2 SOLAS chapter V adopted in December 2000.
3 Joint Committee for Guides in Metrology (JCGM). Evaluation of Measurement Data – Guide to the Expression of Uncertainty in Measurement (GUM 2008), 2008.
4 H.Y. Teh, A.W. Kempa-Liehr, and K. I.-K. Wang. (2020) Sensor data
quality: A systematic review. Journal of Big Data 7, 11. https://doi.org/10.1186/
s40537-020-0285-1
5 Data Rate. PCMag Encyclopedia. PC Magazine. www.pcmag.com/encyclope
dia/term/data-rate
6 SAE J1939 – Recommended Practice for a Serial Control and Communications Vehicle Network. J1939 Digital Annex J1939DA_202305. Society of
Automotive Engineers (SAE). 10 May 2023. www.sae.org/standards/content/
j1939da_202305/
7 USB4® Specification V2.0. USB Implementers Forum, Inc. (including trade-
mark). 30 June 2023. www.usb.org/document-library/usb4r-specification-v20
8 Ethernet Standards and Protocols Explained. Computer Networking Notes,
2023. www.computernetworkingnotes.com/networking-tutorials/ethernet-
standards-and-protocols-explained.html
9 Keith Shaw. The OSI Model Explained and How to Easily Remember Its 7
Layers. 14 March 2022, Network World. www.networkworld.com/article/
3239677/the-osi-model-explained-and-how-to-easily-remember-its-7-layers.html
10 Mary E. Shacklett. What Is TCP/IP? TechTarget, Networking. July 2021. www.
techtarget.com/searchnetworking/definition/TCP-IP
Chapter 9
Cybersecurity and Ship Sensors
DOI: 10.1201/9781003377900-9
• VDRs are targets for attack as they record data from multiple sensors,
including speed, direction, position and radar images in addition to audio
recordings and other data that are indispensable to help identify the cause
of maritime incidents. One incident was reported in India where VDR
data from a cargo ship that hit a fishing vessel was erased after an infected
USB stick was inserted to retrieve the data.17 An analysis of another VDR
cited in the same source also detected weak encryption, an insecure mech-
anism for user authentication and various service vulnerabilities.18 This
could lead to modifying data to, for example, delete audio conversations
from the bridge, delete radar images, or alter speed or position readings,
and even spy on a vessel’s crew as VDRs are directly connected to micro-
phones located on the bridge.19
• Video cameras are appearing on many ships as an essential means to
monitor engine rooms, cargo holds, closed spaces and other locations that
are generally unoccupied and sensitive areas where security is an issue.
Many of these cameras are connected to networks and are IoT devices
containing vulnerabilities that could let attackers watch live camera
feeds, create botnets, or use hacked devices as a stepping stone to further
attacks.20 Another problem is that many cameras used on tankers, grain
carriers and other potentially hazardous surroundings do not meet safety
requirements for use in explosive environments.21 Such characteristics can
be exploited to cause abnormal operations that may overheat the camera
and cause an explosion.
• Satellite communications have traditionally been a high-value target for
attack. Weak encryption and old IT equipment pose key vulnerabilities.22
Also, if an individual IoT device is not encrypted or every stage of data
communication is not properly protected, an entire network of connected
devices can be manipulated. Several incidents identified in this paragraph
originated through the interception and manipulation of satellite commu-
nications. Key components to satellite security are the ground networks
that move data and links that transmit and receive satellite data through
which back door assaults can take place.23 Satellites are not useful if their
access and performance are not assured. This has been a significant tac-
tical target for both sides fighting the war in Ukraine.
• Maritime Very Small Aperture Terminals (VSATs) used for two-way real-
time data communication via satellites present cyber threats to vessels due
to the value of the data they transmit and their role as attack vectors by
providing access to vessel IT and OT infrastructure.24 Cyberattacks on vessels can be executed by discovering and exploiting vulnerabilities sequentially: first penetrating telecom equipment and taking control of the IT segment, then finding weaknesses in segmentation to access the OT layer and control OT equipment, and finally intercepting and modifying Transmission Control Protocol (TCP) sessions to enable man-in-the-middle and denial-of-service attacks against vessels at sea.
port machinery such as gantries, cranes and bridges. Another major con-
tributor to the problem is access to the ship by technicians, maintainers and
port employees. Nefarious interference with port operations, systems, appar-
atus and devices comprising port infrastructure can result in death or injury
to port employees, ships’ crews and passengers; destruction of property and
cargo, loss or compromise of sensitive data, extended cargo delivery sched-
ules and/or environmental hazards. This is aggravated by attack strategies that can evade the present generation of less-than-real-time capabilities for detecting disruption to landside and shipboard facilities. The problem
extends to port state control as implemented through port control systems
(PCS), cargo control systems (CCS), and related means to connect multiple
systems across different government, public and private organizations for the
secure and intelligent exchange of information.
Cybersecurity must not only focus on ship infrastructure, but also on port
of origin and destination segments of the entire shipping process. Figure 9.1
illustrates these segments and the additional shoreside components and pro-
cesses that can be vulnerable to cyberattacks through physical and electronic
methods.
This cybersecurity environment is very complex and consists of mul-
tiple segments related to logistics, engineering, security and management
elements representative of cargo transported, compliance, safety and other
functions. This also includes physical port and vessel facilities in terms of
cyberinfrastructure and sensor-based systems through which maintenance is
performed, security is sustained along with many other operations, and upon
which ships rely to safely and efficiently transport cargo between ports. By focusing specifically on the ports and ships themselves, rather than the entire shipper-to-consignee process including handling, haulage and warehousing, the problem is reduced to a more manageable size and complexity.
The physical interfaces between the port and the ship include wire, fiber-
optic cabling and hoses to provide electrical power, communications and
utility services in addition to gangways and ladders across which are trans-
ported physical sensors, new and replacement parts and systems, and various
media to update network and sensor systems by technicians and maintainers.
Figure 9.1 Port and Ship Infrastructure Segments of the Entire Shipping Process.
This also includes any devices and equipment brought onboard by crewmem-
bers and passengers, and devices that may be embedded into cargo and/or
their containers.
Digital interfaces while the ship is in and close to the port include Wide Area Network (WAN) and wireless communications provided by the port itself and, in its immediate vicinity, by the local community. In addition, all shipboard capabilities in terms of satellite, radio and terrestrial cellular communications continue to be available. Significant changes occur in both the port
and ship cybersecurity environment when shoreside connections are slipped
and the ship leaves port, and when a ship arrives in port and shoreside con-
nections are again established. Once underway, the only remaining shipboard connectivity should be satellite, radio and terrestrial cellular communications, along with cellular devices in the personal possession of crewmembers and passengers. However, there are exceptions.
While onboard sensors are operational, they can be subject to a wide
variety of external stimuli that provide unconventional and unintended,
two-way access to the ship’s digital infrastructure. These sensors include
cameras, weather instruments and many other devices accessible via light,
laser, acoustic, microwave, radio frequency (RF) sources and other means
of stimulus that may contain data, information and intelligence embedded
within their transmissions. All such means of access must be considered as
part of the infrastructure.
Under the Zero Trust model, the protect surface is far smaller than the attack surface and it is always knowable. Firewalls and other controls are moved as close as possible to the protect surface rather than the perimeter at the attack
surface, where it is decidedly further away from what needs to be protected.
This is much like identifying the protect surface as a person’s lungs to place a
“firewall” to battle lung cancer rather than defining the attack surface as the
skin at the body’s perimeter to keep the cancer from entering the body itself
(probably a bad analogy, but I am not a medical doctor so I will use this as
my excuse!). In this way, it is possible to determine what traffic moves in and
out by a very small number of users or resources that actually need access to
sensitive data or assets. As with the attack surface, organizations must con-
stantly monitor their protect surface to identify and block potential threats
as quickly as possible. However, in theory, the smaller footprint makes this
process more manageable.
Actual methods to model the protect surface are still being developed.
With implementation of distinct DoD Zero Trust capabilities and activities
anticipated by 2027, there remains little time to determine how this will actu-
ally be accomplished.
passengers and cargo from port to port wholly within the confines of the hull,
deck and superstructure. However, unique to the ship architecture are com-
puter systems to support sensor activity that should be segmented to isolate
these functions from general IT operations.
Activities are undertaken through which data, information, algorithms
and software updates pass to and from company IT offices, ports, govern-
ment offices, regulatory agencies and the ship. These activities include freight
and customs, maintenance data and diagnostics, navigation information,
ship sensor data, software and software patches, middleware, and firmware.
ship and its mission(s). This is discussed in the paragraph for Port Digital
Infrastructure.
However, ship OT digital infrastructure differs greatly from those of ports
in terms of functionality and data content unique to support the needs of
physical components serving engineering, deck, passenger, cargo and other
relevant operations. This varies widely depending on the type of ship involved
and its function.
infrastructure also extends to the cargo containers outfitted with smart sen-
sors monitoring their contents and movement of the containers themselves,
and to smart sensors placed within and monitoring the conditions of bulk
cargos, including programmable logic controllers (PLCs) and arrays (PLAs)
integral to remote control and smart sensor designs. Also considered are the
systems used to process the cargo, passengers and crews, maintenance, data
acquisition, processing and analytics across various commercial organiza-
tions and governmental agencies performing both direct and supporting roles
within the port and onboard ships.
Their configurations may be characterized in terms of physical assets, the
functions they perform, how they are interconnected, the means by which
they are accessed and other pertinent details. This includes noting specific
details by equipment types and models, capabilities, accessible ports and the
means used for access and update, including human access using physical
(laptop, mobile devices, USB stick) and electronic (LAN, WAN, Bluetooth,
etc.) methods.
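One lightweight way to capture such a characterization is a structured inventory record for each device; the fields and the sample entry below are illustrative assumptions rather than a prescribed schema.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ShipboardAsset:
        """Illustrative inventory record for one networked shipboard device."""
        name: str
        equipment_type: str            # e.g., PLC, smart sensor, camera
        model: str
        functions: List[str]
        connections: List[str]         # buses and networks it is attached to
        access_methods: List[str]      # physical and electronic access paths
        accessible_ports: List[str] = field(default_factory=list)

    asset = ShipboardAsset(
        name="Cargo hold 3 temperature sensor",
        equipment_type="smart sensor",
        model="THS-200 (hypothetical)",
        functions=["bulk cargo temperature monitoring"],
        connections=["RS-485 cargo monitoring segment"],
        access_methods=["USB service port", "Bluetooth commissioning app"],
    )
    print(asset.name, "->", ", ".join(asset.access_methods))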
International Association of Ports and Harbors (IAPH) – In 2021, IAPH pub-
lished its Cybersecurity Guidelines for Ports and Port Facilities to provide the
international port industry with a set of cybersecurity guidelines based on
successes achieved by ports and port facilities from around the world.42 They
are designed to assist executives in the port industry to foster greater collab-
oration within their organizations, as well as more broadly with their local,
regional, national and international partners and stakeholders.
United States Coast Guard – The U.S. Coast Guard Cyber Command’s (CGCYBER) Maritime Cyber Readiness Branch (MCRB) supports the cybersecurity mission in
the commercial maritime transportation community.43 They provide many
resources that can assist in developing, maintaining and updating strategic
planning and cyber policy; assessing threats, vulnerabilities and impact of
loss to the Maritime Transportation System (MTS), and sustaining strong partnerships with key MTS stakeholders to develop insight into future MTS cyber risks.
National Science and Technology Council (NSTC) – The NSTC helps ensure that Federal science and technology decisions and programs are consistent with the President’s stated goals. The NSTC prepares research and development strategies that are coordinated across Federal agencies aimed at accomplishing multiple national goals.
The 2019 Federal Cybersecurity Research and Development Strategic Plan
aims to coordinate and guide U.S. Federally funded R&D in cybersecurity,
including the development of consensus-based standards and best practices.45
The Plan identifies four interrelated defensive capabilities (deter, protect,
detect and respond) and six priority areas for cybersecurity R&D (artificial
intelligence, quantum information science, trustworthy distributed digital
infrastructure, privacy, secure hardware and software, and education and
workforce development) as the focusing structure for Federal cybersecurity
R&D activities and investments to benefit the Nation.
Signed into law in December 2022, the Quantum Computing Cybersecurity
Preparedness Act represents major legislation in the area of cybersecurity that
encourages the U.S. Federal government to adopt technology that is protected
from decryption by quantum computing.46 This act addresses the migration
of executive agency information technology systems to post-quantum crypt-
ography, where encryption is strong enough to resist attacks from quantum
computers developed in the future. After the National Institute of Standards and Technology (NIST) has issued post-quantum cryptography standards,
guidance will be issued requiring each executive agency to develop a plan to
migrate the information technology of the agency to post-quantum cryptog-
raphy. Also included will be a strategy to address the risk that IT systems relying on weakened encryption could be breached by a future quantum computer, together with the development of standards for post-quantum cryptography. Further information on quantum
sensing is described in greater detail in Chapter 11, Next Generation Sensing.
References
1 Vanguard. Maritime Cyber Attacks Increase by 900% in Three Years. 29 July
2020. Vanguard Media Limited, Nigeria. www.vanguardngr.com/2020/07/
maritime-cyber-attacks-increase-by-900-in-three-years/
2 MarineLink. Cyber Attacks on the Rise at US Ports and Terminals. 5 October
2022. www.marinelink.com/news/cyber-attacks-rise-us-ports-terminals-499964
3 Jones Walker. 2022 Ports and Terminals Cybersecurity Survey. Jones Walker
LLP. www.joneswalker.com
4 Alejandro Mayorkas, Secretary of the U.S. Department of Homeland Security.
Testimony before the U.S. Senate Committee Hearing Channels on Threats to
the Homeland. 17 November 2022.
5 David Kushner. The real story of stuxnet. IEEE Spectrum. 26 February 2013.
https://spectrum.ieee.org/the-real-story-of-stuxnet
6 Deborah Haynes. Iran’s Secret Cyber Files. Sky News. 27 July 2021. https://
news.sky.com/story/irans-secret-cyber-files-on-how-cargo-ships-and-petrol-
stations-could-be-attacked-12364871
7 Kuo-Hui Yeh, Wenli Shang, Tianyu Gong, Chunyu Chen, Jing Hou, and Peng
Zeng. (2019). Information security risk assessment method for ship control
system based on fuzzy sets and attack trees. Security and Communication
Networks, Article ID 3574675. https://doi.org/10.1155/2019/3574675
8 G. Kavallieratos, S. Katsikas, and V. Gkioulos. (2019). Cyber-attacks against
the autonomous ship. In: Katsikas, S., et al. Computer Security. SECPRE
CyberICPS 2018. Lecture Notes in Computer Science, vol. 11387. Springer,
Cham. https://doi.org/10.1007/978-3-030-12786-2_2
9 World Maritime News. Nightmare Scenario: Ship Critical Systems Easy Target
for Hackers, December 2017. https://worldmaritimenews.com/archives/238
869/nightmare-scenario-ship-critical-systems-easy-target-for-hackers
10 B. Svilicic, J. Kamahara, M. Rooks, and T. Yano. (2019). Maritime cyber risk management: An experimental ship assessment. Journal of Navigation, 72(5), 1108–1120.
11 Offshore Energy. Nightmare Scenario: Ship Critical Systems Easy Target for
Hackers. 21 December 2017. www.offshore-energy.biz/nightmare-scenario-
ship-critical-systems-easy-target-for-hackers/
12 Bartlomiej Hyra. Analyzing the Attack Surface of Ships. M.Sc. Thesis, Master of Science in Engineering. DTU Compute, Department of Applied Mathematics and Computer Science, Technical University of Denmark, Kongens Lyngby (2019). https://backend.orbit.dtu.dk/ws/portalfiles/portal/174011206/190401_Analyzing_the_Attack_Surface_of_Ships.pdf
13 University of Texas. UT Austin Researchers Successfully Spoof an $80 million
Yacht at Sea. 29 July 2013. https://news.utexas.edu/2013/07/29/ut-austin-rese
archers-successfully-spoof-an-80-million-yacht-at-sea/
14 Defense Aviation. Iran Displays Captured US RQ-170 Sentinel, 2011. www.
defenceaviation.com/iran-displays-captured-us-rq-170-sentinel/
15 Corey D. Ranslem. Secure at Sea: AIS Fraught with Vulnerabilities. 10 January
2020. www.the-triton.com/2020/01/secure-at-sea-ais-fraught-with-vulnerab
ilities/
16 BBC News. Warship Positions Faked Including UK Aircraft Carrier. 2 August
2021. www.bbc.com/news/technology-58027363
17 Eduard Kovacs. Ship Data Recorders Vulnerable to Hacker Attacks. Security
Week. 11 December 2015. www.securityweek.com/ship-data-recorders-vulnera
ble-hacker-attacks/
18 Sean Gallagher. Hacked at Sea: Researchers Find Ships’ Data Recorders
Vulnerable to Attack. ARS Technica, 10 December 2015. https://arstechnica.
com/information-technology/2015/12/hacked-at-sea-researchers-find-ships-
data-recorders-vulnerable-to-attack/
19 Ruben Santamarta. Maritime Security: Hacking into a Voyage Data Recorder
(VDR). IOActive. 9 December 2015. https://ioactive.com/maritime-security-
hacking-into-a-voyage-data-recorder-vdr/
20 Danny Palmer. Critical IoT Security Camera Vulnerability Allows Attackers to
Remotely Watch Live Video –and Gain Access to Networks. 17 August 2021.
www.zdnet.com/article/critical-iot-security-camera-vulnerability-allows-attack
ers-to-remotely-watch-live-video-and-gain-access-to-networks/
The most reliable means for detecting sensor degradation has traditionally
been well-trained mariners with years of experience who can readily dis-
tinguish between normal operation and anomalies in sensor performance
throughout a wide range of working scenarios. Unfortunately, there is a
shortage of well-trained mariners that is not expected to be resolved soon, which makes this problem more significant.1 As ship automation continues to
increase and sensor systems gain added complexity, the detection of sen-
sor degradation becomes even more difficult. Further complicating matters is
the introduction of unmanned and autonomous vehicles and their need for
enhanced and extended sensor suites and sensor data fusion technology that
allows them to operate without seafarers onboard. Still, the problem of sen-
sor degradation detection remains unresolved.
Unmanned and autonomous vehicles function as passenger, cargo and
other types of vessels included in the partial list cited in the first chapter of
this book. These same technologies continue to be integrated on the bridges
of conventionally staffed vessels to assist seafarers in the performance of
their duties. The volume of imagery, signal and digital data generated by
sensors in recent years has increased by orders of magnitude, advancing the development of groundbreaking hardware and software and leading to new and improved processes that make better use of these resources. The promises of
innovations in system design and function have only begun to be tapped and
include providing new perspectives from which enhanced situational aware-
ness and safety of navigation may be achieved.
Much research has been performed, and entire industries established to
account for and deal with the failures of sensors, the various failure modes to
which they are susceptible and the many methods and techniques that can be
applied to prevent, detect and even predict failure. However, little has been said and even less investigated regarding the insidious degradation of sensors, especially over extended periods of time, which can cause misleading and deceptive indications and mask anomalies. Such occurrences can stymie reliable
and dependable ship operations and reduce overall safety.
nefarious acts by adversaries. Internal sensor system failure modes were con-
sidered where their effects can be ambiguous and hard to distinguish yet are
vital to the development of an effective mitigation and compensation strategy.
Sensor degradation is considered on the basis of the symptoms of its occur-
rence and their effects as may be attributed to changes in signal characteristics
from both external causes and causes internal to the vehicle itself. This is
accomplished through analysis of system failure modes and is based upon findings in the relevant literature as well as experimental results related to failure prognostics,
fault diagnosis and sensor analytics. However, practical aspects of software
design and development, training data set creation and the development of
cases for verification, validation, and testing require the acquisition and use of
very large volumes of data and imagery. These datasets must be obtained in a
relevant operational environment to train machine learning and deep-learning
AI solutions to adequately perform these tasks. Different types of experiments
were performed that provided direct evidence of sensor degradation over
time. Training data sets were created consisting of millions of instances of
discrete signal measurements, characteristics, imagery and metadata for nom-
inal and aberrant sensor performance at increasing degrees of hull fouling,
interference, damage and internal failures. Experiment goals and methods are
specific to each undertaking and described in the paragraphs that follow.
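The sketch below illustrates, in greatly simplified form, the kind of labeled training set and classifier this approach implies: simple Time Domain and Frequency Domain features are extracted from nominal and degraded traces and used to fit a classifier. The synthetic signals, feature choices and scikit-learn model are illustrative assumptions only, not the software developed for these experiments.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS = 10_000          # sample rate, Hz (assumed)
F0 = 200.0           # nominal transducer tone, Hz (assumed)

def synth_signal(degraded):
    """Synthetic stand-in for a recorded transducer trace."""
    t = np.arange(0, 1.0, 1 / FS)
    amp = 0.4 if degraded else 1.0             # degradation modeled as attenuation
    noise = 0.3 if degraded else 0.1           # plus additional noise
    return amp * np.sin(2 * np.pi * F0 * t) + noise * rng.standard_normal(t.size)

def features(x):
    rms = float(np.sqrt(np.mean(x ** 2)))                            # Time Domain feature
    spec = np.abs(np.fft.rfft(x)) / x.size
    freqs = np.fft.rfftfreq(x.size, 1 / FS)
    band = float(spec[(freqs > F0 - 10) & (freqs < F0 + 10)].sum())  # Frequency Domain feature
    return [rms, band]

labels = [0] * 200 + [1] * 200                 # 0 = nominal, 1 = degraded
X = np.array([features(synth_signal(bool(y))) for y in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, np.array(labels), random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```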
10.3.1 Biofouling
Biofouling, the growth of marine plant and animal life on underwater transducers, causes gradual sensor degradation over time by decreasing sensor efficiency. This growth can attenuate Sonar and other sensor signals transmitted from active sensors as well as reflected and passive
signals received by onboard sensors. In addition, hull roughness attributed
to biofouling greatly reduces hull efficiency resulting in reduced speed given
the same engine thrust, greater wear and tear on propulsion and steering
systems, and differences in vehicle handling and control surface response that
can alter variables used to predict INS progressive error. This is particularly
critical for unmanned vehicles expected to be deployed at sea for months at a
time without manned intervention. During such long deployments significant
fouling would be expected to occur. It is necessary to ascertain the effects of
fouling on sensor signals through direct measurement in the Frequency and
Time Domains at the sensor and transducer interface and, where applicable,
determine the symptoms of fouling on Spatial Domain representations of
processed transducer signals.
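As a simple illustration of such a Frequency Domain measurement, the sketch below compares band power at an assumed sonar operating frequency between a clean baseline trace and a fouled one and reports the loss in decibels. The synthetic traces, frequencies and amplitudes are assumptions for illustration only.

```python
import numpy as np

FS, F_OP = 1_000_000, 200_000            # sample rate and sonar operating frequency, Hz (assumed)
t = np.arange(0, 0.01, 1 / FS)

baseline = 1.00 * np.sin(2 * np.pi * F_OP * t)    # clean transducer return
fouled = 0.35 * np.sin(2 * np.pi * F_OP * t)      # attenuated return after fouling

def band_power(x, lo, hi):
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / FS)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

p_base = band_power(baseline, F_OP - 1_000, F_OP + 1_000)
p_foul = band_power(fouled, F_OP - 1_000, F_OP + 1_000)
print(f"attenuation attributable to fouling: {10 * np.log10(p_foul / p_base):.1f} dB")   # about -9.1 dB
```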
10.3.2 Interference
Numerous experiments took place over a span of many months to simulate and effect actual rapid and gradual sensor degradation in a relevant operational environment due to interference from natural and manmade sources, both innate to the environment and resulting from adversarial action. This includes spoofing and denial of service (DoS) attacks on
GPS/GNSS systems themselves, and sensor systems that depend on continu-
ously updated PNT data. Also included in these experiments was the use of
multiple extraneous Radar, Sonar and Lidar signals directed toward normally
operating systems with the intention of jamming, confusing and degrading
these systems outside of normal operating specifications.
Degradation of visual and infrared sensors was also accomplished using
actual environmental conditions known to facilitate degraded performance
that includes various precipitation intensities (light, moderate and heavy)
and types (rain, snow, sleet, hail, etc.). Experiments were performed under
varying visibility from clear and unlimited conditions to zero visibility due
to fog and other causes. The effects of icing, turbulence, condensation, and
other factors, such as a buildup of salt deposits from sea spray that degrade
sensor performance were also considered through natural occurrence and/or
simulation of these conditions. A simulation of GPS/GNSS interference with
INS was performed to eliminate any potential for interference with the navi-
gation of nearby aircraft and vessels.
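One elementary symptom of such interference is a reported position that implies an impossible vessel speed. The sketch below flags GNSS fixes on that basis; the speed threshold and sample track are illustrative assumptions rather than values used in these experiments.

```python
import math

MAX_SPEED_MS = 12.0     # assumed upper bound on achievable vessel speed, m/s (about 23 knots)

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_jumps(fixes):
    """fixes: list of (t_seconds, lat, lon); yields indices of implausible fixes."""
    for i in range(1, len(fixes)):
        t0, la0, lo0 = fixes[i - 1]
        t1, la1, lo1 = fixes[i]
        if haversine_m(la0, lo0, la1, lo1) / max(t1 - t0, 1e-6) > MAX_SPEED_MS:
            yield i

track = [(0, 38.9000, -76.4000), (10, 38.9005, -76.4000),
         (20, 38.9010, -76.4001), (30, 38.9500, -76.5000)]   # final fix jumps roughly 10 km
print(list(flag_jumps(track)))   # -> [3]
```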
10.3.3 Damage
Many ways exist to damage sensors in a manner that falls short of complete
failure and results in degraded capability. For external sensors, these include
misalignment and partial blockage of view from being hit with debris as
well as from large wave strikes, groundings and allision that causes phys-
ical damage to sensors and transducers. Also considered is the buildup of
contaminants from the combustion of nearby vessel components, intense
heat from fire or internal overheating caused by lack of ventilation, extreme
cold from severe operational conditions, glancing strikes by laser weapons,
and even the application of chemicals designed to render sensors blind or
otherwise inoperable. For internal sensors, performance degradation can ori-
ginate from failures within the various propulsion, electrical, hydraulic, com-
munication, navigation, and other systems within the vehicle. A subset of
vehicle temperature, pressure, voltage, current, power and other sensors and
measurement characteristics commonly associated with these systems were
selected for demonstration.
model of the system was created and made available for testing over a period
of several months.
10.5.3 Radar
Figure 10.4 illustrates nominal (left) and degraded (center, right) images representative of different types of degradation; detection takes advantage of various combinations of events, timing and signal characteristics. All images provided below are from the Garmin 40 W, 48 nm digital Radar system. The degraded image in the center illustrates the effect of an obstruction introduced into the signal propagation path. Obstructions ranging from small and partial through complete blockage of Radar signals were employed for these experiments with varying time durations.
The degraded image to the right depicts the result of electromagnetic inter-
ference with the heading sensor upon which the Radar system depends for
proper geospatial orientation. The primary effect of this experiment was to
cause the Radar display to gyrate wildly resulting in small to extreme distor-
tion of the displayed imagery. Similar results were obtained with the Furuno
12 kW, 72 nm (analog) Radar using the same sensor degradation techniques.
Both degraded states were detected in addition to several more that are not
illustrated.
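A simple way to detect the kind of obstruction shown in the center image is to compare mean echo intensity across azimuth sectors, as sketched below. The sketch assumes the sweep is available as a polar range-versus-azimuth array, which commercial displays do not normally expose directly; the array and threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
sweep = rng.uniform(0.4, 1.0, size=(512, 360))   # synthetic polar image: range bins x azimuth degrees
sweep[:, 100:140] *= 0.05                        # shadow cast by an obstruction over a 40-degree sector

sector_mean = sweep.mean(axis=0)                 # mean echo intensity per azimuth degree
blocked = np.where(sector_mean < 0.25 * np.median(sector_mean))[0]
if blocked.size:
    print(f"suspected blockage between {blocked.min()} and {blocked.max()} degrees")
else:
    print("no blockage detected")
```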
10.5.4 Sonar
Much of our effort focused on exploring the different types of degradation
that may be experienced by the various types of Sonar sensors likely to be
Figure 10.6 Nominal and Degraded Fish Finder (top) and Chirp Sonar (bottom) Images.
Figure 10.7 Nominal and Degraded ForVü, LiveVü and EchoPilot Navigation Sonar Sensor Images.
10.5.5 Lidar
The Ouster OS2-128 long-range Lidar system produces an image spanning
360° with the vessel centered in the middle. The top image in Figure 10.9
illustrates a nominal operation where there is a natural blind spot on the
navigation light array (outlined in the blue circle, left) directly below the field
of view of the sensor. The imagery in the red circle (right) illustrates the area
that has been degraded in the bottom image as a result of an obstruction
placed in the laser propagation path that causes a shadow effect where detail
is lost.
Figure 10.10 Lidar Degradation due to Laser Interference (Top), Heavy Rain (Bottom, Left) and Insect Landing on Sensor (Bottom, Right).
Figure 10.11 Inertial System Degradation over Time without GPS/GNSS Inputs.
Figure 10.11 illustrates the progression of INS error over time along an
approximately 12-mile track, with INS positioning depicted by the red line
while vessel positioning via GPS is shown by the black dashed line with the
boat icon. The black dashed line track to the right of the vessel track is an
artifact remaining from a previous voyage and should be ignored.
GPS/GNSS signals were disrupted upon voyage commencement. The top-
left image depicts the beginning stage of the voyage at a point approximately
10 minutes from leaving the pier where the INS was still providing accurate
positioning and heading information. A series of turns during this period
appeared to hasten the onset of positioning error while heading informa-
tion remained relatively accurate. The top-right image progressing into the
bottom-left image depicts a steady positioning error that continued relatively
unchanged while navigating a relatively straight line. At a point close to voy-
age termination after approximately 45 minutes, the track shown in the bot-
tom right image shows the INS course has degraded significantly.
Detection of INS degradation was accomplished in two ways. First, a GNSS reception outage was reported by the system over the NMEA-0183 bus. Second, positioning error was detected through bottom depth and contour tracking, which determined that local depths at the vessel position did not match those reported by the INS but instead corresponded to a position hundreds of feet away from the vessel.8,9
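The first of these detection paths, watching the NMEA-0183 stream for a loss of fix, can be sketched as below. The fix-quality indicator is field 6 of a GGA sentence (0 meaning no fix); the sample sentences are illustrative.

```python
def gga_fix_quality(sentence):
    """Return the GGA fix-quality indicator (field 6), or None for other sentences."""
    fields = sentence.split(",")
    if not sentence.startswith("$") or "GGA" not in fields[0] or len(fields) < 7:
        return None
    return int(fields[6]) if fields[6] else None

stream = [
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47",  # fix quality 1: valid fix
    "$GPGGA,123529,,,,,0,00,,,M,,M,,",                                    # fix quality 0: reception outage
]
for line in stream:
    if gga_fix_quality(line) == 0:
        print("GNSS reception outage reported at", line.split(",")[1])
```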
significant errors in instruments such as Radar and ENC dependent upon the
heading sensor and required recalibration of the ENC display. Events stem-
ming from heading sensor degradation were detected and are discussed under
the corresponding instruments.
• AC Power
• DC Power
• Depth
• Engine
• Environment
• Fuel Flow
• GPS
• Heading
• Navigation
• Pressure
• Rudder
• Speed
• Tanks
• Temperature
• Time
• Vessel
• Wind
Figure 10.12 Degraded Engineering Sensor Detection via NMEA-2000 Data Bus.
Sensors from the above list were evaluated in the detection of sensor deg-
radation. Several of these are shown in Figure 10.12. Moving from top left
to bottom right, these images denote sensor degradation to various degrees
reportable by the sensors themselves over the NMEA-2000 bus and include
instances where Radar service was lost, low DC voltage levels were detected,
the connection with the Heading Sensor was lost, and satellite weather ser-
vice was no longer available. This means of discerning sensor degradation and anomalies over the NMEA-2000 bus complements the three-domain approach (Time, Frequency and Spatial) and can provide greater clarity and specificity, enabling not only the detection of sensor degradation but also the identification of the types of degradation present and, possibly, the means to overcome them.
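The sketch below expresses two of these bus-level symptoms, a low DC voltage reading and a device falling silent, as checks over already-decoded records. Real NMEA-2000 access requires a CAN gateway and PGN decoding, so the record format, device names and thresholds shown are assumptions for illustration only.

```python
LOW_DC_VOLTS = 11.5        # assumed alarm threshold
SILENCE_SECS = 30.0        # assumed device-dropout timeout

last_heard = {}            # device name -> timestamp of last message seen

def on_record(rec, now):
    """rec example: {'device': 'DC Power', 'field': 'dc_voltage', 'value': 11.1}"""
    last_heard[rec["device"]] = now
    if rec.get("field") == "dc_voltage" and rec["value"] < LOW_DC_VOLTS:
        print(f"low DC voltage reported by {rec['device']}: {rec['value']:.1f} V")

def check_dropouts(now):
    for device, t in last_heard.items():
        if now - t > SILENCE_SECS:
            print(f"connection with {device} appears lost ({now - t:.0f} s silent)")

# illustrative use
on_record({"device": "DC Power", "field": "dc_voltage", "value": 11.1}, now=0.0)
on_record({"device": "Heading Sensor", "field": "heading_deg", "value": 271.0}, now=1.0)
check_dropouts(now=45.0)   # both devices have been silent past the timeout
```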
This includes placing a box over the weather sensor, spraying the sensor with
streams of water, applying a heat gun to artificially increase ambient tempera-
ture and subjecting the sensor to high winds from a local source that included
a fan. Figure 10.13 illustrates the effects of a stream of water that resulted in
maximum readings in wind speed and relative humidity, abnormal tempera-
ture readings and other measurements that were uncharacteristic and not
representative of the physical environment. Anomalies related to water, temperature and wind were readily detected while the sensor was isolated from the environment. However, wind anomalies were not readily discernible from a calm day without wind and with steady temperatures.
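Simple range and rate-of-change checks of the kind sketched below are sufficient to flag such uncharacteristic readings. The limits shown are illustrative assumptions, not instrument specifications.

```python
RANGE = {"wind_kts": (0, 80), "temp_c": (-25, 50), "rel_humidity": (0, 100)}   # plausible limits
MAX_STEP = {"wind_kts": 20, "temp_c": 3, "rel_humidity": 15}                   # per reading interval

def anomalies(prev, curr):
    out = []
    for key, value in curr.items():
        lo, hi = RANGE[key]
        if not lo <= value <= hi:
            out.append(f"{key} out of range: {value}")
        elif key in prev and abs(value - prev[key]) > MAX_STEP[key]:
            out.append(f"{key} changed implausibly fast: {prev[key]} -> {value}")
    return out

before = {"wind_kts": 6.0, "temp_c": 21.0, "rel_humidity": 62.0}
after = {"wind_kts": 79.0, "temp_c": 34.0, "rel_humidity": 100.0}   # readings during the water-stream event
print(anomalies(before, after))
```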
10.5.13 Microphone
Audio sensors are vital to ensure compliance with COLREGs Rule 5 require-
ments to maintain a proper look-out by (sight and) hearing.14 Degradation
may result in a lessening or complete inability to hear sounds such as horns,
whistles, sirens and bells emanating from other vessels as well as aids to
navigation that include buoys and lighthouses. This experiment utilized
microphones built into the six onboard cameras oriented at 60-degree angles
surrounding the vessel. The Camera 1 microphone illustrated in Figure 10.15 was degraded using a covering, resulting in lower sensitivity, as illustrated by the reduced and distorted sound recorded on the corresponding waveform.
The degradation illustrated for Camera 1 and similar events directed at the
other microphones were consistently detected with directionality determined.
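A minimal sketch of this comparison is shown below: the root-mean-square level of each camera-microphone channel is compared against the fleet median, with each camera's known 60-degree orientation providing directionality. The synthetic audio and the 50 percent threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
channels = {f"Camera {i}": rng.standard_normal(48_000) for i in range(1, 7)}  # one second of audio each
channels["Camera 1"] *= 0.2          # covered microphone: much lower level

rms = {name: float(np.sqrt(np.mean(x ** 2))) for name, x in channels.items()}
median = float(np.median(list(rms.values())))
for name, level in rms.items():
    if level < 0.5 * median:         # each camera faces a known 60-degree sector, giving directionality
        print(f"{name} microphone degraded: RMS {level:.2f} vs. fleet median {median:.2f}")
```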
Figure 10.17 Transmission of Snippets via Email Using High Frequency (HF) Radio.
Control Center (RCC) via low bandwidth communication channel for human
analysis at a RCC as well as machine analysis. Snippets, samples of which
are shown in Figure 10.16, are 160 × 145 pixels in size, with each requiring between 62 kB and 118 kB of memory without compression. Compressed snippets would result in even smaller memory requirements.
This is an example of just one way to present this information in a manner
useful to staff at a remote-control center. Figure 10.17 illustrates an example of
how the communication of snippets was accomplished using high-frequency
(HF) radio communications whereby imagery from three sensors was for-
warded from the research vessel directly to our offices for use by a remote
operator. HF radio was selected to demonstrate this capability over a low-
bandwidth communications channel that can span long distances assuming
that satellite communications were unavailable. Other methods to achieve
the same result using broadband communications are readily available.
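The arithmetic behind these figures can be sketched as below: an uncompressed 160 × 145 pixel RGB snippet occupies roughly 68 kB, JPEG compression reduces it further, and the transfer time over a low-bandwidth channel follows directly. The 2,400 bit/s figure is an assumed HF data rate for illustration only, not a value given in the text.

```python
import io
import numpy as np
from PIL import Image

rng = np.random.default_rng(3)
snippet = Image.fromarray(rng.uniform(0, 255, (145, 160, 3)).astype("uint8"))  # 160 x 145 RGB snippet

raw_bytes = 160 * 145 * 3                       # uncompressed RGB: about 68 kB
buf = io.BytesIO()
snippet.save(buf, format="JPEG", quality=70)    # compressed snippet
jpeg_bytes = buf.tell()

rate_bps = 2_400                                # assumed HF channel data rate, bits per second
print(f"raw {raw_bytes / 1024:.0f} kB, jpeg {jpeg_bytes / 1024:.0f} kB, "
      f"transfer at {rate_bps} bit/s: {jpeg_bytes * 8 / rate_bps:.0f} s")
```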
10.8.1 Identification
The capability to identify specific causes of sensor degradation requires a sig-
nificant investment in terms of further training using the specific characteris-
tics of degraded sensor signals with one or more of the three domain signal
representations as may be available and useful. The visual camera images
representing the Spatial Domain shown in Figure 10.2 illustrate the occur-
rence of specific causes of degradation that are sufficiently different to enable
positive identification based upon the unique characteristics of each. Specific
degradation classes and types can then be created for training using machine
learning techniques.
At least two different classes of degradation are presented in Figure 10.2.
These can be classed as partial or full degradation based upon the extent of
sensor surface area affected. Partial degradation affecting only part of the
sensor image can also be seen in the center Radar image of Figure 10.4,
the center-left Sonar image of Figure 10.5, both the center and right Sonar
images of Figure 10.6 and the Sonar images on the right side of Figures 10.7
and 10.8; and the Lidar images at the bottom of Figure 10.9 and right side of
Figure 10.10. Full degradation that affects 50% or more of the sensor image
can be seen in many of the other figures. Degradation shown in Figure 10.2 can be attributed to damage, obstruction, obscurant, natural fog and natural rain, in the order shown in Table 10.2.
The examples cited represent identification of the causes of degradation
based only upon the unique characteristics presented from Spatial Domain
perspectives. This alone may be sufficient to accurately and consistently
identify particular types and sources of degradation. In cases such as those shown in Table 10.2 (frames 4, 7, 11 and 12 of Figure 10.2), additional insight may be needed to determine proper classification and type. In such cases unique
characteristics associated with each degradation may also exist within the
Frequency Domain as the properties of each causal agent may filter specific
frequencies of light entering the camera and possibly introduce new frequen-
cies that are not naturally present. These inter-domain correlations can help
to provide greater specificity in identification. Examination of Time Domain
characteristics can help to identify the exact timing degradation began,
whether it occurred quickly or gradually, the extent of degradation, and
10.8.2 Remediation
The capability to overcome the effects of sensor degradation may be
achieved through the use of sensor fusion techniques to derive the contents
and meaning of degraded data provided by one sensor by examining data
from other sensors to arrive at the same or similar conclusions. Such an
approach can provide greater resiliency in decision making by using sen-
sor data obtained from different portions of the electromagnetic spectrum
(Frequency Domain) over the same periods of time (Time Domain) while
providing fixed reference points for comparison between different sensor
data products (Spatial Domain). Further, such an approach may provide a second source for information that is normally gathered from another sensor. Several examples are provided within this paragraph to demonstrate the potential of this approach to ensure greater resilience in sensor degradation detection performance.
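The essence of this approach can be sketched as a consistency check among independent position estimates: sources that agree within a threshold are retained, and the outlier is flagged as suspect so that the remaining sources can stand in for it. The positions and the 200 m agreement threshold below are illustrative assumptions.

```python
import itertools, math

def dist_m(p, q):
    # small-area flat-earth approximation, adequate for a consistency check
    dlat = (p[0] - q[0]) * 111_320
    dlon = (p[1] - q[1]) * 111_320 * math.cos(math.radians(p[0]))
    return math.hypot(dlat, dlon)

estimates = {
    "GNSS": (38.9720, -76.4810),
    "INS dead reckoning": (38.9685, -76.4920),        # drifting position
    "depth-contour match": (38.9722, -76.4807),
}
AGREE_M = 200                                         # assumed agreement threshold

consistent = set()
for a, b in itertools.combinations(estimates, 2):
    if dist_m(estimates[a], estimates[b]) < AGREE_M:
        consistent.update((a, b))
suspect = set(estimates) - consistent
print("mutually consistent:", sorted(consistent), "| suspect:", sorted(suspect))
```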
References
1 Jim Myer. NMERPAC to look at mariner shortage. Waterways Journal. 2
September 2022. National Merchant Marine Personnel Advisory Committee.
U.S. Coast Guard. www.waterwaysjournal.net/2022/09/02/nmerpac-to-look-
at-mariner-shortage/
2 R. Glenn Wright. In-Stride Detection of Sensor Degradation. GMATEK, Inc.
Final Report. Revision: A, 28 April 2022. Contract: N6833520G2005. Naval
Sea System Command, Washington, D.C.
3 Wright, R. Glenn. Signals Intelligence Automated Assessment of Test
Capabilities. IEEE Automatic Testing Conference. Washington, DC. September
2018.
4 John Boyd. New Security Technology Detects Malicious Cyberattacks on
Drones, Cars and Robots. 26 February 2019. https://spectrum.ieee.org/new-
security-technology-detects-attacks-on-sensors-controlling-numerous-applicati
ons-including-drones-cars-and-robots
5 Michael Koziol. New Ethernet Cyberattack Crunches Critical Systems. 15
November 2022. https://spectrum.ieee.org/cyberattacks
6 A. Aboulian, et al. (2019) NILM dashboard: A power system monitor for elec-
tromechanical equipment diagnostics. Trans. on Industrial Informatics, 15(3),
1405. DOI: 10.1109/TII.2018.2843770
7 DARPA Assured Autonomy Project. www.darpa.mil/program/assured-auton
omy
8 Fouling Ratings (FR) in Order of Increasing Severity. Naval Ship’s Technical
Manual. Waterborne Underwater Hull Cleaning of Navy Ships. S9086-CQ-STM-010, rev. 5. Chapter 081. Naval Sea Systems Command. 1 October 2006.
Table 081-1-1.14.
9 R. Glenn Wright and M. Baldauf (2016). Hydrographic survey in remote regions: Using vessels of opportunity equipped with 3-dimensional forward-looking sonar. Marine Geodesy, 39(6), 439–457. DOI 10.1080/01490419.2016.1245266
Similar efforts were undertaken by the U.S. Navy in 2021 using a small
drone to carry lightweight cargo between vessels to eliminate the need for
helicopter flights for a substantial fraction of its spare parts logistics.6 Their
plan is to use a commercial drone to carry up to 24 pounds of cargo for a
distance of 65 miles, which would be sufficient to airlift about 80% of the
Navy’s critical parts cargoes.
These details are indicative of one area in which interest in maritime drone
use is flourishing. However, since this is a book on ship sensors, the rest of
this paragraph will focus on drone UAV use as a means to extend sensor
presence beyond the confines of the ship or vehicle to achieve other purposes.
system was able to be controlled to take off from the moving vessel and fly out to
sea, successfully locating a dummy in the water before returning to land itself
on a mat attached to the boat’s deck.
Remotely piloted and autonomous UAVs equipped with both visible light
and infrared cameras can play an essential role in speeding up the detection
and rescue of people who have fallen overboard. High-resolution imagery
can provide details of partial human forms in the water that are recog-
nizable by remote UAV pilots as well as artificial intelligence (AI)-based
smart cameras during daylight and low-light conditions in a variety of sea
states. Infrared cameras can supplement visible light cameras to also detect
human forms in the water using their heat signature that provides a sharp
contrast to the generally colder water. One researcher has also developed
and publicly released specialized annotated datasets for training and testing
AI-based detectors for this task. This MOBDrone benchmark is a collection
of more than 125K drone-view images in a marine environment under sev-
eral conditions, including different altitudes, camera shooting angles and
illumination.12
types of stack emissions such as sulfur and carbon dioxide in the English
Channel, the North Sea, the Baltic Sea and the Gulf of Bothnia.
Table 11.1 Forward-Looking Sonar Performance Specifications (columns: FarSounder 1000 / Furuno CH-270 / EchoPilot 3D FLS)
Maximum Detection Depth: 85 m (278 ft) / 100 m (328 ft) / 100 m (328 ft)
Maximum Detection Range (MDR): 1,000 m (3,200 ft) / 800 m (2,500 ft) / 200 m (656 ft)
Operating Frequency: 61 kHz / 180 kHz / 200 kHz
Vertical Coverage: 60° / 180° / 90°
Horizontal Field of View: 60°, 90° or 120° / 360° / 60°
Maximum Transmit Power: <1,500 Wrms / 800 Wrms / Not specified
Angular Accuracy: 1.6° / Not specified / Not specified
Roll/Pitch Compensation: ±20° / Not specified / Not specified
Roll/Pitch Accuracy: 0.5° / Not specified / Not specified
Maximum Vessel Speed: 25 kts. / 10 kts. / Not specified
Screen Refresh Rate: 0.3–2 s / 8 s / 1–2 s
Advance Warning at MDR @ 10 kts. (a): 183 s (3.1 min) / 139 s (2.3 min) / 28 s (0.5 min)
Sources: EchoPilot 3D Forward Looking Sonar, www.echopilot.com/user/image/fls3d-brochure.pdf; Furuno CH-270 Searchlight Sonar, www.furuno.com/en/business_product/pdf/marine/ch270.pdf; FarSounder 1000 Navigation Sonar, www.farsounder.com/files/f31566_3d-sonar-brochure_3.0.pdf
Notes: Calculations by author.
(a) United States Patent 8,717,847; Blake, 6 May 2014.
(b) IMO Resolution MSC.232(82): 2006, Adoption of the Revised Performance Standards for Electronic Chart Display and Information Systems (ECDIS).
(c) IMO Resolution A.1021(26), Code on Alerts and Indicators, 2009.
where logical arguments may be made to establish values for each of these
parameters:
Speed of Vessel (Vs) = 10 knots
Screen Refresh Rate (SRR) = 2 sec, or as noted
Maximum Detection Range (MDR) = obtained from Table 11.1
Alarm Processing Time (APT) = 4 sec
Watchstander Response Time (WRT) = 5 sec
1 m/s = 1.9438 knots
Although some systems can perform at much higher speeds, a value of 10
knots was selected for the Speed of Vessel (Vs) to provide a common basis
for evaluating the reaction time to FLS system alarms. A value of two seconds
for Screen Refresh Rate (SRR) on the display was assumed based on system
performance specifications. However, SRR for the Furuno CH-270 system
may be as high as eight seconds or more as this system provides general
coverage that extends beyond the area directly ahead of the bow. The value
for Maximum Detection Range (MDR) is obtained from the performance
specifications of the individual FLS units.
Alarm Processing Time (APT) is the time required for FLS data to be evaluated by signal processing and alarm generation algorithms to determine whether a condition exists that breaches predetermined vessel-specific safety
contours and depths. This includes draft, course, maneuvering capabilities,
lateral clearance margins and other factors pertinent to safety of navigation.
An assumption is made that such criteria for FLS are likely to be similar to
those established by IMO for ECDIS.27 Integral to this factor are time delays
to prevent normal operating conditions from causing false alerts because of
normal transients that may exist in the FLS data.28 This can add one to several seconds to ensure target persistence, eliminating the generation of an alarm due to a single occurrence or a short-duration or transient target. A value of four seconds was selected based on FLS refresh rates as well
as estimates of processing times for interface and communications systems
handling software that may be required by an Integrated Navigation System.
Watchstander Response Time (WRT) is that needed for the Officer of
the Watch (OOW) to acknowledge an alert and take appropriate corrective
action based on the nature of the alarm. This must take into account time
lags necessary to assess rates of change in processes, such as changing ves-
sel course against targets’ movements.29 The time required for the OOW to
confer with other bridge watchstanders and lookouts and to issue orders to
the helmsman must also be factored into this calculation. Adding to this the effects of fatigue and various other human factors, one can easily see that this factor is the most subjective and imprecise in the equation. A value
of five seconds was chosen in part based on the author’s direct observations
of bridge practices used on several vessels under similar conditions using
ECDIS and Radar indications.
Using these criteria, the EchoPilot 3D FLS provided approximately ½-
minute warning, the Furuno CH-270 provided approximately 2¼-minute
warning and the FarSounder 1000 provided approximately a 3-minute warn-
ing when considering their maximum detection range. The speed of 10 knots
may appear somewhat slow for most vessels while underway. However,
if a vessel is operating in unknown, poorly charted, or known-hazardous
waters it is prudent to increase safety margins by proceeding at a slower-
than-normal pace.
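The arithmetic implied by these parameters, the time to cover the maximum detection range at the assumed speed less the refresh, processing and response times, reproduces the advance-warning values of Table 11.1 and is sketched below.

```python
KNOT_TO_MS = 1.0 / 1.9438          # metres per second per knot (1 m/s = 1.9438 knots)

def advance_warning_s(mdr_m, vs_kts=10, srr_s=2, apt_s=4, wrt_s=5):
    """Seconds available between first detection at MDR and the point where action is needed."""
    return mdr_m / (vs_kts * KNOT_TO_MS) - (srr_s + apt_s + wrt_s)

print(round(advance_warning_s(1000)))           # FarSounder 1000  -> 183 s
print(round(advance_warning_s(800, srr_s=8)))   # Furuno CH-270    -> 139 s
print(round(advance_warning_s(200)))            # EchoPilot 3D FLS -> 28 s
```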
It should be noted that such advance warning calculations generally pro-
vide “best case” scenarios under ideal conditions and that actual conditions
and response times must be expected to reduce these margins – significantly
in some cases. Actual conditions must also take into consideration both
human and technological factors that can result in major deviations from
these response times. Technological factors can include water turbidity, poor
acoustic reflection qualities of potential HtoN and even growth on the hull
that may reduce FLS sensitivity. Human factors can range widely, from distractions on the bridge to unfamiliarity with the equipment and a general lack of training, proficiency or currency in watchkeeping procedures.
Figure 11.2 Forward-Looking Sonar Alarm Volume Setting. (image: FarSounder, Inc.).
Figure 11.4 Ground Track of Costa Concordia, Annotated with Coverage for FLS
Navigation.
Source: Marine Casualties Investigative Body, Cruise Ship Costa Concordia, Report on the
safety technical investigation, p. 61, Ministry of Infrastructures and Transports (Italy).
bottom depth contour directly ahead and to the port of the centerline. It is
possible that early indications of an upslope bottom could have been detected. Note that during this segment, the rate of turn is approximately
11 degrees per minute.
also changes from mud to solid rock, resulting in a large acoustic reflection
and indications of strong targets.
The appearance of a solid wall leading up to the surface and continuously
decreasing in distance would have loomed prominently on the FLS until it
consumed two-thirds of the display from far port to starboard of center.
Despite orders from the master to turn from 330° to 350° throughout the
1-minute segment duration, the rate of turn appears to have been only half the ordered rate, at 10 degrees per minute.
equipment would have also been activated, further reinforcing the severity
of the situation. Consistently, at all noted positions on the final approach
(Figure 11.4: positions A through D), the FLS would have indicated clear
water was present off the starboard bow.
It is assumed that the 1.8 minutes of prior warning would have provided
sufficient advance notice to plan and execute evasive maneuvers that may
have lessened the severity of the grounding or averted it entirely. Halting
the forward momentum of the vessel would not have been possible as this
would require around 1,300 meters, with the vessel moving at 16 knots, and
this distance was not available.34 However, slowing the vessel combined with
executing a hard turn to starboard upon receiving the warning commen-
cing approximately 800 meters prior to Scole Rocks would have significantly
reduced the damage incurred in the event of grounding such that the vessel
may have remained afloat and lives may have been saved. Indeed, though this is hypothetical, the accident may not have happened at all.
11.2.8 Acknowledgements
Some of the results and parts of the investigations presented were performed under the European Interreg IVB project ACCSEAS (Accessibility for Shipping, Efficiency Advantages and Sustainability). Portions of this paper
were originally published by Michael Baldauf and R. Glenn Wright at the
10th International Symposium on Integrated Ship’s Information Systems
(ISIS 2014), German Inst. of Navigation, Deutsche Gesellschaft für Ortung
und Navigation e. V. (DGON), Hamburg, Germany, 5 September 2014.
• Visual piloting using visible light and infrared cameras to detect and iden-
tify physical aids to navigation present at the sea surface, including buoys,
lights and ranges plus landmarks and terrain features.
• Virtual AtoN (VAtoN) georeferenced to the seabed environment as way-
points along unmarked routes.
pitch and yaw. Also to be considered are internal sensors and external meteorological and oceanographic (METOC) sensor data content and availability to
derive error correction vectors that compensate for progressive error exhib-
ited by INS resulting from a lack of positional references through GNSS or
known physical location. This capability utilizes software developed through
machine-learning techniques to model natural forces and tailor their effects
on performance to the unique characteristics of a specific vessel. Deep-
learning AI can also be used to discern unknown factors, trends and events
present in the data that may also influence vessel performance.
The combination of such techniques can provide a means for a navigation
system to constantly learn new things about the environment and the vessel
itself and to improve its own operational capabilities on an ongoing basis. All
such methods and capabilities listed above are possible today using existing
computing and AI technology. Each method is subject to its own limitations
and accuracy characteristics. However, the fusion of multiple sources of com-
plementary data can reinforce conclusions reached regarding positioning and
achieve accuracies greater than those of individual sources.
An autonomous vehicle would not have the constraints of human senses
and attributes but, without a human in the loop, would lack the knowledge
and decision-making necessary to properly execute safe passage. Machine
learning and optical character recognition can be used to train an automaton
to recognize basic shapes and colors of standard buoys and other AtoN, and
to read identifying numbers and letters that would enable positive identifica-
tion of individual navigation aids. An automated process is used to correlate
with ECDIS and the Light List along with performing cross-checking with
Radar and GNSS indications to verify that the detected aid is watching prop-
erly in the correct position and exhibiting proper characteristics. However,
less than ideal orientation of buoy numbers and letters, fouling by birds, low
light conditions and other natural and man-made phenomena can drastically obscure or alter physical and light characteristics and reduce the certainty of what is being seen. Other charted AtoN, such as lighthouses, piers, water towers, buildings and landscape features, are much less standardized; what a mariner would routinely detect and identify can be much more challenging or outright impossible for automated processes.
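A fragment of such a cross-check is sketched below, reading a buoy's painted identifier from an already-cropped image with OCR and comparing it to the identifier expected from the chart. The pytesseract dependency, the crop and the matching rule are illustrative assumptions; real imagery would require detection and far more robust handling of orientation, fouling and lighting.

```python
import pytesseract                 # assumes the Tesseract OCR engine is installed
from PIL import Image

def buoy_id_matches(crop_path, expected_id):
    """Compare OCR text from a cropped buoy image against the charted identifier."""
    text = pytesseract.image_to_string(Image.open(crop_path), config="--psm 7")  # single text line
    seen = "".join(ch for ch in text if ch.isalnum()).upper()
    return expected_id.upper() in seen

# e.g. buoy_id_matches("buoy_crop.png", "R4") returns True when "R 4" is legible
```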
This approach formed the basis to implement truly Virtual AtoN (VAtoN)
that do not require a corresponding VHF radio transmitter but are georefer-
enced to the physical seabed environment as waypoints. Course and heading
information developed by the navigation computer using multi-sensor data
fusion from cameras, Radar, Sonar and other sensors are provided directly to
the helmsman and autopilot to execute the voyage plan.
The findings of this experiment note the significant contribution of depth
contour following and the use of VAtoN depicted as red circles to enhance
adherence to the planned route over the initial portion of the experiment
course where physical AtoN in the form of buoys are sparse with great dis-
tances between buoys that are essentially beyond camera range. Further along
the course shown in Figure 11.6 where the channel was well marked with
physical AtoN, the use of depth contour following using VAtoN was gener-
ally effective in maintaining a proper course within the channel. Note that
the illustrations provided are shown using raster chart depictions for clarity.
This experiment was performed using vector charts and supplemented with
live soundings obtained using the FarSounder ARGOS 350 forward-looking
navigation Sonar (FLS) to provide more accurate data in much greater detail
than is available using vector charts.
These are just a few of the many areas of research that can be applied dir-
ectly to the maritime environment and, specifically, to ships and unmanned vehicles of all types and kinds.
These are just a few of the principles of quantum mechanics that are used
in quantum sensors. As quantum sensing technology continues to develop, it
is likely that even more ways will be discovered to use quantum mechanics
and to achieve even higher sensitivity and precision in sensors.
Congratulations! Understanding these basic principles advances you
to quantum engineer (second class). However, this is as far as you can go
because the rest of the curriculum has yet to be invented.
Notes
1 In this paragraph (11.2) INS is used for Integrated Navigation System, NOT
Inertial Navigation System.
2 In this paragraph (11.3) INS is used for Inertial Navigation System, NOT
Integrated Navigation System.
References
1 Drone market outlook in 2023: Industry growth trends, market stats and fore-
cast. Insider Intelligence. 7 January 2023. www.insiderintelligence.com/insig
hts/drone-industry-analysis-market-trends-growth-forecasts/
2 Matthew Spaniol. (2020). Drones on Ships. Periscope. 10.13140/
RG.2.2.18715.28969. www.researchgate.net/publication/347521511_Drones_
on_ships
3 Aishwarya Lakshmi. Maersk to Use Drones to Resupply Its Fleet of Tankers. 6
June 2016. MarineLink. www.marinelink.com/news/resupply-tankers-UAS410
695.aspx
4 Gary. How Drones Are Changing the Maritime Industry. 18 May 2016. Ship
Technology. www.ship-technology.com/features/featurehow-drones-are-chang
ing-the-maritime-industry-4865807/
5 Wilhelmsen and Airbus Trial Drone Deliveries to Singapore Anchorage. 16
March 2019. The Maritime Executive. https://maritime-executive.com/article/
wilhelmsen-and-airbus-trial-drone-deliveries-to-singapore-anchorage
6 Carrier USS Ford Tests Out Air Cargo Drone. 8 August 2021. The
Maritime Executive. www.maritime-executive.com/article/video-carrier-
uss-ford-tests-out-air-cargo-drone
7 www.cruisejunkie.com/Overboard.html
8 Marine Accident Recommendations and Statistics, 2015– 2022. Marine
Accident Investigation Branch (MAIB), United Kingdom.
9 A.S. Selmy, (2016). The Need of Man Overboard (MOB) Detecting and
Tracking System Descriptive Analyses. Conference: The 9th China International
Rescue & Salvage Conference, China.
10 Orhan Gönel and Ismail Çiçek. Statistical Analysis of Man Overboard (MOB) Incidents. December 2020. www.researchgate.net/publication/348266442_STATISTICAL_ANALYSIS_OF_MAN_OVER_BOARD_MOB_INCIDENTS
11 Martin Manaranche. Royal Navy Tests Drones in Man Overboard Trials.
Naval News, 06 July 2021. www.navalnews.com/naval-news/2021/07/royal-
navy-tests-drones-in-man-overboard-trials/
12 Donato Cafarelli, et al. MOBDrone: A Drone Video Dataset for Man OverBoard Rescue. 15 March 2022. 21st International Conference on Image Analysis and Processing (ICIAP 2021). arXiv:2203.07973. https://doi.org/10.48550/arXiv.2203.07973
13 Mike Ball. Antarctic Supply Ship Uses Drone to Navigate Sea Ice. Unmanned
Systems Technology , 23 December 2015. www.unmannedsystemstechnology.
com/2015/12/antarctic-supply-ship-uses-drone-to-navigate-sea-ice/
14 Royal Dutch Shell Scope of Work Narrative. Anchor Retrieval Project
(Location: Beaufort and Chukchi Seas). Fairweather, LLC. Anchorage, Alaska.
USA. 1 August 2023.
15 Drone Guided Ship through Arctic Sea Ice. 16 November 2016. iHLS. https://
i-hls.com/archives/72848
16 Matt McFarland. Ship relies on drone to avoid ice blocks in Arctic waters.
CNN Business. 5 November 2016. https://money.cnn.com/2016/11/05/technol
ogy/arctic-drone-ship-navigate/index.html
17 Drone Survey of ONE Apus Container Collapse. 16 December 2020. The
Maritime Executive. www.maritime-executive.com/article/video-survey-of-
one-apus-container-collapse-made-public-by-wk-webster
18 Gary. How drones are changing the maritime industry. 18 May 2016. Ship
Technology. www.ship-technology.com/features/featurehow-drones-are-chang
ing-the-maritime-industry-4865807/
19 International Regulations for Preventing Collisions at Sea 1972 (COLREGS),
Rule 5: Look-out.
20 Standards of Training, Certification and Watchkeeping (STCW) Convention
and Code, Chapter VIII, section A-VIII/2, part 3 –Watchkeeping at Sea.
21 Samuel Halpern, An Objective Forensic Analysis of the Collision Between
Stockholm and Andrea Doria, www.titanicology.com/AndreaDoria/Stockh
olm-Andrea_Doria_Collision_Analysis.pdf
22 Resolution A.918(22), IMO Standard Marine Communication Phrases, p. 67,
B1/1.3.
23 United States Patent 5,390,152; Boucher, et al.; 14 February 1995.
24 United States Patent 5,675,552; Hicks, et al.; 7 October 1997.
25 United States Patent 7,173,879; Zimmerman, et al.; 6 February 2007.
26 United States Patent 8,717,847; Blake, 6 May 2014.
27 IMO Resolution MSC.232(82): 2006, Adoption of the Revised Performance
Standards for Electronic Chart Display and Information Systems (ECDIS).
28 IMO Resolution A.1021(26), Code on Alerts and Indicators, 2009.
International Association of Ports and Harbors see IAPH
International Code of Signals see ISC
International Convention for Safe Containers see CSC
International Convention for the Safety of Life at Sea see SOLAS
International Convention on Standards of Training, Certification and Watchkeeping for Seafarers see STCW
International Electrotechnical Commission see IEC
International Maritime Organization see IMO
International Maritime Solid Bulk Cargoes see IMSBC
International Organization of Masters, Mates and Pilots see IOMM&P
International Standards Organization see ISO
Internet 23, 30, 48, 56, 81, 84, 102, 103, 129, 134, 137, 138, 144, 159
Internet of Things see IoT
interoperability 92, 94, 133, 147, 148
IoT 23, 56, 144, 151–3, 155, 159, 160, 165
IP camera 138, 153
IR 7, 42
Iridium 36, 102, 104, 126
iris scanners 152
IRNSS 34
iron ore fines 57
ISC 118
ISO 55, 80, 82, 137–9, 143
ISO 11898 139, 143
ISO/IEC 7498-1 139, 143
IT 94, 102, 146, 147, 151, 157–9, 161, 162, 164
James Webb Space Telescope 3, 9
jamming 35, 174
Kaami 217
Kea Trader 217
Ku-band 6, 103
LAN 6, 24, 129, 134, 135, 137, 138, 143, 150, 154, 158, 160
large wave strikes 174
laser strikes 175
laser weapons 174
Lash Fire 63, 69
latency 30, 42, 103, 131, 133, 226
laterite 57
L-band 6, 34, 38, 39, 57, 103
leaked fuel 65
leaked oil 65
Leda Maersk 217
LEO 102–4, 115, 126, 220
level sensing 12, 157
LF 6, 43, 44, 76
Lidar 7, 28, 37, 39, 50, 87, 88, 94, 171, 172, 174–6, 178, 185–8, 196, 197, 199, 230
life jackets 73
light flashes 123
Light Imaging Detection and Ranging see Lidar
light rhythm 118
light signals 17, 117, 121, 122
liquefaction 52, 57, 85, 158
liquid natural gas see LNG
lithium-ion batteries 21, 52, 63, 64
load monitoring 12, 157
local area network see LAN
log book 46
long-range identification and tracking see LRIT
long-range radio navigation see LORAN
LORAN 29, 33
Loran-C 43, 44
loss of resolution 175
low bandwidth communication 194, 219
low-bandwidth data links 170
low earth orbit see LEO
lower explosion limit see LEL
low frequency see LF
low-light video cameras 178
L, S, C, X, Ku and Ka-bands 103
lubricant oil 15
machine learning 28, 41, 60, 74, 77, 79, 86, 89–93, 97, 171, 173, 175–7, 196, 218, 220, 221, 223, 224
macrofouling 180
Maersk Garonne 217
Maersk Honam 58
Maersk Tankers 202
magnetic compass 46
magnetic resonance imaging 3
magnetometer 56, 170
MAIB 59, 68, 203, 232
man overboard 34, 80, 202, 203