AR Visualization Methods for Subsurface Utilities
1 Department of Infrastructure Engineering, The University of Melbourne, Parkville, Victoria 3010, Australia
Abstract: Subsurface utilities are important assets that need to be accounted for during any construction activity. Positioning and visualizing subsurface utilities before construction work starts has significant benefits for the effective management of construction projects. Augmented Reality (AR) is a promising technology for the visualization of subsurface utilities. The aim of this paper is to provide a comprehensive review of the state-of-the-art in AR visualization of subsurface utilities, including existing AR visualization methods, a categorization of the methods and their drawbacks, and a comprehensive discussion of the challenges, research gaps and potential solutions. The paper begins with an introduction to the current practice of locating subsurface utilities and an overview of different reality technologies including AR. We propose a taxonomy of AR visualization methods comprising X-Ray view, transparent view, shadow view, Topo view, image rendering and cross-section view. We compare the existing methods in terms of quality of depth perception, occlusion of the real world, complexity of visualization and the parallax effect, followed by a discussion of the drawbacks of these methods. Poor depth perception, poor positional accuracy in Global Navigation Satellite System (GNSS)-deprived or indoor areas, and the parallax effect caused by user movement are identified as the main challenges and topics of future research in the effective visualization of subsurface utilities.
Keywords: Subsurface utility, visualization, augmented reality, depth perception, parallax effect.
1. Introduction
Locating subsurface utilities, such as pipes buried under the ground surface or wires within walls, is a crucial step in renovation and construction works, as such activities can potentially alter or damage existing subsurface utilities on the construction site. Despite the importance of locating existing subsurface services, current practice still relies heavily on paper (PDF) plans. There are various methods for locating indoor and outdoor subsurface utilities, including Ground Penetrating Radar (GPR) and electromagnetic locators for outdoors and wire tracers for indoors. Depending on the situation, utility locators use these devices to locate subsurface utilities and mark their location on the surface. However, once excavation begins, the marks are lost and the locating process must be repeated.
There are several challenges in accurately locating outdoor subsurface utilities using utility locating devices. All of these locators come with their own limitations, and their results can be restricted by several factors such as the geological condition of the site, characteristics of the surveyed utilities, accessibility of the surveyed utility, utility density in the surveyed area, experience of the operator, etc. [5]. For instance, GPR transmits microwave pulses into the ground and measures the incoming reflections, which can be highly affected by the characteristics of the ground. Conductive clay and rocky soils limit the reflection of microwave pulses due to their signal-scattering nature [5-14]. Moreover, the locating process can be time consuming, expensive, and complex to mark on the surface depending on the number of utilities that need to be located. Dial Before You Dig (DBYD) is a calling service in Australia that connects clients
who require information about underground assets with asset owners to assist with excavation. According to DBYD, not all outdoor underground utilities in Australia have accurate 3D location information (Quality Level A) for excavation purposes. Information on many underground utilities is in a schematic format and is only intended to indicate their presence (Quality Level D) on the work site [15]. Moreover, many asset owners are not members of DBYD, and the service cannot provide subsurface asset information if the particular asset owner is not a member.
Traditional methods of locating subsurface utilities indoors also face several challenges. For example, using wire tracers for locating and marking subsurface utilities within walls can be tiring, and the positional certainty of the located utility depends heavily on the sensing strength of the device. It can also be complicated to locate and mark when multiple assets are congested in a small area, for example locating a specific type of electrical wire where multiple types of electrical wires and pipes run within walls or above ceilings.
Lack of consideration for the location of subsurface utilities can result in utility strikes, causing damage to the services during construction activities. Utility strikes incur significant costs depending on the type of service struck, the time spent on repair and the delay to the construction work. Most importantly, utility strikes pose a risk to the safety of workers. Inaccurate location of the utility, poor planning and negligent excavation or exposing techniques have been identified as the main causes of utility strikes [16]. To avoid utility strikes, there is a need for a suitable and effective approach for locating and visualizing the hidden assets.
AR is a promising technology that has already shown its significance in medical applications [17-19], tourism [20], automotive [21-23], education [24,25], military [26,27] and gaming industries [28,29]. AR superimposes digital content generated by computer software on the real world, which can provide an effective solution for visualizing hidden subsurface utilities [30,31]. AR can deliver several benefits to the architecture, engineering and construction (AEC) industry. Shin, et al. [31] discussed eight work tasks in AEC that could benefit from AR, namely layout, excavation, positioning, inspection, coordination, supervision, commenting and strategizing.
In this survey, we analysed journal articles, conference papers and books published from 1992 to 2021 that discuss the visualization of subsurface objects using AR. The articles reviewed in this paper were acquired via the Google Scholar and Scopus search engines using all possible combinations of the following keywords: “Augmented Reality”, “visualization techniques”, “AR”, “subsurface utilities”, “underground utilities”, “hidden objects”, “Mixed Reality”, “MR”, “underground assets” and “subsurface assets”. We also reviewed relevant articles that cited the articles found above, as well as articles cited by them. Additionally, we reviewed magazine articles and commercial product guides covering the latest developments in AR and AR visualization of subsurface utilities. Since the topic is still being explored in the research community and industry, the search for articles continued during the writing of this paper. This paper fully reviewed 28 articles published from 2009 to 2021 and one article published in 1992. We excluded articles not written in English.
The remainder of this paper is structured as follows: Section 2 discusses the different types of reality technologies including AR in general and the different aspects to consider when developing an AR application; Section 3 discusses challenges in visualizing subsurface utilities using AR; Section 4 explores the existing AR visualization methods for subsurface utilities; Section 5 compares the visualization techniques discussed in Section 4 and discusses their drawbacks in terms of the challenges discussed in Section 3. In Section 6, we further discuss drawbacks in existing AR visualization of subsurface utilities, followed by the conclusion and future works in Section 7.
2. Overview of AR Technology
AR is commonly defined as a technique that superimposes computer-generated virtual content on the real environment to create an informative hybrid environment that aids the sense of sight. However, AR can also be applied to other senses, including hearing, touch, and smell [41]. AR gained popularity as technologies such as depth cameras and computer vision became available in commonly used mobile phones and tablets.
There are several competing reality technologies that facilitate the visualization of virtual and physical
objects for various applications. These include Virtual Reality (VR), Augmented Virtuality (AV), AR, Mixed
Reality (MR) and Extended Reality (XR). VR creates a user experience that is completely based on virtual information [42-44] and disconnects the user from the real world. VR is mostly suitable for small indoor environments and is not recommended for a field worker due to safety considerations in the work environment. AV introduces a slight fusion of reality into the virtual environment by overlaying real objects on top of the virtual environment when necessary, but the user is still relatively disconnected from the real environment depending on the number of real objects being projected [45-48]. AR enhances the user's visualization by integrating digital content with the real world to provide visual guidance [30,31,41,49-51]. It essentially supplies additional information about the real environment that is not visible to the naked eye, delivered in different forms from wearables such as smart glasses to handheld devices such as smartphones and tablets [52]. MR is a recent development in reality technologies which not only overlays virtual content on the real world but also allows interaction between them [53-59]. MR blends the virtual content with a virtual reconstruction of the real world to facilitate this interaction. For example, in MR, virtual objects can be placed behind real objects to create a sense of occlusion. As shown in Figure 1, the flying saucer (virtual object) is occluded behind the ceiling lights (real object). Finally, XR is an emerging technology that covers VR, AR and MR and is referred to as an umbrella term that brings all the reality technologies together [60-64]. For instance, while VR Head Mounted Displays (HMDs) are not capable of providing MR visualizations, and MR HMDs are not capable of providing a quality VR experience, XR HMDs are video see-through devices whose cameras act as virtual eyes, with spatial mapping capability for an MR experience, but which can also completely disconnect the user from the real world for a quality VR experience. The Varjo XR-3 is an example of an XR device available on the market [65]. It is worth mentioning that, with the continuous development of new devices, there are VR HMDs with front-facing cameras, such as the HTC Vive, and attachable cameras, such as the ZED Mini (an MR stereo camera), that bring an MR experience to VR HMDs. Since the use of VR is limited in the context of visualizing subsurface utilities, and MR and XR are emerging technologies yet to be explored for this purpose, in this paper we focus only on AR technology.
2.1 History of AR
Even though AR has been a widely explored topic in recent years, it was first used during World War II, when the British military developed a technology to display radar information on an aircraft's windshield [50]. However, the official beginning of AR can be considered Sutherland's work in the 1960s, culminating in his creation of the first see-through head-mounted AR display in 1968 [66]. Even after the invention of the AR head-mounted display, AR was not a popular topic until the term “Augmented Reality” was officially established in 1990, after which it gained popularity in many industries. Figure 2 shows a summary of the main developments and events in the history of AR from 1945 to 2019.
Figure 2. Timeline of AR
2.2 AR in AEC
The AEC industry is slowly undergoing a digital transformation with the advancement of digitization, automation and integration, also referred to as Construction 4.0. The use of digital data enables the implementation of AR in the AEC industry [67]. Numerous research studies have been conducted on different use cases of AR in AEC [31,67-72]. Albahbah, et al. [70] reviewed seven applications of AR that can benefit the construction industry, including safety management, communication and data acquisition, visualization, construction management education, progress tracking, quality management and facility management. Alizadehsalehi, et al. [71] developed a framework for automated construction progress monitoring called DRX, which integrates XR, reality-capturing technologies such as 3D laser scanning and wireless sensors, Building Information Models (BIM) and Digital Twins (DT) of actual physical assets, devices, systems, processes, places, and people. Their work also discusses the strengths and challenges of the system in terms of cost, quality, and time. Davila Delgado, et al. [68] analyzed six possible applications of AR in the AEC industry, namely stakeholder engagement, design support, design review, construction support, operation and management support, and training. Their study also shows that AR has a long way to go to achieve its full potential in AEC because of the limitations of presently available AR devices, including comfort and safety, high-accuracy tracking, improved indoor localization, dynamic 3D mapping of changing environments and longer battery life. They also identified challenges in workflow and data management, such as developing data exchange standards, archiving AR content and integration with other built environments, as well as challenges in new capabilities such as real-time virtual model modifications, real-time integration with the Internet of Things (IoT) [72], object and gesture recognition, real-time physics simulations, and predictive and prescriptive analytics. Noghabaei, et al. [69] investigated the trends in adopting AR/VR technologies in the AEC industry by conducting two user surveys in two consecutive years (2017, 2018) involving 158 industry experts. The results show potential for solid growth of AR/VR technologies in the AEC industry in the next 5 to 10 years and a significant rise in the use of these technologies from 2017 to 2018. The survey also highlights some limitations to adopting AR/VR in the AEC industry, such as “lack of budget,” “upper management's lack of understanding of these technologies,” and “design teams' lack of knowledge.”
2.3 AR Application
There are several aspects to consider before developing an AR application, as different domains have different AR requirements. For example, the requirements of an AR game may differ from those of an AR medical application. In the context of visualizing subsurface utilities, a number of aspects need to be explored when developing an application, including the accuracy of the data used to generate the virtual model to be displayed, the Software Development Kit (SDK) and game engine used for development, the AR devices to be used, the positioning and anchoring methods of the virtual model, the localization methods of the AR device, and the nature of the environment where the application is used, such as indoor or outdoor.
dimensions. QL-C uses available information from QL-D together with ground features such as valves, hydrants and pit lids. QL-B delivers information about underground utilities with an accuracy of ±300 mm horizontal and ±500 mm vertical. Essentially, it provides relative location and depth from another feature; for instance, a water pipe may be 1 m below the ground surface and 2.4 m from the edge of a footpath. QL-A information is the only validated data, with ±50 mm vertical and horizontal accuracy and additional information about the utility such as owner type, status, material, size, installation date, surveying methods used to capture it, etc. [15]. If the AR visualization is to be used for excavation purposes, QL-A is the recommended data for generating the virtual model to be displayed. The challenge arises when accurate 3D location information of the subsurface utilities is not available to visualize in AR. On such occasions, geo-referenced information captured by utility locators needs to be integrated with the AR system to accurately locate subsurface utilities.
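As a minimal illustration of this data-selection rule, the sketch below encodes the quality-level figures quoted above and keeps only validated QL-A features for an excavation-grade AR model; the record fields and function names are illustrative assumptions rather than part of any cited system.

```python
# Hypothetical quality-level table based on the QL-A..QL-D figures above;
# field names are illustrative assumptions.
QL_SPECS = {
    "QL-A": {"h_acc_mm": 50, "v_acc_mm": 50, "validated": True},
    "QL-B": {"h_acc_mm": 300, "v_acc_mm": 500, "validated": False},
    "QL-C": {"h_acc_mm": None, "v_acc_mm": None, "validated": False},  # inferred from surface features
    "QL-D": {"h_acc_mm": None, "v_acc_mm": None, "validated": False},  # schematic presence only
}

def excavation_grade(features):
    """Keep only validated QL-A features for an AR model used for excavation."""
    return [f for f in features if QL_SPECS[f["quality_level"]]["validated"]]
```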
Game engines, or software frameworks, are the main software used to build 2D and 3D games as well as AR applications. They provide most of the libraries and functions needed to develop an AR application. Unity3D is one of the most popular game engines in the AR industry for a variety of reasons. First, it is free for small development companies and students; consequently, there is an active community using Unity3D to build software, and plenty of support and training materials are available on the internet. It is a cross-platform game engine that allows developers to build software for devices on different platforms such as Windows, iOS and Android, and most SDKs are developed to be used within Unity3D. Unreal Engine is another freely available game engine. It offers a similar framework for developing 3D games and AR and VR applications for a wide range of devices with different OS platforms [73-75].
2.3.3 AR devices
AR devices can be classified into three main groups, namely video see-through (VST), optical see-through (OST) and projective-based [73]. The video see-through technique uses Hand Held Displays (HHD) such as smartphones, tablets and video see-through HMDs to capture the real environment with the device's camera, and the virtual objects are superimposed on the video [76]. Mobile phones and tablets are widely used AR devices due to their public availability and suitability for lightweight AR applications. The optical see-through technique uses optical see-through HMDs or smart glasses, projecting the virtual content on a planar screen while the user sees the real environment through the optical lenses to create the AR scene. Smart glasses can only render 2D virtual content such as photos, videos and webpages; the Epson Moverio BT-300, Solos and Vue are a few examples of 2D smart glasses on the market. Optical see-through HMDs such as the Microsoft HoloLens and Magic Leap are capable of processing 3D virtual content, allowing a seamless transition between the real environment and the virtual model with advanced computational ability [76]. For example, the Microsoft HoloLens Gen 2 contains multiple environmental-understanding sensors, such as a depth camera for simultaneous localization and mapping, an Inertial Measurement Unit (IMU), and hand, head, and eye tracking sensors, together with a Holographic Processing Unit (HPU). Smart glasses are relatively cheap, lightweight and comfortable, while see-through HMDs are expensive and heavier than smart glasses. AR HMDs are much more capable of performing computationally heavy tasks for more advanced AR experiences [77]. In addition, they allow the user to move freely and use their hands for other tasks rather than holding a mobile phone or tablet. The projective-based or spatial AR technique requires one or more projectors to project the virtual content onto the real environment [78]. The spatial AR technique is limited to static applications but is advantageous in terms of resolution, focus and field of view compared with video see-through or optical see-through AR [79].
Moreover, the depth perception of the virtual model can vary between indoors and outdoors, according to the experiment conducted in [106]. In this study, an AR model with long lines and a differently coloured reference model at a known distance was projected using mobile AR to test the users' depth perception both indoors and outdoors. The research found that distances were overestimated outdoors. The indoor environment was a narrow walkway with doors on both sides, and the outdoor environment was an open car park. Therefore, an AR visualization method does not necessarily produce the same results in different environments. This indicates that visualization strategies should be chosen according to the nature of the environment where the AR application is used.
Many companies have launched software to achieve quality visualization of subsurface utilities using AR. VGIS is a Canadian company that develops AR applications to visualize spatial data and BIM using a variety of handheld devices as well as head-mounted displays. They have also developed different visualization methods for subsurface utilities so that utilities can be easily identified using different colour codes [1]. Leica Geosystems also uses software from VGIS with their GNSS antenna to achieve higher positional accuracy. Trimble SiteVision, launched by Trimble, provides a similar service to the VGIS products. Trimble SiteVision offers a subsurface utility visualization called Pit View, where a virtual pit model is formed to visualize underground utilities [2]. Augview is another example of a company that develops AR GIS products for mobile phones and tablets. Most of these products allow real-time interaction between the visualized 3D models and the real environment, such as measuring distance and depth, as well as letting the user make changes to the 3D models.
be inconvenient to carry and use. App-based AR solutions require downloading and installation prior to use and are inconvenient for cross-platform deployment; for instance, an experience built for a particular AR app may not be usable in another AR app. Web AR provides low-cost, cross-platform AR solutions using a web browser without needing any prior installation.
Virtual models for visualizing subsurface utilities in device-based and app-based AR solutions come in many standards depending on the requirements of the AR applications installed on the AR device. The most popular standards are in the Computer Aided Design (CAD), BIM and Geographical Information System (GIS) domains. A number of file formats are used in the CAD domain, such as Drawing (DWG), Drawing Exchange Format (DXF), Object (OBJ) and Collaborative Design Activity (COLLADA); Industry Foundation Classes (IFC) and Green Building Extensible Markup Language (gbXML) in the BIM domain; and Geography Markup Language (GML), City Geography Markup Language (CityGML) and Keyhole Markup Language (KML) in GIS. Each of these file formats represents a varying degree of information about 3D models in terms of geometry, semantics, Level of Detail (LoD), texture, etc., and conversion between them can cause loss of information [111]. In the absence of 3D models of subsurface utilities, paper plans can be used as an alternative source of information to manually generate 3D models in different file formats using various CAD and GIS software.
Web AR mostly targets handheld devices such as mobile phones and tablets and has its own limitations compared with device-based and app-based AR, such as limited computation and visualization capacity, lack of Web AR standardization between different web browsers and the limited hardware capabilities of mobile phones [112,113]. Web AR applications can be classified into two categories: location-based and non-location/vision-based. Location-based AR is usually designed for large-scale outdoor applications, while non-location/vision-based AR is usually for small-scale indoor applications such as tabletop applications. Location-based AR uses GNSS to track the location of the AR model based on Points of Interest (POI), which can be represented in a standardized manner [114]. A POI is a marker that contains its location information in a geographic coordinate system (latitude, longitude). There are popular standards for location-based AR such as Augmented Reality Markup Language (ARML) 1.0, KARML and Layar JSON, which specifically target mobile applications [114]. ARML and KARML are based on KML, which enables direct use of POIs from existing GIS systems. The latest standard for location-based AR is ARML 2.0, which was introduced in 2015 by the Open Geospatial Consortium (OGC), one of the Standards Development Organizations (SDOs) [110]. ARML 2.0 is completely different from its previous version and supports both location-based and non-location/vision-based AR applications based on different AR anchors, such as Geospatial (latitude, longitude), Trackable (QR codes, images, objects) and RelativeTo (an anchor relative to other objects) [115].
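To make the geospatial-anchor idea concrete, the following sketch converts a POI's latitude/longitude into local east/north offsets (in metres) from the AR device's reference position, using a simple equirectangular approximation that is adequate over the short ranges of location-based AR; the function and parameter names are our own illustrative choices, not part of the ARML standard.

```python
import math

WGS84_RADIUS_M = 6378137.0  # equatorial radius used for the approximation

def poi_to_local_enu(poi_lat, poi_lon, ref_lat, ref_lon):
    """Approximate east/north offsets (metres) of a geospatial POI anchor
    from a reference point (e.g. the AR device's GNSS fix). A local
    equirectangular projection is sufficient for nearby anchors."""
    d_lat = math.radians(poi_lat - ref_lat)
    d_lon = math.radians(poi_lon - ref_lon)
    east = WGS84_RADIUS_M * d_lon * math.cos(math.radians(ref_lat))
    north = WGS84_RADIUS_M * d_lat
    return east, north

# Example: an anchor roughly 100 m north and 100 m east of the device.
print(poi_to_local_enu(-37.7995, 144.9625, -37.8004, 144.9614))
```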
standardization from a user experience (UX) perspective and introduced a theoretical framework called UX for AR (Ux4AR). Ritsos, et al. [117] reviewed multiple questionnaire-based studies of factors that can affect user experience to identify the important elements to be mindful of when setting standards for AR applications, such as use case, health and safety, integrity, privacy and security, as well as context awareness. When developing an AR application to visualize subsurface utilities, context awareness and health and safety may be the most critical elements to consider. Context awareness means being conscious of the nature of the user's location and of how virtual content is used and placed, both spatially and temporally. Health and safety are a matter of the utmost importance. For instance, using AR HMDs for extended periods can cause fatigue, visually induced motion sickness, side effects of poor ergonomics, etc. [118-121]. Similarly, a user walking while inattentive to the environment, holding a mobile phone or tablet running an AR application, could meet with a serious accident.
[36,126]. AR on handheld devices is commonly implemented using a single camera and a flat screen (monocular), which removes binocular disparity, a cue critical for depth perception [123,128]. Recent developments in scene-understanding HMDs offer better depth perception by displaying stereoscopic images to both eyes that change in accordance with the user's head motion [129]. Therefore, HMDs work well in terms of binocular disparity for above-surface AR scenes with minimal additional visual cues for virtual models. However, it is very challenging to achieve good depth perception when visualizing subsurface objects, as the virtual content is always rendered on top of the real environment regardless of its depth values. For example, when an underground pipe section is visualized in AR, the device will always draw the pipe above the ground; the resulting visualization shows the underground pipe apparently floating on top of the ground. Figure 3 illustrates how virtual models are visualized in an AR scene using an HMD when the occlusion function is disabled, as occlusion would hide the subsurface objects once the ground is mapped. As shown in Figure 3, the augmented pit and the underground pipe appear in front of the bush (a real-world object), while both virtual models appear to float on the ground surface. Therefore, adding further depth cues to virtual subsurface objects is crucial for achieving better depth perception.
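The rendering-order problem described above can be summarized in a few lines of per-pixel compositing. The sketch below is our own illustration (with assumed array shapes) of why naive AR always draws the pipe over the ground, and why a per-pixel depth test would instead hide the subsurface model entirely, which is exactly why occlusion must be disabled for subsurface visualization.

```python
import numpy as np

def composite(camera, virtual, mask, virtual_depth=None, scene_depth=None):
    """Per-pixel AR compositing. `camera` and `virtual` are HxWx3 images and
    `mask` is an HxW boolean array marking pixels covered by the virtual
    model. Without depth maps, every virtual pixel is drawn on top, so a
    buried pipe floats above the ground; with per-pixel depth testing,
    virtual pixels deeper than the real surface are suppressed."""
    out = camera.copy()
    if virtual_depth is None or scene_depth is None:
        out[mask] = virtual[mask]                       # naive: always on top
    else:
        visible = mask & (virtual_depth < scene_depth)  # depth-tested occlusion
        out[visible] = virtual[visible]                 # buried pipe vanishes
    return out
```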
Figure 4. Parallax effect of virtual pipe with the user movement in AR
4. AR Visualization Techniques for Subsurface Utilities
The application of AR for visualizing subsurface utilities has led to many studies over the past two decades, yet significant improvements must still be made before AR can practically be used in subsurface utility works. The techniques introduced so far to enhance AR visualization of subsurface services can be classified into six main categories, even though they are referred to by slightly different names in past studies. These are X-Ray view, transparent view, shadow view, Topo view, image rendering, and cross-section view. In the following, we discuss the principles of each of these AR visualization techniques.
4.1 X-Ray View
Behzadan, et al. [32] analysed an excavation-tool visualization of underground pipes in their research on utility collision avoidance, aiming to locate buried pipes without damaging them during excavation. They integrated a Real-Time Kinematic Global Positioning System (RTK-GPS) with the available geospatial data to generate a 3D model and augmented it in real time on a display within the excavator's cabin. The study further improved the pose of the projected virtual model at each frame of the video by placing an additional GPS antenna on the cabin to compute the directional vector of the excavator, applying a trigonometric 3D registration for a composite AR view. The research also reviewed labelling, different colour coding of services and the X-Ray vision/excavation tool. A similar attempt was made in [135] to avoid collisions during the excavation of an underground pipe by augmenting live excavation work using a sandbox and a toy excavator; a Kinect device was used to capture the dynamic terrain topography during the excavation. Côté, et al. [134] used the X-Ray view in their experiment on increasing the accuracy of AR utility model visualization using a robotic total station and a tablet. Feiner, et al. [139] discussed the X-Ray view but referred to it as the “cutaway effect”; they illustrated how a hidden battery inside a radio can be visualized using the X-Ray view (Figure 7). The X-Ray view method can deliver additional depth cues by replacing a defined area of the first-depth-order object with the occluded object in the AR visualization.
Figure 7. X-Ray view of a hidden battery inside a radio. The battery is rendered with a cutaway effect [139].
Another form of X-Ray view is called reality capture, where 3D models of actual utility excavation holes are captured as point clouds using various scanning methods so that they can be visualized in AR later when needed, as shown in Figure 8, where the pre-captured point cloud of a utility pit is rendered on top of the footpath [137]. A similar representation of a pre-captured point cloud embedded in the real world in AR is discussed in [140] for achieving a realistic view of the past. Since X-Ray view offers good-quality depth perception, it has been discussed as a subsurface utility visualization method in various strands of research on AR for subsurface utilities [35,37,133,136].
4.2 Transparent View
Transparent view renders the virtual object as transparent and overlays it on the real object. This method allows both the virtual and real objects to preserve only some of their information where they meet, and the amount of transparency is set by the user. Feiner, et al. [139] explained transparent view with a hidden battery inside a radio by making the battery and a defined area of the radio transparent (Figure 9).
Figure 9. Transparent view of a hidden battery inside a radio. The battery and a defined area of the radio are rendered as transparent [139].
Zollmann, et al. [39] discussed another version of transparent view named alpha-blending, which uses half-transparent objects preserving equal content from the virtual and physical environments (Figure 10 (a)). Another form of transparency is called distance alpha-blending (Figure 10 (b)), in which the amount of transparency is proportional to the distance of the virtual object from the user's point of view. Thus, far virtual objects are more transparent than closer ones, minimizing the complexity of AR scenes with many pipelines [35].
Figure 10. Different approaches to transparent view: (a) alpha-blending [39]; (b) distance alpha-blending [35].
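A minimal sketch of the distance alpha-blending rule described above, assuming a linear falloff between illustrative near/far limits (the specific parameter values are our own assumptions, not taken from [35]):

```python
def distance_alpha(dist_m, near_m=2.0, far_m=30.0, max_alpha=0.85, min_alpha=0.15):
    """Opacity of a virtual pipe as a function of its distance from the
    viewer: nearer objects are rendered more opaque, farther ones more
    transparent, reducing clutter in scenes with many pipelines."""
    t = min(max((dist_m - near_m) / (far_m - near_m), 0.0), 1.0)
    return max_alpha - t * (max_alpha - min_alpha)
```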
4.3 Topo View
In Topo view, the depth of the subsurface utility is not considered, and the utility is visualized at its surface-level position. Consequently, the visualization does not suffer from the parallax effect. Stylianidis, et al. [34] referred to Topo view as a street-level mode, where all the pipes are projected at street level regardless of their varying depths, as illustrated in Figure 11. Topo view represents subsurface assets in a very simple manner, which can be problematic when several subsurface assets are located at the same horizontal position but at different depths.
Cote, et al. [141] also proposed the use of Topo view in their research on visualizing underground utilities on the road surface, aligning the model with the real world using a pre-captured 3D mesh of the real environment. They referred to Topo view as 2D pipe maps. As shown in Figure 12, the pipes and pits are projected onto the ground level.
Hansen, et al. [140] presented a new approach that replaces physical spray marking with virtual spray marking, which is also a form of Topo view. As shown in Figure 13, the virtual utility representation replicates the actual ground marking. They used an iPad with a built-in LiDAR sensor together with a customized sensor box containing RTK-GPS, a smart IMU and a pressure sensor. Since their approach also performs real-time reconstruction of the real world, the virtual paint remains correctly aligned with the ground surface even during excavation and supports dynamic occlusion: the virtual model is not rendered on top of moving objects on the ground and correctly follows the changing ground surface as the excavation proceeds.
Figure 13. (a) Physical marking on the street surface; (b) virtual utility marking of similar information in AR [140].
4.4 Shadow View
Figure 14. Different visualizations of shadow view: (a) virtual shadow cast on the ground plane [3]; (b) shadows with different line styles [34]; (c) shadow (yellow) and pipe (green) in different colors [86].
Becher, et al. [33] evaluated the effectiveness of shadow view using virtual cubes placed at different depths below ground level. This method was referred to as Projection Display (PD) in their study. As shown in Figure 15 (b), the shadows of the cubes are projected onto the ground plane with connecting lines. Grid Display (GD) (Figure 15 (c)) and Projection Grid Display (PGD) (Figure 15 (d)) add a ground-plane grid to the Naïve Display (ND) (Figure 15 (a)) and to PD, respectively. ND is included in the experiment for the sake of comparing the quality of depth perception with the other methods.
Figure 15. (a) Naïve Display (ND), (b) Projection Display (PD), (c) Grid Display (GD), (d) Projection Grid Display (PGD) [33].
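The geometric core of shadow view is a vertical projection of the buried model onto the ground plane, as in the minimal sketch below (our own illustration; connecting lines, as in PD, would simply join each vertex to its projection):

```python
def project_shadow(vertices, ground_z=0.0):
    """Drop each (x, y, z) vertex of a buried 3D model vertically onto the
    ground plane, producing the surface-level 'shadow' used in shadow view.
    The shadow keeps the plan position, so it stays stable as the user moves."""
    return [(x, y, ground_z) for (x, y, _z) in vertices]

# A pipe run 1.2 m deep collapses to its plan position on the surface.
print(project_shadow([(0.0, 0.0, -1.2), (4.0, 0.5, -1.2)]))
```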
4.5 Image Rendering
Figure 16. Image rendering method: a smooth colored mask and the edges of the physical world rendered over the virtual objects [142].
Zollmann, et al. [39] analysed various image rendering methods for producing better AR for underground services, such as edge-based ghosting (Figure 17 (a)) and image-based ghosting (Figure 17 (b)) on video images. Edge-based ghosting is a method in which the edges of real-world features replace the virtual models, while image-based ghosting preserves only the important information in the image, such as bright pixels and edges, for rendering. According to the survey conducted in their studies, the image-based ghosting method outperformed the other methods in terms of providing depth cues.
Figure 17. Image rendering: (a) edge-based ghosting; (b) image-based ghosting [39].
Eren, et al. [37] also investigated an image rendering method called edge overlay to increase depth perception, in which the sharp edges of the physical world replace the virtual and real content. They tested this method by visualizing the virtual pipes beneath a rectangular grated pit (gray) (Figure 18). In this method, the sharp edges extracted from the image of the real world are replaced with bright white pixels in the image, adding a visual cue that helps the user perceive the virtual model as lying beneath the physical environment.
Figure 18. Image rendering: edge overlay. The gray rectangle is the lid of the pit [37].
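A hedged sketch of the edge-overlay idea using OpenCV's Canny detector; the white-pixel overlay follows the description above, but the thresholds are illustrative and this is our own illustration rather than the implementation used in [37]:

```python
import cv2

def edge_overlay(camera_frame_bgr, ar_frame_bgr, low=50, high=150):
    """Extract sharp edges from the camera frame and re-draw them as bright
    white pixels over the composited AR frame, cueing the user that the
    virtual model lies beneath the physical surface. Thresholds are
    illustrative."""
    gray = cv2.cvtColor(camera_frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)    # binary edge map of the real world
    out = ar_frame_bgr.copy()
    out[edges > 0] = (255, 255, 255)      # edge pixels replace rendered content
    return out
```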
Kalkofen, et al. [143] analysed an image rendering method that shows the edges of the real-world object as white pixels to add extra depth cues to the hidden object, using a toy car (Figure 19). They referred to the method as Focus and Context, where the Focus is the important information (the hidden object) and the Context is what is visible to the naked eye, i.e., the real environment.
Figure 19. (a) Careless overlay of the virtual object (red), which appears to float on the real environment; (b) after including the edges of the actual car, the virtual object (internal parts) gains better depth perception [143].
4.6 Cross-section View
Figure 20. Different cross-section views: (a) opaque cross-section plane; (b) ground-plane grid [37].
AR visualization of multiple subsurface utilities at a user-defined location in a single display can create confusion for the user on site. Baek, et al. [144] proposed a solution to this problem by developing a database of the marker locations of the services and a system that automatically generates a cross-section view where needed. The database also enables the user to make corrections to the virtual content. As shown in Figure 21, this provides a simplified and less cluttered way of displaying the location, depth, and materials of buried services.
Figure 21. Visualization of marker locations of underground services, vertical arrangements, and
attributes of pipes [144]
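To make the cross-section idea concrete, the minimal sketch below (our own illustration; the data layout is an assumed convention, not the database schema of [144]) intersects straight pipe segments with a vertical section plane to obtain the offsets and depths drawn on the cross-section panel:

```python
def cross_section(pipes, station_x):
    """Intersect each straight pipe segment, given as (label, (x, y, z) start,
    (x, y, z) end), with the vertical plane x = station_x. Returns the
    (label, offset, depth) triples to draw on the cross-section panel."""
    hits = []
    for label, (x0, y0, z0), (x1, y1, z1) in pipes:
        if min(x0, x1) <= station_x <= max(x0, x1) and x0 != x1:
            t = (station_x - x0) / (x1 - x0)        # linear interpolation factor
            hits.append((label, y0 + t * (y1 - y0), z0 + t * (z1 - z0)))
    return hits
```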
5. Comparison
The visualization techniques discussed above have advantages and drawbacks when applied in practice for visualizing subsurface utilities, and it is important to choose the visualization method according to the requirements of the task. In this section, we compare the AR visualization methods in terms of the quality of depth perception, the amount of occlusion of the real world, the complexity of visualization and the parallax effect. Table 1 summarizes the comparison of the AR visualization techniques against these criteria.
5.1 Quality of depth perception
Depth perception is important for an effective visualization of subsurface utilities, providing a visual cue about the depth of buried utilities below the surface. Incorrect overlay of the virtual content on the real world creates depth perception issues, which lead to unnatural visualization or a wrong perception of the object order [32]. When the quality of depth perception is poor, the augmented subsurface model appears to float on top of the real environment rather than lie under the surface. The AR methods discussed in Section 4 provide depth perception of differing quality. X-Ray view and cross-section view provide better depth perception than the other methods. Baek, et al. [144] compared the X-Ray view with a few other methods and conducted a survey to find the most effective one; their results show that the Excavation Box, a form of X-Ray view, outperformed the other methods. Moreover, the survey conducted in [37] revealed that the Excavation Box (X-Ray view) and cross-section techniques provided a similar quality of depth perception. The reality-capture form of X-Ray view is also a good way of achieving better depth cues, but it requires pre-captured data, which is a major limitation of the method.
Transparent view, shadow view and image rendering methods provide a moderate quality of depth perception. The depth perception achieved by transparent view is less effective than that of the X-Ray view; this can be examined by comparing the visualizations in Figure 7 and Figure 9, where the cutaway method (a form of X-Ray view) in Figure 7 provides better depth cues than the transparent view in Figure 9. Similarly, comparing the visualizations in Figure 6 (a) and Figure 14 (a) reveals that shadow view is less effective at providing depth perception than the X-Ray view. The survey conducted in [3] with expert end users from utility companies shows that a virtual trench along the subsurface utility pipes gives better depth cues than a shadow view of the pipe on the ground. The experiment performed in [33] compares shadow view (Projection Display) with the Naïve Display (ND), Grid Display (GD) and Projection Grid Display (PGD) (Figure 15). Although the result of the experiment reveals that PGD, a shadow view combined with a ground-plane grid, outperformed the other methods, it is difficult to compare PGD with the other visualization methods reviewed in this paper. Various image rendering methods used in past studies, such as edge-based, image-based, and predefined-mask approaches, do assist in accomplishing better depth cues [39,142]. From the results of the user survey conducted in [39], the image-based ghosting method (Figure 17 (b)) outperformed the edge-based ghosting method (Figure 17 (a)). The survey conducted in [37] found that the edge overlay technique (Figure 18) is less effective than the other methods in terms of depth perception. Comparing Figure 6 and Figure 17 also shows that X-Ray view provides a higher quality of depth perception than image rendering.
Topo view simply ignores the depth data and, therefore, does not provide any depth perception of buried utilities. Topo view may not be suitable for users who require a 3D visualization of the subsurface utilities, which is necessary when exposing utilities and in the preliminary planning stage of construction. Even if the depth information could be retrieved at user-selected locations, this might not be the most user-friendly approach, as the user would have to select multiple locations whenever necessary.
5.2 Occlusion of the real world
Topo view and shadow view partly occlude the real environment, but they are more effective than X-Ray view as they occlude a smaller area. Cross-section view also produces a moderate level of occlusion, but excessive use of cross sections can occlude more of the real environment (Figure 20). Comparing image rendering methods with the other AR visualization methods in terms of occlusion of the real world is not straightforward, as the result highly depends on the algorithm used to determine the amount of occlusion of the real environment.
5.3 Complexity of visualization
Transparent view can create a moderate level of complexity depending on the transparency level used in the visualization, as it preserves information from both the virtual and real worlds (Figure 9). Similarly, cross-section view can add complexity to the scene depending on the number of cross-section planes used for the visualization (Figure 20 (a)). Image rendering methods cannot easily be compared in terms of complexity, as the complexity depends on the pixel-processing algorithms used: when the algorithm tries to preserve more information from the virtual and real worlds, the result can be complex, but simple image rendering is also possible. As shown in Figure 17, the image-based ghosting method occludes the entire section of the virtual model with bright pixels from the image, whereas the edge-based ghosting method occludes the virtual model only where the sharp edges of the real world are. Shadow view could be considered the most complex of the methods, as it visualizes two lines for each utility and makes the scene crowded (Figure 14 (b)).
5.4 Parallax effect
The parallax effect makes it difficult for the user to locate 3D subsurface utilities relative to a horizontal surface when the user moves (Figure 4). To achieve an effective visualization, it is very important to mitigate the effect of parallax in AR visualization of subsurface utilities. Parallax persists whenever a 3D virtual subsurface object with depth is visualized. The parallax effect is controlled to a certain extent in X-Ray view even though the utility model is visualized in 3D: when the utility model is visualized inside a virtual trench or virtual pit, the shift of the virtual utility model relative to the real world is negligible, as the model is shown at its true depth. The remainder of the 3D virtual model, however, still suffers from parallax in X-Ray view. The only method that is unaffected by parallax is the Topo view, as it does not contain depth information. As shown in Figure 11, the pipes are visualized at ground level, and when the user moves with the AR device, the shift of the subsurface utility relative to the ground is zero. Shadow view is the second-best AR visualization method in terms of parallax: since the shadow of the subsurface utility is similar to Topo view, the parallax effect is handled to some extent, as the shadow remains stable when the user moves, but the subsurface utility itself exhibits parallax, unlike its shadow (Figure 14 (b)). The cross-section method also suffers from parallax, except for the top of the cross-section plane and the grid in Figure 20 (b). The other AR visualization methods suffer from parallax effects, as 3D objects with depth shift relative to the ground surface.
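The magnitude of the effect can be quantified with simple ray geometry. In the sketch below (our own derivation under a flat-ground assumption), an eye at height h viewing a point buried at depth d from horizontal offset x sees its line of sight pierce the surface at x·h/(h+d), so the apparent shift from the true plan position is x·d/(h+d), growing with both offset and depth:

```python
def apparent_surface_shift(offset_m, depth_m, eye_height_m=1.6):
    """Horizontal gap (metres) between a buried point's true plan position
    and where its line of sight crosses the ground plane. From similar
    triangles: the sight line from (0, h) to (x, -d) meets the surface at
    x*h/(h+d), leaving a shift of x*d/(h+d)."""
    return offset_m * depth_m / (eye_height_m + depth_m)

# A pipe 1.5 m deep viewed from 5 m away appears ~2.4 m off its plan position,
# which is why Topo view (zero depth) shows no parallax at all.
print(round(apparent_surface_shift(5.0, 1.5), 2))  # -> 2.42
```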
6. Discussion
To achieve an effective AR visualization of subsurface utilities, indoors or outdoors, a suitable visualization method is crucial: one that allows the user to achieve correct depth perception with low occlusion of the real world, low complexity and a minimized parallax effect. None of the AR visualization methods discussed above meets all these requirements when implemented alone. Figure 22 and Figure 23 show two examples of visualization methods used by commercial products from VGIS and Trimble. As depicted in the figures, the drawbacks discussed above apply to these two visualization methods as well. Figure 22 illustrates a form of shadow view where the virtual underground pipe models (solid blue, red and green lines) appear to float on top of the road, providing poor depth perception. Figure 23 shows an X-Ray view with a yellow opaque pit, within which the underground pipes and pits (blue/red horizontal and vertical models, respectively) are rendered. Almost half of the real-environment information is occluded in the mobile display from the current point of view, which prevents the user from identifying potential hazards in the real environment.
Figure 22. VGIS shadow view, with permission [1].
Figure 23. Trimble SiteVision Pit View (X-Ray view), with permission [2].
Another issue in visualizing 3D subsurface utilities that requires further investigation is the parallax effect caused by the user's movement with the AR device. Liu, et al. [86] tested an approach with laser-based target designation that extrapolates the location of a hidden pipe (behind a wall) onto the surface of the wall to counteract the parallax issue. This method is akin to projecting the shadow of the subsurface pipes onto the surface to provide better depth perception. User studies conducted in that research found that, even though the user is provided with visual cues for understanding the depth, the spatial assessment of the target position becomes untrustworthy as the offset between the user and the virtual object increases. In such a situation the parallax effect cannot be eliminated, and as the distance between the virtual model and the user increases, the parallax effect becomes more significant.
An effective AR visualization is achieved when both the real world and the virtual models are accurately registered with each other in real time; otherwise, the projected virtual model looks out of its true position or appears to float on the real scene, creating an unnatural experience, as shown in Figure 7 and Figure 5, respectively. Accurate registration of virtual content to the real world requires accurate localization of the AR device as well as a proper understanding of the real world by the AR application. This problem could be resolved by seamless registration of the virtual model with an image of the real world in real time. This approach is called model-based tracking [93,140] and can potentially minimize the misalignment between the virtual and real worlds. However, model-based tracking is computationally complex, and its feasibility depends on the processing power of the AR device [101]. Moreover, it requires a pre-captured 3D model of the real environment, which is a major limitation [141].
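As one concrete and commonly used building block for such image-based registration, the camera pose relative to a known 3D model can be estimated from 2D-3D correspondences with a perspective-n-point solver. The sketch below uses OpenCV's solvePnP and assumes the correspondences and camera intrinsics are already available, which is itself a hard part of the problem; it is an illustrative sketch, not the method of [93,140].

```python
import numpy as np
import cv2

def register_camera(model_pts_3d, image_pts_2d, camera_matrix):
    """Estimate camera pose (rotation, translation) from known 3D model
    points and their detected 2D projections in the video frame. With this
    pose, the virtual utility model can be re-rendered in alignment with
    the frame. Requires at least four point correspondences."""
    dist_coeffs = np.zeros(5)  # assume an undistorted (calibrated) image
    ok, rvec, tvec = cv2.solvePnP(
        model_pts_3d.astype(np.float32),
        image_pts_2d.astype(np.float32),
        camera_matrix.astype(np.float32),
        dist_coeffs,
    )
    return (rvec, tvec) if ok else None
```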
Another major limitation of AR visualization of subsurface utilities is the low accuracy of the virtual model. Currently, the majority of available outdoor utility datasets are still based on paper plans and do not comprise accurate horizontal/vertical locations for every situation, which is a significant challenge in outdoor AR visualization of subsurface utilities [15]. In some instances, utilities buried a long time ago may not have any information about their location or depth. On such occasions, the first step is capturing the underground utilities using locating techniques; once the utilities are located, the data can be used to create virtual models for AR visualization. Alternatively, geo-referenced digital data captured by utility locating devices can also be used in AR visualization [145]. Ayala Cabrera, et al. [146] developed a method to fuse AR with underground utility information captured by locating devices such as GPR for Water Supply Systems (WSS). Their system enables elements of the WSS to be explored and updated, which supports dynamic management of the WSS.
Further research is needed to develop more effective visualization techniques that overcome the drawbacks of the current methods and achieve better depth perception when a 3D virtual model with depth is visualized. Existing AR methods focus more on improving the quality of the virtual model than on considering the real world. When AR has spatial awareness, it is possible to achieve a better visualization by allowing the virtual content to interact with the real world [58]. Therefore, taking the topology of the real world into account can potentially improve the AR visualization of subsurface utilities [135,140]. Moreover, a combination of different visualization methods might accomplish an effective AR visualization depending on the site conditions. Future directions for developing more effective AR visualization include:
1. With recent developments in AR devices, interaction with the real environment has become more practical. This brings a new source of information, in addition to the virtual content, for providing better visualization and perception to users in an AR scene, and needs to be explored further.
2. Developing better depth perception cues, such as contour lines and meshes representing the depth of excavation and the topography of the ground surface, respectively.
3. Machine Learning (ML) methods could be developed to automatically learn the site conditions and propose a suitable visualization method to the user, or to automatically apply a suitable visualization method in real time.
4. To tackle the localization problems in indoor or GNSS-deprived areas, further research is needed on performing real-time registration of the virtual model with the real world.
5. New approaches are needed to overcome the parallax effect and avoid confusion in the AR visualization when multiple utilities with depth are visualized in a dynamic situation.
This research did not receive any specific grant from funding agencies in the public, commercial, or not-
for-profit sectors.
References
Technology and Current State of the Art. Curr Rev Musculoskelet Med 2021, 14, 192-203,
doi:10.1007/s12178-021-09699-3.
18. Kim, J.J.; Wang, Y.; Wang, H.; Lee, S.; Yokota, T.; Someya, T. Skin Electronics: Next‐Generation
Device Platform for Virtual and Augmented Reality. Advanced Functional Materials 2021,
doi:10.1002/adfm.202009602.
19. Osadchyi, V.; Valko, N.; Kuzmich, L. Using augmented reality technologies for STEM education
organization. In Proceedings of the Journal of Physics: Conference Series, 2021; p. 012027.
20. Bec, A.; Moyle, B.; Schaffer, V.; Timms, K. Virtual reality and mixed reality for second chance
tourism. Tourism Management 2021, 83, doi:10.1016/j.tourman.2020.104256.
21. Gabbard, J.L.; Fitch, G.M.; Kim, H. Behind the Glass: Driver Challenges and Opportunities for AR
Automotive Applications. Proceedings of the IEEE 2014, 102, 124-136,
doi:10.1109/jproc.2013.2294642.
22. Regenbrecht, H.; Baratoff, G.; Wilke, W. Augmented reality projects in the automotive and
aerospace industries. IEEE computer graphics and applications 2005, 25, 48-56.
23. Halim, A.A. Applications of augmented reality for inspection and maintenance process in
automotive industry. Journal of Fundamental and Applied Sciences 2018, 10, 412-421.
24. Fan, Q.; Li, D. Augmented Reality Technology in Handball Teaching Based on Wireless
Communication Environment. 2021.
25. Önal, N.T.; Önal, N. The effect of augmented reality on the astronomy achievement and interest
level of gifted students. Education and Information Technologies 2021, doi:10.1007/s10639-021-10474-
7.
26. Furht, B. Handbook of Augmented Reality; 2011.
27. Livingston, M.A.; Ai, Z.; Karsch, K.; Gibson, G.O. User interface design for military AR
applications. Virtual Reality 2010, 15, 175-184, doi:10.1007/s10055-010-0179-1.
28. Matysczok, C.; Radkowski, R.; Berssenbruegge, J. AR-bowling: immersive and realistic game play
in real environments using augmented reality. In Proceedings of the Proceedings of the 2004 ACM
SIGCHI International Conference on Advances in computer entertainment technology, 2004; pp.
269-276.
29. Von Itzstein, G.S.; Billinghurst, M.; Smith, R.T.; Thomas, B.H. Augmented Reality Entertainment:
Taking Gaming Out of the Box. In Encyclopedia of Computer Graphics and Games; 2017; pp. 1-9.
30. Li, X.; Yi, W.; Chi, H.-L.; Wang, X.; Chan, A.P.C. A critical review of virtual and augmented reality
(VR/AR) applications in construction safety. Automation in Construction 2018, 86, 150-162,
doi:10.1016/j.autcon.2017.11.003.
31. Shin, D.H.; Dunston, P.S. Identification of application areas for Augmented Reality in industrial
construction based on technology suitability. Automation in Construction 2008, 17, 882-894,
doi:doi.org/10.1016/j.autcon.2008.02.012.
32. Behzadan, A.H.; Dong, S.; Kamat, V.R. Augmented reality visualization: A review of civil
infrastructure system applications. Advanced Engineering Informatics 2015, 29, 252-267,
doi:10.1016/j.aei.2015.03.005.
33. Becher, C.; Bottecchia, S.; Desbarats, P. Projection Grid Cues: An Efficient Way to Perceive the
Depths of Underground Objects in Augmented Reality. Cham, 2021; pp. 611-630.
34. Stylianidis, E.; Valari, E.; Pagani, A.; Carrillo, I.; Kounoudes, A.; Michail, K.; Smagas, K. Augmented
Reality Geovisualisation for Underground Utilities. PFG – Journal of Photogrammetry, Remote Sensing
and Geoinformation Science 2020, 88, 173-185, doi:10.1007/s41064-020-00108-x.
35. Ortega, S.; Wendel, J.; Santana, J.; Murshed, S.; Boates, I.; Trujillo, A.; Nichersu, A.; Suárez, J.
Making the Invisible Visible—Strategies for Visualizing Underground Infrastructures in
Immersive Environments. ISPRS International Journal of Geo-Information 2019, 8,
doi:10.3390/ijgi8030152.
36. Heinrich, F.; Bornemann, K.; Lawonn, K.; Hansen, C. Depth Perception in Projective Augmented
Reality: An Evaluation of Advanced Visualization Techniques. In Proceedings of the 25th ACM
Symposium on Virtual Reality Software and Technology, 2019; pp. 1-11.
37. Eren, M.T.; Balcisoy, S. Evaluation of X-ray visualization techniques for vertical depth judgments
in underground exploration. The Visual Computer 2017, 34, 405-416, doi:10.1007/s00371-016-1346-5.
38. Zhang, X.; Han, Y.; Hao, D.; Lv, Z. ARGIS-based outdoor underground pipeline information
system. Journal of Visual Communication and Image Representation 2016, 40, 779-790,
doi:https://doi.org/10.1016/j.jvcir.2016.07.011.
39. Zollmann, S.; Grasset, R.; Reitmayr, G.; Langlotz, T. Image-based X-ray visualization techniques
for spatial understanding in outdoor augmented reality. In Proceedings of the Proceedings of the
26th Australian Computer-Human Interaction Conference on Designing Futures: the Future of
Design, 2014; pp. 194-203.
40. Zollmann, S.; Schall, G.; Junghanns, S.; Reitmayr, G. Comprehensible and Interactive
Visualizations of GIS Data in Augmented Reality. Berlin, Heidelberg, 2012; pp. 675-685.
41. Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent advances in
augmented reality. IEEE Computer Graphics and Applications 2001, 21, 34-47, doi:10.1109/38.963459.
42. Bouchlaghem, D.; Shang, H.; Whyte, J.; Ganah, A. Visualisation in architecture, engineering and
construction (AEC). Automation in Construction 2005, 14, 287-295, doi:10.1016/j.autcon.2004.08.012.
43. Retik, A.; Shapira, A. VR-based planning of construction site activities. Automation in Construction
1999, 8, 671-680.
44. Sampaio, A.Z.; Ferreira, M.M.; Rosário, D.P.; Martins, O.P. 3D and VR models in Civil Engineering
education: Construction, rehabilitation and maintenance. Automation in Construction 2010, 19, 819-
828.
45. Günther, T.; Franke, I.S.; Groh, R. Aughanded virtuality-the hands in the virtual environment. In
Proceedings of the 2015 IEEE Symposium on 3D User Interfaces (3DUI), 2015; pp. 157-158.
46. Nahon, D.; Subileau, G.; Capel, B. “Never Blind VR” enhancing the virtual reality headset
experience with augmented virtuality. In Proceedings of the 2015 IEEE Virtual Reality (VR), 2015;
pp. 347-348.
47. Neges, M.; Adwernat, S.; Abramovici, M. Augmented virtuality for maintenance training
simulation under various stress conditions. Procedia Manufacturing 2018, 19, 171-178.
48. Ternier, S.; Klemke, R.; Kalz, M.; Van Ulzen, P.; Specht, M. ARLearn: Augmented Reality Meets
Augmented Virtuality. Journal of Universal Computer Science 2012, 18, 2143-2164.
49. Yu, D.; Jin, J.S.; Luo, S.; Lai, W.; Huang, Q. A Useful Visualization Technique: A Literature Review
for Augmented Reality and its Application, limitation & future direction. Boston, MA, 2010; pp.
311-337.
50. Berryman, D.R. Augmented reality: a review. Medical Reference Services Quarterly 2012, 31, 212-218,
doi:10.1080/02763869.2012.670604.
51. Feng, Z.; Duh, H.B.; Billinghurst, M. Trends in augmented reality tracking, interaction and display:
A review of ten years of ISMAR. In Proceedings of the 2008 7th IEEE/ACM International
Symposium on Mixed and Augmented Reality, 15-18 Sept. 2008, 2008; pp. 193-202.
52. Farshid, M.; Paschen, J.; Eriksson, T.; Kietzmann, J. Go boldly!: Explore augmented reality (AR), virtual reality (VR), and mixed reality (MR) for business. Business Horizons 2018, 61, 657-663,
doi:10.1016/j.bushor.2018.05.009.
53. Chalhoub, J.; Ayer, S.K. Using Mixed Reality for electrical construction design communication.
Automation in Construction 2018, 86, 1-10, doi:10.1016/j.autcon.2017.10.028.
54. Dai, F.; Olorunfemi, A.; Peng, W.; Cao, D.; Luo, X. Can mixed reality enhance safety communication
on construction sites? An industry perspective. Safety Science 2021, 133,
doi:10.1016/j.ssci.2020.105009.
55. Dunston, P.S.; Wang, X. Mixed reality-based visualization interfaces for architecture, engineering,
and construction industry. Journal of Construction Engineering and Management 2005, 131, 1301-1309.
56. Hakkarainen, M.; Woodward, C.; Rainio, K. Software architecture for mobile mixed reality and 4D
BIM interaction. In Proceedings of the 25th CIB W78 Conference, 2009; pp. 1-8.
57. Riexinger, G.; Kluth, A.; Olbrich, M.; Braun, J.-D.; Bauernhansl, T. Mixed Reality for on-site self-
instruction and self-inspection with Building Information Models. Procedia CIRP 2018, 72, 1124-1129.
58. Rokhsaritalemi, S.; Sadeghi-Niaraki, A.; Choi, S.-M. A Review on Mixed Reality: Current Trends,
Challenges and Prospects. Applied Sciences 2020, 10, doi:10.3390/app10020636.
59. Huang, Y. Evaluating mixed reality technology for architectural design and construction layout.
Journal of Civil Engineering and Construction Technology 2020, 11, 1-12, doi:10.5897/jcect2020.0534.
60. Alizadehsalehi, S.; Hadavi, A.; Huang, J.C. From BIM to extended reality in AEC industry.
Automation in Construction 2020, 116, doi:10.1016/j.autcon.2020.103254.
61. Banfi, F.; Brumana, R.; Stanga, C. Extended reality and informative models for the architectural
heritage: from scan-to-BIM process to virtual and augmented reality. Virtual Archaeology Review
2019, 10, doi:10.4995/var.2019.11923.
62. Doolani, S.; Wessels, C.; Kanal, V.; Sevastopoulos, C.; Jaiswal, A.; Nambiappan, H.; Makedon, F. A
Review of Extended Reality (XR) Technologies for Manufacturing Training. Technologies 2020, 8,
doi:10.3390/technologies8040077.
63. Fast-Berglund, Å.; Gong, L.; Li, D. Testing and validating Extended Reality (xR) technologies in
manufacturing. Procedia Manufacturing 2018, 25, 31-38.
64. Chuah, S.H.-W. Why and who will adopt extended reality technology? Literature review, synthesis, and future research agenda. December 13, 2018.
65. Varjo. XR-3 and VR-3 User Guide. 2021, https://varjo.com/products/xr-3/.
66. Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I, San Francisco, California, 1968; pp.
757–764.
67. Schranz, C.; Urban, H.; Gerger, A. Potentials of Augmented Reality in a BIM based building
submission process. Journal of Information Technology in Construction (ITcon) 2021, 26, 441-457.
68. Davila Delgado, J.M.; Oyedele, L.; Demian, P.; Beach, T. A research agenda for augmented and
virtual reality in architecture, engineering and construction. Advanced Engineering Informatics 2020,
45, 101122, doi:10.1016/j.aei.2020.101122.
69. Noghabaei, M.; Heydarian, A.; Balali, V.; Han, K. Trend Analysis on Adoption of Virtual and
Augmented Reality in the Architecture, Engineering, and Construction Industry. Data 2020, 5, 26,
https://www.mdpi.com/2306-5729/5/1/26.
70. Albahbah, M.; Kıvrak, S.; Arslan, G. Application areas of augmented reality and virtual reality in
construction project management: A scoping review. Journal of Construction Engineering 2021, 4,
151-172.
71. Alizadehsalehi, S.; Yitmen, I. Digital twin-based progress monitoring management model through
reality capture to extended reality technologies (DRX). Smart and Sustainable Built Environment 2021,
ahead-of-print, doi:10.1108/SASBE-01-2021-0016.
72. Dudhee, V.; Vukovic, V. Building information model visualisation in augmented reality. Smart and
Sustainable Built Environment 2021, ahead-of-print, doi:10.1108/SASBE-02-2021-0021.
73. Mladenov, B.; Damiani, L.; Giribone, P.; Revetria, R. A short review of the SDKs and wearable
devices to be used for AR application for industrial working environment. In Proceedings of the World Congress on Engineering and Computer Science, 2018; pp. 23-25.
74. Tongprasom, K.; Boongsood, W.; Boongsood, W.; Pipatchotitham, T. Comparative Study of an
Augmented Reality Software Development Kit Suitable for Forensic Medicine Education.
International Journal of Information and Education Technology 2021, 11.
75. Amin, D.; Govilkar, S. Comparative study of augmented reality SDKs. International Journal on
Computational Science & Applications 2015, 5, 11-26.
76. Hoover, M. An evaluation of the Microsoft HoloLens for a manufacturing-guided assembly task.
Master's thesis, Iowa State University, 2018.
77. Makhataeva, Z.; Varol, H.A. Augmented reality for robotics: a review. Robotics 2020, 9, 21.
78. Schwerdtfeger, B.; Pustka, D.; Hofhauser, A.; Klinker, G. Using laser projectors for augmented
reality. In Proceedings of the 2008 ACM Symposium on Virtual Reality Software and Technology, Bordeaux, France, 2008; pp. 134–137.
79. Olwal, A.; Gustafsson, J.; Lindfors, C. Spatial augmented reality on industrial CNC-machines. In
Proceedings of The Engineering Reality of Virtual Reality 2008, 2008; p. 680409.
80. Cheng, J.; Chen, K.; Chen, W. Comparison of marker-based AR and marker-less AR: a case study
on indoor decoration system. In Proceedings of the Lean and Computing in Construction Congress
(LC3): Proceedings of the Joint Conference on Computing in Construction (JC3), 2017; pp. 483-490.
81. Romli, R.; Razali, A.F.; Ghazali, N.H.; Hanin, N.A.; Ibrahim, S.Z. Mobile Augmented Reality (AR)
Marker-based for Indoor Library Navigation. IOP Conference Series: Materials Science and
Engineering 2020, 767, 012062, doi:10.1088/1757-899x/767/1/012062.
82. Boonbrahm, S.; Boonbrahm, P.; Kaewrat, C. The Use of Marker-Based Augmented Reality in Space
Measurement. Procedia Manufacturing 2020, 42, 337-343,
doi:10.1016/j.promfg.2020.02.081.
83. Hübner, P.; Weinmann, M.; Wursthorn, S. Marker-based localization of the Microsoft HoloLens in
building models. International Archives of the Photogrammetry, Remote Sensing & Spatial Information
Sciences 2018, 42.
84. Abhishek, M.T.; Aswin, P.S.; Akhil, N.C.; Souban, A.; Muhammedali, S.K.; Vial, A. Virtual Lab
Using Markerless Augmented Reality. In Proceedings of the 2018 IEEE International Conference
on Teaching, Assessment, and Learning for Engineering (TALE), 4-7 Dec. 2018, 2018; pp. 1150-1153.
85. Comport, A.I.; Marchand, E.; Chaumette, F. A real-time tracker for markerless augmented reality.
In Proceedings of the Second IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2003), 2003; pp. 36-45.
86. Liu, F.; Seipel, S. Precision study on augmented reality-based visual guidance for facility
management tasks. Automation in Construction 2018, 90, 79-90, doi:10.1016/j.autcon.2018.02.020.
87. Piroozfar, P.; Judd, A.; Boseley, S.; Essa, A.; Farr, E.R.P. Augmented Reality (AR) for Utility
Infrastructure: An Experiential Development Workflow. Cham, 2021; pp. 527-533.
88. Schall, G.; Schmalstieg, D.; Junghanns, S. VIDENTE - 3D Visualization of Underground
Infrastructure using Handheld Augmented Reality. 2010.
89. Glocker, B.; Shotton, J.; Criminisi, A.; Izadi, S. Real-time RGB-D camera relocalization via
randomized ferns for keyframe encoding. IEEE Transactions on Visualization and Computer Graphics
2014, 21, 571-583.
90. Jinyu, L.; Bangbang, Y.; Danpeng, C.; Nan, W.; Guofeng, Z.; Hujun, B. Survey and evaluation of
monocular visual-inertial SLAM algorithms for augmented reality. Virtual Reality & Intelligent
Hardware 2019, 1, 386-410, doi:10.1016/j.vrih.2019.07.002.
91. Choi, H.-B.; Lim, K.-W.; Ko, Y.-B. Improved Virtual Anchor Selection for AR-assisted Sensor
Positioning in Harsh Indoor Conditions. In Proceedings of the 2020 Global Internet of Things
Summit (GIoTS), 2020; pp. 1-6.
92. David, P.; Dementhon, D.; Duraiswami, R.; Samet, H. SoftPOSIT: Simultaneous pose and
correspondence determination. International Journal of Computer Vision 2004, 59, 259-284.
93. Lepetit, V.; Fua, P. Monocular model-based 3D tracking of rigid objects; Now Publishers Inc: 2005.
94. Li, C.; Kang, Z.; Yang, J.; Li, F.; Wang, Y. Research on Semantic-Assisted Slam in Complex Dynamic
Indoor Environment. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial
Information Sciences 2020, XLIII-B4-2020, 353-359, doi:10.5194/isprs-archives-XLIII-B4-2020-353-
2020.
95. Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A Versatile and Accurate Monocular
SLAM System. IEEE Transactions on Robotics 2015, 31, 1147-1163, doi:10.1109/tro.2015.2463671.
96. Ortiz-Fernandez, L.E.; Cabrera-Avila, E.V.; Silva, B.; Goncalves, L.M.G. Smart Artificial Markers
for Accurate Visual Mapping and Localization. Sensors (Basel) 2021, 21, doi:10.3390/s21020625.
97. Piao, J.-C.; Kim, S.-D. Adaptive Monocular Visual–Inertial SLAM for Real-Time Augmented
Reality Applications in Mobile Devices. Sensors 2017, 17, 2567, https://www.mdpi.com/1424-8220/17/11/2567.
98. Ramezani, M.; Acharya, D.; Gu, F.; Khoshelham, K. Indoor Positioning by Visual-Inertial
Odometry. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences 2017, IV-
2/W4, 371-376, doi:10.5194/isprs-annals-IV-2-W4-371-2017.
99. Ramezani, M.; Khoshelham, K.; Fraser, C. Pose estimation by Omnidirectional Visual-Inertial
Odometry. Robotics and Autonomous Systems 2018, 105, 26-37, doi:10.1016/j.robot.2018.03.007.
100. Acharya, D.; Khoshelham, K.; Winter, S. BIM-PoseNet: Indoor camera localisation using a 3D
indoor model and deep learning from synthetic images. ISPRS Journal of Photogrammetry and Remote
Sensing 2019, 150, 245-258.
101. Acharya, D.; Ramezani, M.; Khoshelham, K.; Winter, S. BIM-Tracker: A model-based visual
tracking approach for indoor localisation using a 3D building model. ISPRS Journal of
Photogrammetry and Remote Sensing 2019, 150, 157-171, doi:10.1016/j.isprsjprs.2019.02.014.
102. Acharya, D.; Singha Roy, S.; Khoshelham, K.; Winter, S. A Recurrent Deep Network for Estimating
the Pose of Real Indoor Images from Synthetic Image Sequences. Sensors 2020, 20, 5492.
103. Ramezani, M.; Khoshelham, K. Vehicle Positioning in GNSS-Deprived Urban Areas by Stereo
Visual-Inertial Odometry. IEEE Transactions on Intelligent Vehicles 2018, 3, 208-217,
doi:10.1109/tiv.2018.2804168.
104. Ramezani, M.; Khoshelham, K.; Fraser, C. Omnidirectional visual-inertial odometry using multi-
state constraint Kalman filter. Robotics and Autonomous Systems 2018, 105,
doi:10.1016/j.robot.2018.03.007.
105. Rovira, A.; Fatah gen Schieck, A.; Blume, P.; Julier, S. Guidance and surroundings awareness in
outdoor handheld augmented reality. PLOS ONE 2020, 15, e0230518,
doi:10.1371/journal.pone.0230518.
106. Livingston, M.A.; Ai, Z.; Swan, J.E.; Smallman, H.S. Indoor vs. Outdoor Depth Perception for
Mobile Augmented Reality. In Proceedings of the 2009 IEEE Virtual Reality Conference, 14-18
March 2009, 2009; pp. 55-62.
107. VGIS. Understanding AR visuals. 2021, https://support.vgis.io/hc/en-us/articles/360023886274-
Understanding-augmented-reality-visuals-KB-TT004-.
108. Autodesk. Autodesk Industry collections. Available online:
https://www.autodesk.com.au/collections (accessed on
109. Perey, C.; Engelke, T.; Reed, C. Current Status of Standards for Augmented Reality. 2011; pp. 21-
38.
110. Liao, T. Standards and Their (Recurring) Stories: How Augmented Reality Markup Language Was
Built on Stories of Past Standards. Science, Technology, & Human Values 2020, 45, 712-737,
doi:10.1177/0162243919867417.
111. Saran, S.; Oberai, K.; Wate, P.; Konde, A.; Dutta, A.; Kumar, K.; Kumar, A.S. Utilities of virtual 3D
city models based on CityGML: various use cases. Journal of the Indian Society of Remote Sensing 2018,
46, 957-972.
112. Qiao, X.; Ren, P.; Dustdar, S.; Chen, J. A New Era for Web AR with Mobile Edge Computing. IEEE
Internet Computing 2018, 22, 46-55, doi:10.1109/MIC.2018.043051464.
113. Qiao, X.; Ren, P.; Dustdar, S.; Liu, L.; Ma, H.; Chen, J. Web AR: A Promising Future for Mobile
Augmented Reality—State of the Art, Challenges, and Insights. Proceedings of the IEEE 2019, 107,
651-666, doi:10.1109/JPROC.2019.2895105.
114. Ahn, S.; Ko, H.; Yoo, B. Webizing mobile augmented reality content. New Review of Hypermedia and
Multimedia 2014, doi:10.1080/13614568.2013.857727.
115. Open Geospatial Consortium. OGC® Augmented Reality Markup Language 2.0 (ARML 2.0). 2015,
http://docs.opengeospatial.org/is/12-132r4/12-132r4.html.
116. Skarbez, R.; Smith, M.; Whitton, M. Mixed Reality Doesn't Need Standardized Evaluation Methods.
2021.
117. Ritsos, P.D.; Ritsos, D.P.; Gougoulis, A.S. Standards for Augmented Reality: a User Experience perspective;
2011.
118. Hua, H.; Javidi, B. Augmented reality: easy on the eyes. Optics and Photonics News 2015, 26, 26-33.
119. Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator Sickness Questionnaire: An
Enhanced Method for Quantifying Simulator Sickness. International Journal of Aviation Psychology
1993, 3, 203-220, doi:10.1207/s15327108ijap0303_3.
120. Kim, S.; Nussbaum, M.A.; Gabbard, J.L. Augmented Reality “Smart Glasses” in the Workplace:
Industry Perspectives and Challenges for Worker Safety and Health. IIE Transactions on
Occupational Ergonomics and Human Factors 2016, 4, 253-258, doi:10.1080/21577323.2016.1214635.
121. Chen, Y.; Wang, X.; Xu, H. Human factors/ergonomics evaluation for virtual reality headsets: a
review. CCF Transactions on Pervasive Computing and Interaction 2021, 1-13.
122. Arévalo Arboleda, S.; Dierks, T.; Rücker, F.; Gerken, J. Exploring the Visual Space to Improve
Depth Perception in Robot Teleoperation Using Augmented Reality: The Role of Distance and
Target’s Pose in Time, Success, and Certainty. Cham, 2021; pp. 522-543.
123. Čopič Pucihar, K.; Coulton, P.; Alexander, J. Creating a stereoscopic magic-lens to improve depth
perception in handheld augmented reality. In Proceedings of the 15th
international conference on Human-computer interaction with mobile devices and services, 2013;
pp. 448-451.
124. Kytö, M.; Mäkinen, A.; Häkkinen, J.; Oittinen, P. Improving relative depth judgments in
augmented reality with auxiliary augmentations. ACM Transactions on Applied Perception 2013, 10,
1-21, doi:10.1145/2422105.2422111.
125. Li, H.; Wang, W.; Ma, W.; Zhang, G.; Wang, Q.; Qu, J. Design and analysis of depth cues on depth
perception in interactive mixed reality simulation systems. Journal of the Society for Information
Display n/a, doi:10.1002/jsid.1074.
126. Ping, J.; Thomas, B.H.; Baumeister, J.; Guo, J.; Weng, D.; Liu, Y. Effects of shading model and
opacity on depth perception in optical see‐through augmented reality. Journal of the Society for
Information Display 2020, 28, 892-904, doi:10.1002/jsid.947.
127. Hertel, J.; Steinicke, F. Augmented Reality for Maritime Navigation Assistance - Egocentric Depth
Perception in Large Distance Outdoor Environments. In Proceedings of the 2021 IEEE Virtual
Reality and 3D User Interfaces (VR), 27 March-1 April 2021, 2021; pp. 122-130.
128. Gombač, L.; Pucihar, K.Č.; Kljun, M.; Coulton, P.; Grbac, J. 3D Virtual Tracing and Depth
Perception Problem on Mobile AR. In Proceedings of the 2016 CHI Conference
Extended Abstracts on Human Factors in Computing Systems, San Jose, California, USA, 2016; pp.
1849–1856.
129. Shibata, T. Head mounted display. Displays 2002, 23, 57-64, doi:10.1016/S0141-9382(02)00010-0.
130. Sahu, C.K.; Young, C.; Rai, R. Artificial intelligence (AI) in augmented reality (AR)-assisted
manufacturing applications: a review. International Journal of Production Research 2021, 59, 4903-
4959, doi:10.1080/00207543.2020.1859636.
131. Aleksy, M.; Troost, M.; Scheinhardt, F.; Zank, G.T. Utilizing HoloLens to Support Industrial Service
Processes. In Proceedings of the 2018 IEEE 32nd International Conference on Advanced
Information Networking and Applications (AINA), 2018; pp. 143-148.
132. Hübner, P.; Clintworth, K.; Liu, Q.; Weinmann, M.; Wursthorn, S. Evaluation of HoloLens Tracking
and Depth Sensing for Indoor Mapping Applications. Sensors (Basel, Switzerland) 2020, 20, 1021,
doi:10.3390/s20041021.
133. Su, X.; Talmaki, S.; Cai, H.; Kamat, V.R. Uncertainty-aware visualization and proximity monitoring
in urban excavation: a geospatial augmented reality approach. Visualization in Engineering 2013, 1,
2, doi:10.1186/2213-7459-1-2.
134. Côté, S.; Girard-Vallée, A. Accurate OnSite Georeferenced Subsurface Utility Model Visualisation.
Cham, 2015; pp. 63-70.
135. Côté, S.; Létourneau, I.; Marcoux-Ouellet, J. [Poster] Augmentation of live excavation work for
subsurface utilities engineering. In Proceedings of the 2014 IEEE International Symposium on
Mixed and Augmented Reality (ISMAR), 10-12 Sept. 2014, 2014; pp. 259-260.
136. Eren, M.T.; Cansoy, M.; Balcisoy, S. Multi-view augmented reality for underground exploration.
In Proceedings of the 2013 IEEE Virtual Reality (VR), 18-20 March 2013, 2013; pp. 117-118.
137. Hansen, L.H.; Wyke, S.S.; Kjems, E. Combining Reality Capture and Augmented Reality to
Visualise Subsurface Utilities in the Field. In Proceedings of the International Symposium on Automation and Robotics in Construction (ISARC), 2020; pp. 703-710.
138. Soria, G.; Ortega Alvarado, L.M.; Feito, F.R. Augmented and Virtual Reality for Underground
Facilities Management. Journal of Computing and Information Science in Engineering 2018, 18,
doi:10.1115/1.4040460.
139. Feiner, S.K.; Seligmann, D.D. Cutaways and ghosting: satisfying visibility constraints in dynamic 3D illustrations. The Visual Computer 1992, 8, 292-302, doi:10.1007/BF01897116.
140. Hansen, L.H.; Fleck, P.; Stranner, M.; Schmalstieg, D.; Arth, C. Augmented Reality for Subsurface
Utility Engineering, Revisited. IEEE Transactions on Visualization and Computer Graphics 2021, 27,
4119-4128.
141. Côté, S.; Mercier, A. Augmentation of road surfaces with subsurface utility model projections. In
Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2018; pp.
535-536.
142. Chen, J.; Granier, X.; Lin, N.; Peng, Q. On-line visualization of underground structures using
context features. In Proceedings of the 17th ACM Symposium on Virtual Reality Software and Technology, Hong Kong, 2010; pp. 167-170, doi:10.1145/1889863.1889898.
143. Kalkofen, D.; Mendez, E.; Schmalstieg, D. Interactive focus and context visualization for
augmented reality. In Proceedings of the 2007 6th IEEE and ACM International Symposium on
Mixed and Augmented Reality, 2007; pp. 191-201.
144. Baek, J.-M.; Hong, I.-S. The Design of an Automatically Generated System for Cross Sections of
Underground Utilities using Augmented Reality. International Journal of Smart Home 2013, 7, 255-
264, doi:10.14257/ijsh.2013.7.6.25.
145. Linford, N. Rapid processing of GPR time slices for data visualisation during field acquisition. In
Proceedings of the 15th International Conference on Ground Penetrating Radar,
30 June-4 July 2014, 2014; pp. 702-706.
146. Ayala Cabrera, D.; Herrera Fernández, A.M.; Izquierdo Sebastián, J.; Pérez García, R.; Ocaña-
Levario, S. Dynamic management of water supply systems: A tool to build scenarios by merging
GPR surveys and augmented reality. Water Utility Journal 2013, 6, 3-8.