3D Visualization of Weather Radar Data
Thesis project in computer graphics at Linköping University
by
Aron Ernvik
LiTH-ISY-EX-3252-2002
Examiner: Ingemar Ragnemalm
Linköping, January 2002
Abstract
There are 12 weather radars operated jointly by SMHI and the Swedish Armed Forces in Sweden.
Data from them are used for short term forecasting and analysis. The traditional way of viewing
data from the radars is in 2D images, even though 3D polar volumes are delivered from the radars.
The purpose of this work is to develop an application for 3D viewing of weather radar data.
There are basically three approaches to visualization of volumetric data, such as radar data: slicing
with cross-sectional planes, surface extraction, and volume rendering. The application developed
during this project supports variations on all three approaches. Different objects, e.g. horizontal
and vertical planes, isosurfaces, or volume rendering objects, can be added to a 3D scene and
viewed simultaneously from any angle. Parameters of the objects can be set using a graphical user
interface and a few different plots can be generated.
Compared to the traditional 2D products used by meteorologists when analyzing radar data, the
3D scenes add information that makes it easier for the users to understand the given weather
situations. Demonstrations and discussions with meteorologists have rendered positive reactions.
The application will be installed and evaluated at Arlanda airport in Sweden.
Keywords: visualization, weather, radar, computer graphics, 3D
1 Introduction 1
1.1 Background 1
1.2 Purpose 1
1.3 Order of work 1
1.4 Outline of this thesis 2
2 Introduction to weather radars 5
2.1 Background 5
2.2 Radar hardware 5
2.2.1 The transmitter 6
2.2.2 The antenna 7
2.2.3 The waveguide 8
2.2.4 The transmit/receive switch and the receiver 8
2.3 Radar parameters 8
2.4 The radar equation 10
2.5 Doppler velocity measurements 12
2.6 Radar data 13
2.7 Range, height, and distance 15
3 RAVE - current and future use 19
3.1 Background 19
3.2 System design 19
3.2.1 Graphical user interface 20
3.3 Products from 3D radar data 21
3.3.1 The PPI product 21
3.3.2 The CAPPI product 22
3.3.3 The PCAPPI product 23
3.3.4 The RHI product 24
3.3.5 The VAD product 25
3.4 RAVE users 26
4 Visualization of volumetric data 29
4.1 Introduction 29
4.2 Polar and cartesian volumes 30
4.2.1 Gaussian splatting 30
4.2.2 Shepard’s method 30
4.3 Different approaches 31
4.4 Interpolation 32
4.4.1 Nearest neighbour interpolation 32
4.4.2 Trilinear interpolation 32
4.5 Slicing techniques 33
4.6 Surface rendering techniques 34
4.6.1 The marching cubes algorithm 34
4.7 Volume rendering techniques 36
4.7.1 Object-order techniques 37
4.7.2 Image-order techniques 38
4.8 Shading 41
4.8.1 Polygon shading 42
5 Implementation 45
5.1 Visualization software 45
5.1.1 Visual programming systems 46
5.1.2 The Visualization Toolkit, VTK 49
5.1.3 3D graphics APIs 49
5.2 Choice of software 49
5.3 Software design 50
5.4 Visualization objects 52
5.4.1 The topography object 52
5.4.2 The radar, bounding box and axis objects 52
5.4.3 The elevation surface object 53
5.4.4 The RHI object 54
5.4.5 The CAPPI object 55
5.4.6 The glyphs object 56
5.4.7 The winds object 57
5.4.8 The VAD object 57
5.4.9 The isosurface object 58
5.4.10 The volume rendering object 59
5.4.11 The radar ray object 61
6 Evaluation and conclusions 63
6.1 Introduction 63
6.2 Evaluation 63
6.2.1 Usefulness 63
6.2.2 Performance 65
6.3 Future improvements 66
6.4 Conclusions 67
7 Resources 69
7.1 Books and articles 69
7.2 Internet 70
A Class Diagrams 71
List of figures
Figure 1. This is what a radar looks like from the outside. This one is positioned
near the southern tip of Sweden’s largest island, Gotland. 6
Figure 2. Simple block diagram of a radar. (Adapted from Rinehart, 1991) 7
Figure 3. The surface generated by a radar scan when the elevation angle is held
constant and the azimuth angle varies from 0 to 360 degrees. 14
Figure 4. Range, height, and distance. 16
Figure 5. The basic system design in RAVE. 20
Figure 6. A PPI image and the main graphical user interface in RAVE. 22
Figure 7. Generating a CAPPI image. The output image is orthogonal to the
paper. 23
Figure 8. A CAPPI (left) and a PCAPPI (right) image, showing data from the
same location, time, and altitude (2 500 metres). 24
Figure 9. Generating a RHI image. The output image is parallel to the paper. 24
Figure 10. VAD circles. 25
Figure 11. Velocity/azimuth display of the Doppler velocity along a VAD circle. 26
Figure 12. Pixels and voxels. 29
Figure 13. Nearest neighbour interpolation. 32
Figure 14. Trilinear interpolation. 33
Figure 15. The marching cubes. 36
Figure 16. Object-order volume rendering (forward mapping). The volume is
transformed and mapped into image space. 37
Figure 17. Image-order volume rendering (inverse mapping). The image plane is
transformed and mapped onto the volume. 39
Figure 18. Perspective ray casting. 40
Figure 19. Block diagram of a typical visualization system. 46
Figure 20. IRIS Explorer user interface. 48
Figure 21. Basic system design. 50
Figure 22. Main GUI screenshot when a few objects have been added. 51
Figure 23. Topography, radar, bounding box, and height axis objects. 53
Figure 24. Creating an elevation surface. Radar as seen from above. 54
Figure 25. A range/height indicator, or RHI, object, and its manager. 55
Figure 26. A semitransparent CAPPI object and its manager. 56
Figure 27. A glyphs object. 57
Figure 28. A winds object, as seen from above. 58
Figure 29. A VAD object. 58
Figure 30. Two isosurface objects, with isovalues 14.5 and 18.5 dBZ for outer and
inner surface, respectively. 59
Figure 31. The volume rendering manager. 60
Figure 32. A volume rendering object. 61
Figure 33. A radar ray object and its manager. 61
Figure 34. Graphical user interface class diagram. 72
Figure 35. Visualization objects class diagram. 75
List of tables
Table 1. Radar bands and their frequencies and wavelengths. From Rinehart
(1991). Visible light wavelengths range from 400 to 700 nm. 9
Table 2. Two different computer configurations tested with RAVE. 65
1 Introduction
1.1 Background
The Swedish Meteorological and Hydrological Institute, SMHI, runs a network of 12 weather radars together with the Swedish Armed Forces. Data from them are used for short term weather forecasting and analysis: for detecting wind speeds and directions as well as rainfall amounts. The radars deliver data in the form of 3D polar volumes four times an hour, day and night, every day of the year. In order to meet the demand for a user-friendly tool to visualize and analyse the data, the development of a software system called RAVE (an abbreviation for Radar Analysis and Visualization Environment) was initiated in 1996. From the beginning RAVE was designed to, among other things, "create and visualize arbitrary 2D products based on 3D data" and to be able to "render 3D views of polar and cartesian volume data with and without backdrops" (Bolin & Michelson, 1996). During 1997 a first version of RAVE was implemented that included basic functionality, but it was not able to render any 3D views - neither is the current version of RAVE. This project aims to fill that gap.
1.2 Purpose
The purpose of this work is to design and implement a generic,
object-oriented application for 3D visualization of weather radar data.
The application should be integrated with existing software.
1.3 Order of work
1. Literature studies. Searching for and reading some of what has been written about 3D visualization of volumetric data in general, and radar data in particular.
2. User interviews. Finding out what features the users of the applica-
tion want.
3. Data analysis. Investigating the nature of weather radar data.
4. Platform comparison. Comparing some software platforms and
deciding which one is most suitable for the purpose.
5. Experimental programming. Learning to develop software using
the chosen software environment.
6. Implementation of basic features. A standard PC workstation should be used and the performance needs to be reviewed before any advanced features, such as rendering sequences of radar data, are implemented.
7. Implementation of advanced features, given the limited time of the
project.
8. Evaluation.
There is also an appendix: Appendix A, Class diagrams. This appendix
contains a couple of class diagrams and textual descriptions of these.
2 Introduction to weather radars
2.1 Background
Radar is an acronym for radio detection and ranging. Radar devices
transmit electromagnetic waves, receive echoes from targets in the
surroundings and determine various things about the targets from the
characteristics of the received echo signals. Radar technology was
developed primarily before and during the Second World War. Back
then, the main goal was to scan the air for enemy airplanes. Radar
technology is still being used for aircraft detection but nowadays radar
is also used for a variety of other purposes - one of which is for detec-
tion of severe weather, including heavy thundershowers, hail, torna-
does, hurricanes and strong wind storms.
This chapter will give a brief overview of the components and important parameters of a radar system, and an equally brief discussion of the radar equation. This equation describes the amount of power that is received back at the radar in terms of wavelength, particle size, and a few other things. Finally, the data delivered from a typical radar in the Swedish network is briefly described.
Figure 1. This is what a radar looks like from the outside. This one is
positioned near the southern tip of Sweden’s largest island,
Gotland.
Figure 2. Simple block diagram of a radar. (Adapted from Rinehart, 1991)
(The diagram shows the reflector, antenna feedhorn, waveguide, and transmit/receive switch.)
2.2.3 The waveguide
The conductor connecting the radar transmitter and antenna is called
a waveguide. This is usually a hollow, rectangular, metal conductor
whose interior dimensions depend on the wavelength of the signals
being carried. Coaxial cables are too lossy for these purposes at the
high frequencies used by radars. (Ibid.)
$$ f = \frac{c}{\lambda} \qquad \text{(Eq 1)} $$

$$ G(\mathrm{dB}) = 10 \log_{10} \frac{p_1}{p_2} \qquad \text{(Eq 2)} $$
Another important radar parameter is the antenna beamwidth. Antenna gain and antenna beamwidth are related as given by

$$ g = \frac{\pi^2 k^2}{\theta \cdot \varphi} \qquad \text{(Eq 3)} $$
$$ S = \frac{p_t}{4 \pi r^2} \qquad \text{(Eq 4)} $$

where p_t is the transmitted power, and r is the distance from the radar. To get the power p_σ received by a target with area A_σ when a non-isotropic antenna with gain g is used, all that needs to be done is to multiply S by A_σ and g - assuming the target is positioned along the center of the radar beam axis:

$$ p_\sigma = \frac{p_t g A_\sigma}{4 \pi r^2} \qquad \text{(Eq 5)} $$
10 of 76
$$ p_r = \frac{p_\sigma A_e}{4 \pi r^2} = \frac{p_t g A_\sigma A_e}{(4\pi)^2 r^4} \qquad \text{(Eq 6)} $$

where A_e is the effective area of the receiving antenna. This area can be expressed in terms of the antenna gain and the wavelength λ of the radar:

$$ A_e = \frac{g \lambda^2}{4 \pi} \qquad \text{(Eq 7)} $$
One last refinement that will yield the radar equation for point targets has to do with the area of the target, A_σ. This area is not necessarily the size the target appears to the radar. Instead, a parameter called the backscattering cross-sectional area of the target is used, and it is called σ.1 The radar equation for a point target located on the center of the antenna beam pattern can thus be written as

$$ p_r = \frac{p_t g^2 \lambda^2 \sigma}{64 \pi^3 r^4} \qquad \text{(Eq 8)} $$
1. σ depends not only on the size, shape and kind of matter making up the target but also on the radar wavelength. For details, turn to Rinehart (1991).
2. For the most commonly used radar wavelength, |K|² for water is around 0.93 and for ice it's 0.20 - this difference surely affects p_r!
closer to 1 than to 0. l is often neglected in the calculations (i.e. it is considered to be equal to 1). z, finally, is called the reflectivity factor and it is calculated as

$$ z = \sum D^6 \qquad \text{(Eq 10)} $$

where the sum is over all the particles in a unit volume and D is the diameter of a particle (e.g. a typical raindrop).
For a single target at distance r from the radar, the radar wave will
travel a distance of 2r - to the target and back to the radar. Using
wavelength λ, this corresponds to 2r/λ wavelengths, or, since one wavelength equals 2π radians, 4πr/λ radians. So if the radar signal is transmitted with an initial phase of φ₀, the phase of the received signal will be

$$ \varphi = \varphi_0 + \frac{4 \pi r}{\lambda} \qquad \text{(Eq 12)} $$
The time derivative of this phase, i.e. the change of phase from one pulse to the next, is given by

$$ \frac{d\varphi}{dt} = \frac{4\pi}{\lambda} \cdot \frac{dr}{dt} \qquad \text{(Eq 13)} $$

Since dr/dt is the radial velocity v of the target, and a phase change of 2π radians per second corresponds to a frequency shift of one hertz, the resulting Doppler frequency shift is

$$ f = \frac{2v}{\lambda} \qquad \text{(Eq 14)} $$
used and the transmitter and antenna loop through the whole azi-
muth span of 360 degrees. The data received from one such revolu-
tion, with the elevation angle held constant, is shown in Figure 3.
Here, the colours on the surface correspond to different reflectivities
(z in the radar equation). The upper half of the surface is mostly clear
air with few echoes. In this figure, the earth is assumed to be flat -
hence the rays bend slightly upwards1.
$$ 12 \cdot \frac{360}{0.9} \cdot \frac{240}{2} = 576000 \qquad \text{(Eq 15)} $$
Figure 3. The surface generated by a radar scan when the elevation angle is held constant and the azimuth angle varies from 0 to 360 degrees.
1. The gaseous composition of the atmosphere also affects ray bending (according to Snell’s
law). See Section 2.7 on page 15.
The samples in a polar volume may be arranged in a three-dimen-
sional array to implicitly associate each value with its origin in space:
data = float[elevationAngles][azimuthAngles][rangeBins]
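This pseudocode translates directly into Python with NumPy; a minimal sketch, using the dimensions from the example above (12 elevation angles, 360/0.9 = 400 azimuth gates, 240/2 = 120 range bins) and illustrative variable names:

    import numpy as np

    # A polar volume: 12 elevation angles, 400 azimuth gates (0.9 degree
    # resolution) and 120 range bins (2 km bins out to 240 km).
    # data.size equals 576000, as in Eq 15.
    data = np.zeros((12, 400, 120), dtype=np.float32)

    # The sample at the third elevation angle, azimuth 45 degrees and
    # range 100 km is addressed as:
    value = data[2, int(45 / 0.9), int(100 / 2)]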
$$ \frac{d\varphi}{dS} = \frac{1}{R} \qquad \text{(Eq 16)} $$

Here, φ is the angle between the tangent of the circle arc and some fixed axis, and dS is a short distance along the arc.
$$ \frac{n - dn}{n} = \frac{\sin i}{\sin r} = \frac{u_i}{u_r} \qquad \text{(Eq 17)} $$
For a radar ray travelling in a non-uniform atmosphere, the ray will
bend more or less relative to the earth, depending on how much the
refractive index changes with height. The curvature of a radar ray rel-
ative to the earth’s surface is then
$$ \frac{d\varphi}{dS} = \frac{1}{R} + \frac{dn}{dh} \qquad \text{(Eq 18)} $$
where the last term describes how the refractive index of the atmos-
phere varies with the height. Under normal circumstances, this deriv-
ative is negative - the refractive index is typically around 1.0003 at sea
level and 1.0 in outer space. The effect is then that radar rays appear
to bend downward and their curvature relative to the surface of the
earth is less than 1/R. It is a good approximation (Rinehart, 1991) to use the following expression for the curvature:

$$ \frac{1}{R'} = \frac{1}{\tfrac{4}{3} R} \qquad \text{(Eq 19)} $$
Using this curvature, the range, height, and distance variables may be
defined as shown in the grossly exaggerated Figure 4. The range of a
certain radar sample is the distance from the radar to the sample,
along the ray. The distance is the corresponding distance from the
radar along the earth’s surface, to the point directly beneath the sam-
ple. And the height of the sample is simply the sample’s height above
the surface of the earth.
$$ x = -d \cos\alpha, \qquad y = d \sin\alpha, \qquad z = h \qquad \text{(Eq 20)} $$
3 RAVE - current and future use
3.1 Background
The RAVE project was launched in 1996. The main goal was to develop an application for analysis and visualization of radar data at SMHI, and possibly for its international equivalents in the Baltic region1. Some basic work was done as a master's thesis by Håkan Bolin during the first six months of the project. Since then, the system has been extended with various plugins and functions to import data from different radar data formats, etc. But until now, there have been no 3D visualization possibilities, although it was stated from the beginning that there should be.
since Python is a platform-independent language. A few C modules would have to be recompiled.
(Figure 5. The basic system design in RAVE; the diagram shows an HTTP connection between the components.)
occupying the larger part of the window. A new item, Volume visualization, is added to the Products menu with this project, allowing 3D views of polar volumes. The 3D graphics are displayed in a new window. More on this later!
Using the View and Options menus, the user can choose what features
should be included in the currently displayed window. Options
include a background that shows the surrounding land and water, or a
vector graphics overlay of coastlines, borders, lakes, and the like. On
top of that, the user can select what colour legend to use, and whether
the legend should be displayed along with the image or not. Different
colour legends map the scalar values (i.e. wind or reflectivity) to dif-
ferent colours.
centre of the circle. The radius of the circle depends on the elevation
angle; larger elevation angles give smaller circles, and vice versa.
Figure 6. A PPI image and the main graphical user interface in RAVE.
At lower altitudes, there is data only in the region closest to the centre
of the output image (which is straight above the radar), whereas at
higher altitudes, there is no data available at the centre of the output
image. There will be a “hole” with no data at the centre of the image
for higher altitudes, and this hole grows with the altitude. This is
understood by looking at Figure 7. There is a CAPPI image to the left in Figure 8. For this CAPPI, generated for an altitude of 2 500 m, the maximum range of the data is around 100 km.
1. And beyond the near and far boundaries, not seen in the 2D figure! Remember, in Figure
7, the generated image is orthogonal to the paper.
Figure 8. A CAPPI (left) and a PCAPPI (right) image, showing data from the
same location, time, and altitude (2 500 metres).
Figure 9. Generating a RHI image. The output image is parallel to the paper.
3.3.5 The VAD product
This abbreviation is for velocity/azimuth display. The VAD technique can be used to calculate the vertical wind profile at the radar site, using data from a Doppler PPI scan. Measuring the Doppler velocity along a circle at constant altitude, assuming the wind field is homogeneous, and plotting the radial wind velocity against the azimuth angle, one gets a sine-like curve. In Figure 10, three VAD circles have been drawn for the three distances from the radar: D1, D2, and D3. These three distances obviously correspond to different altitudes and ranges along the radar ray, as discussed in Section 2.7, "Range, height, and distance", on page 15. Data from one such circle is plotted in Figure 11. In the weather situation depicted in Figure 11, there was no precipitation in the azimuth interval between 60 and 160 degrees, i.e. north-east, east, and south-east from the radar.
(Figure 10. VAD circles at the three distances D1, D2, and D3.)
In Figure 11, the maximum value of a sine curve fitted to the points is in the clear air interval, somewhere around 80 degrees azimuth. As can be expected, the minimum value is about 180 degrees from there, at some 260 degrees. This means that the winds are blowing from a roughly east-northeasterly direction (ENE). The sine curve's vertical offset from zero is around -2.5 m/s, and its amplitude, which corresponds to the horizontal wind speed, is around 7 m/s.
(Figure 11 appears here: the Doppler velocity, roughly between -9.4 and 3.5 m/s, plotted against azimuth angle from 0 to 360 degrees.)
The input to a VAD algorithm is data from one or several (with different elevation angles) Doppler PPI scans and the desired altitudes at which VAD circles should be generated. The output is either a graph like the one in Figure 11 or a set of horizontal vectors, one for each of the desired altitudes. Such an algorithm was already implemented in RAVE before the start of this project, but the only possible way to view the output from the algorithm was as lists of numbers. A 3D visualization object employing this algorithm has been added (see Figure 29).
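The core of such an algorithm is the least-squares fit of a sine curve to the measured radial velocities. The sketch below is not the implementation used in RAVE, only an illustration of the principle under the homogeneous wind assumption; the function name and parameters are made up:

    import numpy as np

    def fit_vad_circle(azimuths_deg, radial_velocities):
        # Fit v(az) = offset + a*cos(az) + b*sin(az) by least squares.
        az = np.radians(azimuths_deg)
        A = np.column_stack([np.ones_like(az), np.cos(az), np.sin(az)])
        (offset, a, b), *_ = np.linalg.lstsq(A, radial_velocities, rcond=None)
        speed = np.hypot(a, b)          # amplitude = horizontal wind speed
        direction = np.degrees(np.arctan2(b, a)) % 360
        return offset, speed, direction

For the circle plotted in Figure 11, such a fit would give an offset around -2.5 m/s and an amplitude, i.e. a horizontal wind speed, around 7 m/s.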
Because RAVE is built upon Python, every RAVE method and function can be called by issuing commands in a Python interpreter. New algorithms can easily be added and tested, and this openness is one of RAVE's greatest strengths. RAVE is a superior tool for performing research within weather radar analysis and visualization.
4 Visualization of volumetric data
4.1 Introduction
This chapter aims to explain and compare the three main classes of
methods for visualization of volumetric data: slicing, surface render-
ing and volume rendering techniques.
1. Another common definition is to consider a voxel to be a cube with some small size.
4.2 Polar and cartesian volumes
In this chapter, it is assumed that the data is stored in a cartesian
coordinate system, forming a rectilinear grid of data. However, as
seen in Chapter 2, the nature of radar data is originally not cartesian
but polar. Some products may be created directly from the polar vol-
umes, e.g. vertical cross-sectional planes are easily created using all
available data with one azimuth angle (see Section 3.3.4, “The RHI
product”, on page 24, and Section 5.4.4, “The RHI object”, on
page 54). Some products, e.g. isosurfaces, require that data be resam-
pled into a cartesian grid first, though.
$$ F(x, y, z) = \sum_{i=1}^{n} w_i f_i \qquad \text{(Eq 21)} $$
where n is the number of input points, fi are the values at the input
points, and wi are the weight functions assigned to each input point.
One common way to define the latter is:
$$ w_i = \frac{h_i^{-p}}{\sum_{j=1}^{n} h_j^{-p}} \qquad \text{(Eq 22)} $$

$$ h_i = (x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2 \qquad \text{(Eq 23)} $$
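As a sketch, Equations 21-23 translate almost directly into Python; the exponent p = 2 and the brute-force evaluation over all input points are illustrative choices, not how RAVE resamples its volumes:

    import numpy as np

    def shepard(points, values, x, y, z, p=2):
        # points: (n, 3) array of input positions; values: (n,) array.
        h = ((points - np.array([x, y, z])) ** 2).sum(axis=1)  # Eq 23
        if np.any(h == 0.0):               # exactly on an input point
            return values[np.argmin(h)]
        w = h ** (-p) / np.sum(h ** (-p))  # weights, Eq 22
        return np.sum(w * values)          # F(x, y, z), Eq 21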
The first option is already implemented in RAVE, but only for planes
parallel to the earth’s surface. This and the two other options will
soon be explained further, but first we need to know how to calculate
the data value at an arbitrary point in the volume by using interpola-
tion. This is useful not only during slicing, surface extraction or volume rendering, but also when a polar data volume should be
resampled into a cartesian data volume, e.g. using Gaussian splatting
or Shepard’s method.
4.4 Interpolation
Voxels are usually addressed using three integer coordinates: x, y, and
z. But what if one needs to know the data value at, say, (8.8, 4.1, 4.0)?
In trilinear interpolation (see Figure 14), the value V_P at a point P inside the cell spanned by the eight surrounding voxels A through H is

$$ \begin{aligned} V_P ={} & V_A (1-x)(1-y)(1-z) + V_B\, x (1-y)(1-z) + V_C (1-x)\, y (1-z) \\ & + V_D\, x y (1-z) + V_E (1-x)(1-y)\, z + V_F\, x (1-y)\, z + V_G (1-x)\, y z \\ & + V_H\, x y z \end{aligned} \qquad \text{(Eq 24)} $$

In general, A will be at some location (x_A, y_A, z_A) and H will be at (x_H, y_H, z_H). In this case, x in Equation 24 above would be replaced by (x - x_A)/(x_H - x_A), with similar substitutions made for y and z.
1. Interpolation may be regarded as convolution using a kernel, where the optimal “kernel”
is the infinite sinc function. Tricubic convolution is a common method. Here, cubic
splines in three dimensions are used to approximate the sinc function. See Danielsson et al
(2000).
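Equation 24 also translates directly into code. The sketch below assumes a NumPy volume with unit voxel spacing, so that voxel A is found by truncating the query coordinates; it is an illustration, not the implementation used in this project:

    def trilinear(volume, xp, yp, zp):
        x0, y0, z0 = int(xp), int(yp), int(zp)   # voxel A
        x, y, z = xp - x0, yp - y0, zp - z0      # fractional offsets
        V = volume
        return (V[x0,   y0,   z0  ] * (1-x)*(1-y)*(1-z)   # A
              + V[x0+1, y0,   z0  ] * x*(1-y)*(1-z)       # B
              + V[x0,   y0+1, z0  ] * (1-x)*y*(1-z)       # C
              + V[x0+1, y0+1, z0  ] * x*y*(1-z)           # D
              + V[x0,   y0,   z0+1] * (1-x)*(1-y)*z       # E
              + V[x0+1, y0,   z0+1] * x*(1-y)*z           # F
              + V[x0,   y0+1, z0+1] * (1-x)*y*z           # G
              + V[x0+1, y0+1, z0+1] * x*y*z)              # H

With this function, the question posed at the beginning of this section is answered by trilinear(volume, 8.8, 4.1, 4.0).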
tion may be used in order to calculate the values in the voxels. This
technique may also be used when the plane normal is not parallel to
one of the coordinate axes.
1. Please note, however, that natural clouds usually do not exhibit as sharp edges as this
approach may produce!
2. Especially when polygon rendering is accelerated by dedicated graphics hardware.
The algorithm determines, from the arrangement of vertex values above or below the isovalue, what the topology of an isosurface passing through this element
would be. The small volume elements are all cubes of the same size
and may contain one or more voxels. The values at the vertices (the
corners of the cubes) are calculated using some interpolation method.
Once the values at each vertex of an element have been calculated, the
second step of the algorithm is to look at each vertex value and deter-
mine if its scalar value is higher or lower than the isovalue of interest.
Each vertex is then assigned a binary value - e.g., 0 if value is lower, 1
if value is higher than the isovalue. In this case, the binary segmenta-
tion function B(v) is a thresholding function where the isovalue is the
threshold. If the eight vertices of an element are all classified as ones
or zeros, the element will have no isosurface passing through it - it
will either be completely inside or completely outside the surface. The
algorithm then moves on to the next element. Since the eight vertices of each element are all assigned binary values, each element can be classified as belonging to one of 2⁸ = 256 cases. A table is predefined
to contain, for every one of these 256 cases, (a) how many triangles
will make up the isosurface segment passing through the cube, and (b)
which edges of the cube contain the triangle vertices, and in what
order. This is illustrated in Figure 15: here, the black dots represent
vertices that have been assigned 1 in the first step of the algorithm.
The triangles that constitute the isosurface in the volume element are
shown. The coordinates for the vertices of these triangles are calcu-
lated using linear interpolation; the value at every vertex should be
exactly the sought-for isovalue. By symmetry and rotational symme-
try the 256 possible cases reduce to the 15 shown in the figure.
1. There are, however, a few exceptional situations in which the isosurface polygons are discontinuous across two adjacent volume elements.
Figure 15. The marching cubes.
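In VTK, the toolkit used in this project (see Chapter 5), marching cubes is available as a ready-made filter, so an isosurface pipeline needs only a few lines. The following is a sketch; the input file name and the isovalue are placeholders, and the pipeline-connection calls vary slightly between VTK versions:

    import vtk

    reader = vtk.vtkStructuredPointsReader()
    reader.SetFileName("volume.vtk")        # placeholder input volume

    cubes = vtk.vtkMarchingCubes()
    cubes.SetInputConnection(reader.GetOutputPort())
    cubes.SetValue(0, 14.5)                 # surface index, isovalue

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(cubes.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)
    actor.GetProperty().SetOpacity(0.3)     # semitransparent surface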
Volume rendering is an alternative approach which solves the men-
tioned problems. Transparency and colour values are assigned to all
voxels in the data set based on their scalar values, and then the entire
set may be viewed from any angle. Volume rendering can basically be
achieved using an object-order technique or an image-order technique.
The voxels can be looped through in a front-to-back order using a so-
called Z-buffer with as many data elements as there are pixels. The
first voxel is projected to pixel (x, y) in the image plane. At the same
time, the voxel’s distance to the image plane is stored at position (x, y)
in the Z-buffer. If another voxel later projects to position (x, y), a look in the Z-buffer makes it possible to determine that it would be hidden by the first voxel, so there is no need to process and draw it. With a front-to-back method, once a pixel value is set, its value remains unchanged.
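A toy version of this scheme, for an orthographic projection along the z axis and with a binary opacity threshold (an assumption made to keep the sketch short), could look as follows:

    import numpy as np

    def project_front_to_back(volume, threshold):
        nx, ny, nz = volume.shape
        image = np.zeros((nx, ny))
        zbuffer = np.full((nx, ny), np.inf)      # distance to image plane
        for x in range(nx):
            for y in range(ny):
                for z in range(nz):              # front-to-back order
                    if volume[x, y, z] >= threshold:
                        # The first opaque voxel along the ray sets the
                        # pixel; once set, it remains unchanged.
                        image[x, y] = volume[x, y, z]
                        zbuffer[x, y] = z
                        break
        return image, zbuffer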
Figure 17. Image-order volume rendering (inverse mapping). The image plane
is transformed and mapped onto the volume.
1. Please note that this is not the same thing as ray tracing, even though the two techniques are similar. In ray tracing, each ray is followed as it bounces off shiny objects. Very realistic scenes can be rendered using this technique.
shading calculations are performed (see Section 4.8 on page 41) and
the resulting colour is given to the pixel from which the ray originates.
1. Still, computer hardware performance is the major bottleneck of volume rendering as the
data sets to be visualized tend to grow.
Another approach is to follow the ray through the whole data volume,
storing in the image plane pixel the maximum value encountered
along the ray. This is called maximum intensity projection (mip) and by
using this technique, internal features of the data may be revealed that
would otherwise be hard to see.
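For an orthographic view along one of the coordinate axes, MIP reduces to taking the maximum along that axis, as in this NumPy fragment (the file name is a placeholder):

    import numpy as np

    volume = np.load("cartesian_volume.npy")  # placeholder 3D array
    mip_image = volume.max(axis=2)            # brightest sample per ray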
4.8 Shading
When viewing any real object, a viewer subconsciously uses the infor-
mation provided by shades on the object in order to understand how
the object is shaped. This effect can be used with good results in 3D
visualizations on a computer screen; the shades help the user a great
deal in understanding the different shapes and relations of objects.
The Z-buffer may be used for shading - pixels that correspond to vox-
els far away from the image plane can be darkened, for example. This
technique is called depth cueing. More accurate shades are obtained by
passing the Z-buffer, which essentially is a 2D image, to a gradient-
shader. The gradient at each (x, y) pixel location is evaluated:
$$ \nabla z = \begin{pmatrix} \partial z / \partial x \\ \partial z / \partial y \end{pmatrix} \qquad \text{(Eq 25)} $$
This estimates surface orientations; the gradient is parallel to the sur-
face normal. The partial derivatives can be approximated using back-
ward, forward or central differences; here is a central difference
approximation of the first component above:
$$ \frac{\partial z}{\partial x} \approx \frac{D(x+1, y) - D(x-1, y)}{2} \qquad \text{(Eq 26)} $$
D(x, y) is the depth associated with the pixel at (x, y). When gradi-
ents, approximating surface normals, have been calculated for all pix-
els in the output image, the gradient shader takes these and the
distance(s) from the light source(s) into account to produce the final,
shaded image. Angles between the normals and vectors from the sur-
faces to the light source(s) affect the output shade. (Gallagher, 1995,
pp 178)
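The whole procedure - central differences over the depth buffer, normals from the gradient, and a diffuse shade from the angle between normal and light - can be sketched as follows. The light direction and the single-light Lambertian model are simplifying assumptions:

    import numpy as np

    def gradient_shade(depth, light=(0.5, 0.5, 0.7)):
        dz_dy, dz_dx = np.gradient(depth)    # central differences (Eq 26)
        # A surface z(x, y) has the (non-unit) normal (-dz/dx, -dz/dy, 1).
        normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(depth)])
        normals /= np.linalg.norm(normals, axis=2, keepdims=True)
        light = np.asarray(light, dtype=float)
        light /= np.linalg.norm(light)
        # Diffuse shade: cosine of the angle between normal and light.
        return np.clip(normals @ light, 0.0, 1.0)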
Using an illumination model like the one described briefly above, with
a few extensions, light intensities can be calculated for each vertex in
every triangle. Then, the light intensity can be linearly interpolated
over the triangle in order to make the shading appear softer, especially
for curved surfaces that have been approximated with polygons. This
technique is known as Gouraud shading, and it can be performed by
dedicated hardware in the 3D graphics accelerator.
5 Implementation
5.1 Visualization software
Over the last few decades, computer power has developed at an incredible pace; a decade ago, 3D scientific visualization was something that could only be accomplished using supercomputers. Nowadays the same task is feasible using only a standard PC - even the cheapest graphics cards contain some 3D acceleration hardware. This evolution has led to the development of quite a few visualization software environments. Some of the most popular of these will be described briefly, and their suitability for incorporation into RAVE will be discussed. Before that, we take a look at the structural units of a 3D visualization system.
Figure 19. Block diagram of a typical visualization system.
(The diagram shows a layered stack: the graphical user interface and the high-level algorithms make up the visualization software, which draws through a 3D graphics API such as OpenGL; below that sit the graphics card driver, the operating system, and the graphics card itself.)
Using a visual programming system, little or no programming knowledge is needed to create advanced visualizations of complex processes. A screenshot from IRIS Explorer, which is representative for this category of visualization systems, is shown in Figure 20.
AVS/Express
This is a software environment for visualization developed by Advanced Visual Systems, AVS. Development and research started in 1988. AVS claims that more commercial applications have been developed with Express than with any other visualization tool. Express includes a comprehensive set of visualization modules. Express is highly modular, hierarchical and extensible - it is possible to write custom modules in C++ and include them in the system.
IRIS Explorer
IRIS Explorer was originally developed and distributed by Silicon Graphics, Inc. (SGI). Interest in the system grew over the years and SGI decided to move IRIS Explorer to an external independent software house. In 1994 the Numerical Algorithms Group (NAG) took over the development, porting, marketing, sales and support. IRIS Explorer runs on Linux, Unix and Windows platforms. Just like AVS, it is an extendable but expensive system that lacks support for Python. (IEC - Iris Explorer Center)
Figure 20. IRIS Explorer user interface.
can be started up and communicated with using Python (Py-
OpenDX).
VIS5D
This free package supports rendering of higher-dimensional data sets.
Isosurfaces, volume renderings, coloured slices etc. are possible opera-
tions that can be performed on these data sets. The software runs on
several platforms and was originally developed to aid in the visualiza-
tion of numerical simulations of the earth’s atmosphere and oceans.
There is no support for Python. (Vis5D Home Page)
in a vast range of different fields. Not one of the other packages
described above or known to the author has all these advantages. The
choice seems simple.
"OLD"
RAVE
GRAPHICAL
USER INTERFACE
MAIN
GUI
CUTTING
PLANE TOPOGRAPHY ... ISOSURFACE
MANAGER MANAGER MANAGER
CUTTING ...
PLANE TOPOGRAPHY ISOSURFACE
polar volume. These are used by the visualization objects described
below.
For each new object, a tab is created in the main window in which
object specific properties, such as colour, opacity, or elevation angle,
can be altered through different buttons, menus and sliders. These
GUI components along with event handlers are encapsulated in special manager classes - one class for each kind of object. Each manager object also has a reference to an actual object, e.g. an isosurface or a cutting plane. These objects in turn have references to VTK objects and contain object specific methods (e.g. getIsovalue in an isosurface object) used by the manager, typically in response to mouse events. The VTK objects are collected in a rendering window (not shown in Figure 21). A screenshot from the main GUI is shown in Figure 22. The GUI is created through a Python interface, called Tkinter, to a GUI package called Tk. In the following, the different objects are described briefly.
Figure 22. Main GUI screenshot when a few objects have been added.
5.4 Visualization objects
All visualization object classes are named raveVtkObjectName, e.g.
raveVtkIsosurface, and they all inherit some basic properties and
methods from an abstract class simply called raveVtkObject. This
class includes methods to get and set object description, position, vis-
ibility, opacity, and colouring, among a few other generic operations.
All objects get their input from the single raveVtkMain instance.
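Condensed to its essentials, the base class might look like the sketch below. The method names follow the class diagrams in Appendix A, but the bodies are illustrative guesses rather than the actual RAVE code:

    class raveVtkObject:
        # 'Abstract' superclass of all visualization objects.

        def __init__(self, main):
            self._main = main        # the single raveVtkMain instance
            self._actor = None       # created by concrete subclasses
            self._description = ""

        def getDescription(self):
            return self._description

        def setPosition(self, x, y, z):
            self._actor.SetPosition(x, y, z)   # delegate to the vtkActor

        def setVisibility(self, visible):
            self._actor.SetVisibility(1 if visible else 0)

        def setOpacity(self, opacity):
            self._actor.GetProperty().SetOpacity(opacity)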
For each radar there is also a binary land/water map with the same
size and resolution as the elevation map. This map is used to colour
the three-dimensional surface blue where there is water. Where there
is land, the colour depends on the height above sea level.
The radar object is simply a model of a radome with a foundation,
shown at the position of the real radar.
Figure 23. Topography, radar, bounding box, and height axis objects.
attribute values, e.g. scalars and/or vectors, are also represented. In this case, a scalar value for reflectivity is associated with each point and is used for colouring. Colours are interpolated between the vertices by the graphics hardware. The surface is not discontinuous where there is clear air (zero reflectivity return).
The manager of this object, shown as the currently selected object tab
in Figure 22 on page 51, contains controls to select what elevation
angle and what azimuth angles (start and range) to use. Note that the
azimuth span does not have to be 360 degrees - a smaller slice, or sec-
tor, can be shown instead of the whole revolution. These objects can
be made more or less transparent using an opacity slider.
Figure 25. A range/height indicator, or RHI, object, and its manager.
The elevation surface displays data for a single elevation and different
azimuth angles. This object performs the complementary operation:
it displays data for a single azimuth and different elevation angles.
The very same technique is used to generate the actual polygon data:
triangle strips are created using three neighbouring points to form
each triangle.
The manager for this type of object contains controls to select what azimuth angle to use and whether or not the RHI 180 degrees from this azimuth angle should be displayed together with it. As with most other objects, opacity can be controlled using a slider.
Figure 26. A semitransparent CAPPI object and its manager.
Using controls in the manager, the look of the glyphs can be changed.
Currently, the user can choose from using vertical or horizontal lines
or 3D crosses, but new glyphs are easily added. Upper and lower
thresholds for the glyphs can be set; glyphs are drawn only in those
points where the reflectivity value is between the thresholds. It is also
possible to choose to display every n-th glyph, where n can be set
using a slider.
Some glyphs are shown in Figure 27. A still image like this one (espe-
cially when printed!) is difficult to interpret. The user has to rotate
and pan the image in order to get an idea of the arrangement of sam-
ples. This goes for most of the objects.
Figure 27. A glyphs object.
Figure 28. A winds object, as seen from above.
displayed simultaneously. The opacity of an isosurface can be set
using a slider - otherwise only the surface with the lowest isovalue
would be visible. Isosurfaces may look something like the ones in Fig-
ure 30. Here, two isosurfaces are shown, the inner one (brighter) has
an isovalue of 18.5 dBZ, and the outer one is at 14.5 dBZ and has its
opacity set to 0.3 in order to reveal the inner one.
Figure 30. Two isosurface objects, with isovalues 14.5 and 18.5 dBZ for
outer and inner surface, respectively.
breakpoints. The user can add breakpoints by clicking at the position
where the points should be added. Once added, a point can be
dragged or its colour can be changed by first selecting the point and
then clicking the “Select colour” button. This brings up a colour
chooser.
Figure 32. A volume rendering object.
6 Evaluation and conclusions
6.1 Introduction
At the beginning of this project it was very unclear what the final result would be like. Little research has been done within the field of 3D visualization of radar data, so it was difficult to know what to expect. Thus, different ideas were discussed, different kinds of objects were tested in isolation, and the ones that were judged as useful have been included in the application. Informal evaluations have been performed during the development, and these have gradually set directions for the subsequent design. In this chapter, some of the thoughts and opinions that have arisen during discussions with researchers and meteorologists are presented, along with some conclusions that can be drawn from the work presented in this thesis. Areas of future research and possible extensions are also outlined.
6.2 Evaluation
In this section, the application's usefulness and performance are discussed.
6.2.1 Usefulness
Apart from the use as a research tool for the analysis of exceptional
weather situations, one of the areas where this application would be
useful is for weather forecasting around airports. At the end of this
project, the application will be installed at Arlanda airport, outside
Stockholm. Weather radars are used extensively around airports as
one of the most important decision-making tools when creating local,
short-term forecasts. Both heavy precipitation and strong winds heavily affect the traffic around airports; as we have seen, both may be analyzed using radar technology. The application has been demonstrated for a meteorologist working at Arlanda, and she believed that it offers the possibility to extract features from the current weather situation that are very difficult, if not impossible, to see using the existing 2D visualization systems. Some of her thoughts about a few objects are presented below. Her main thoughts on the application were very enthusiastic, and this has led to its installation at her workplace.
Range/height indicators
The height of precipitation clouds is important information when air-
craft should be routed around airports. This includes the altitude of
the clouds’ precipitation kernels and their thicknesses. The range/
height indicator is a common 2D product in radar visualization sys-
tems used to investigate just that. In this implementation, the main strength not offered by plain 2D systems lies in the ability to relate the RHI geographically to topography, and to other objects (including other RHIs) in an intuitive way. In the case of plain 2D systems, the
user has to create a 3D environment him/herself and this is not easy!
Using 3D technology, the computer takes over the time-consuming,
difficult task of building the 3D environment.
Isosurfaces
This is an object that takes some time to get used to, as it is a true 3D
object that has no equivalent in 2D visualization of radar data1. It is
useful in detecting and displaying the kernels, the areas of the most
dense precipitation, in clouds. By slowly decreasing the isovalue, the
isosurface expands and it is possible to see the extent of different pre-
cipitation densities.
For users at smhi and Arlanda, this is a completely new way of exam-
ining weather radar data and the users are enthusiastic.
Radar rays
The radar ray object displays a ray in the 3D view and it is possible to
display a plot to show how the reflectivity varies along the ray. Choosing elevation and azimuth angles so that the ray corresponds to a path followed by a landing aircraft or one taking off, it is possible to see what the aeroplane has to go through in terms of reflectivities.
1. The 2D equivalent of isosurfaces is the isoline. This is not used in radar visualization, though.
Volume rendering
In a volume rendering, the user basically has the possibility to view
the whole echo picture at once. The big problem here is the speed; it
just takes too long to render an image for the object to be useful.
When this problem is overcome, there will probably be great use for
it. In the meantime, the glyphs object can be used instead to provide
similar products.
6.2.2 Performance
The application performs well, mainly thanks to VTK being a well-written C++ library, and to hardware accelerated 3D graphics renderers. The 3D graphics are rendered at interactive speeds (at least ten
frames per second) in windows that are some 1 000 x 1 000 pixels in
size when a modern, though not state-of-the-art, graphics card is
used. The application has been tested with the two configurations
shown in Table 2.
              System A                    System B
Processor     500 MHz Intel Pentium 3    600 MHz AMD K7
when using System B; rendering speeds are at least two times the ones
achieved with System A.
There are two cases when the application shows significant processing
delays: the first one has been mentioned several times already, and it
has to do with volume rendering. The calculations are performed
completely by the CPU; there is no dedicated hardware for this1. Using
the systems above, there is however a notable difference between Sys-
tem A and System B (i.e. the latter is faster and performs better). As
of this writing, a new computer has a clock speed of around 1.5 GHz
and this difference would probably be significant. This exciting test
remains to be executed.
to manually and automatically step forward and backward in the
sequence.
• Saving options. The user should be able to save settings; this may
include saving camera and object settings to a file, but it may also
include the possibility to store camera positions temporarily during
a session, i.e. “camera bookmarks”. The user should also be able to
save the current view as a still image.
• Composites. The user should be able to view data from several
radars simultaneously.
6.4 Conclusions
The purpose of this work, as stated in Section 1.2 on page 1, was “to
design and implement a generic, object-oriented application for 3D
visualization of weather radar data”. The application developed dur-
ing this project is a complete, generic, ready-to-use tool for 3D visual-
ization and analysis of weather radar data.
7 Resources
7.1 Books and articles
Bolin, Håkan & Daniel B. Michelson: The Radar Analysis and Visual-
ization Environment: RAVE. Preprints, American Meteorological
Society: 28th Conference on Radar Meteorology (1997), pp 228-229.
Schroeder, Will et al: The Visualization Toolkit User’s Guide. Kitware,
Inc., 2001.
7.2 Internet
Advanced Visual Systems: Advanced Visual Systems: Software. http://
www.avs.com/software/soft_t/avsxps.html. Visited 2001-09-11.
Appendix A: Class diagrams
A.1 Introduction
An overview of the basic design of the application developed during
this project was given in Section 5.3 on page 50. As described in that
section, there are two basic subsystems: the graphical user interface
and the visualization objects. In this appendix there are two class dia-
grams describing the relations, methods and parameters of the differ-
ent classes within these two subsystems.
All classes developed during this project are written in the Python
language. The software runs on Linux but could easily be adapted to
run on Unix platforms as well.
Figure 34. Graphical user interface class diagram.
The figure shows the following classes and members. Transform3D_GUIPlugin holds a reference (_vtkMain) to raveVtkMain, which extends Tk.Toplevel.

raveVtkMain:
+tabs: Pmw.NoteBook
+showBoxCheck: Tk.Checkbutton
+showAxisCheck: Tk.Checkbutton
+showRadarCheck: Tk.Checkbutton
+showHistogramCheck: Tk.Checkbutton
+addActorOption: Pmw.OptionMenu
+addActorButton: Tk.Button
+getPolyData(voltype:int, azimstep:int, binstep:int): vtkPolyData
+getStructuredPoints(voltype:int, wl:int, h:int, null:int): vtkStructPts
+openDataVolumes()
+initRenderingFrame()
+openVolume(type:int): boolean
+openTopography(): boolean
+openLandWater(): boolean
+getVolume(type:int): rave.pvol
+getTopography(): rave.image
+getLandWater(): rave.image
+hasVolume(type:int): boolean
+hasTopography(): boolean
+hasLandWater(): boolean
+getVolumeInfo(voltype:int): pvol.info
+getAreaId(pvol:rave.pvol): string
+getVads(distance:float): rave.series
+getHistogram(voltype:int): Histogram
+getSlopeAndOrd(voltype:int): float[2]
+getScalarRange(type:int): float[]
+getLookupTable(voltype:int): vtkLookupTable
+getBinsAtElevations(voltype:int): list[float[]]
+getAzimuths(voltype:int): float[]
+getElevations(format:string): float[]
+getClosestElevation(elev:float, voltype:int): float[2]
+getAzimuthResolution(voltype:int): float
+getRangeHeightDistance(elev:float, dplr:boolean): float[][]
+getBoundingBoxPoint(pos:string, km:boolean, dplr:boolean): float[3]
+getBoundingBoxCenter(km:boolean, doppler:boolean): float[3]
+addObject(c:classRef)
+removeObject(description:string)
+getObjectsOfType(type:string): raveVtkObject[]
+render()
+chooseColor(): int[3]
+displayErrorDialog(message:string)

raveVtkRenderingFrame (extends Tk.Toplevel):
+addActor(actor:vtkProp)
+removeActor(actor:vtkProp)
+render()

raveVtkMainInitThread:
+run()

raveVtkManager (holds a reference to a raveVtkObject, which produces a vtkActor):
+showCheck: Tk.Checkbutton
+opacitySlider(kw): Tk.Scale
+removeButton(kw): Tk.Button
+chooseColorButton(kw): Tk.Button
+lowerThresholdSlider(kw): Tk.Scale
+upperThresholdSlider(kw): Tk.Scale
+getDescription(): string
+getObject(): raveVtkObject
+objectCreatedSuccessfully(): boolean

raveVtkPlotUtilities:
+setDecorationColor(col:float[3])
+setPlotColor(col:float[3])
+getX(x:float): float
+getY(y:float): float
+getXvalue(x:float): float
+getYvalue(y:float): float
+savePlot(plot:Tk.Canvas)

raveVtkXYPlotCanvas:
+setXvalues(x:float[])
+setYvalues(y:float[])
When the instance variables have been set, some files are opened:
polar volumes with reflectivity, Doppler reflectivity, and wind values;
and elevation and land/water maps for the topography. A thread is created and started (an instance of raveVtkMainInitThread) which
prepares some common data: vtkPolyData and histograms, for
instance. The prepared data is handed over to the raveVtkMain
instance, and this initialization thread destroys itself.
Then the actual graphical user interface is created: buttons and labels
are added to the window, as well as a tabbed pane to contain the man-
agers that are added with each new object. Event callbacks are associ-
ated with every clickable item. The definitions of the event callbacks
are in the raveVtkMain class itself or in the manager class to which
their corresponding widgets belong (they are not included in the class
diagram, though). Their main function is usually to pass on the call to
some other method (e.g. addObjectClicked() calls addObject() with the appropriate arguments).
Once the rendering frame has been created, a few objects - topogra-
phy, radar, bounding box, and axis - are created. When the topogra-
phy object is created, a topography manager object (instance of
raveVtkTopographyManager) is created and a new tab is added to
the tabbed pane in the raveVtkMain window.
A.2.1 Managers
There is a manager class for each visualization object class. The man-
ager is responsible for creating widgets in the raveVtkMain
instance’s tabbed pane. A reference to a newly created tab is passed to
the manager’s constructor when this is called from raveVtkMain.
The constructor initializes instance variables and widgets.
1. Actually, there is no support for abstract classes in Python. It is always possible to create instances of every class, but in the case of raveVtkManager, it would make no sense to do so - on its own, it is useless!
A.3 Visualization objects
This is the second group of classes and the class diagram is shown in
Figure 35. Again, there is an “abstract” superclass of all objects, sim-
ply called raveVtkObject, that contains some attributes and meth-
ods that are common to all concrete subclasses. These include, for
instance, the setPosition() and setColor() methods.
All object classes serve as containers for some VTK objects. These objects are connected to form a visualization pipeline for each different type of object. The pipeline typically starts with data from raveVtkMain's getPolyData() or getStructuredPoints() methods, and it always ends with an instance of vtkActor, or one of its subclasses, and one of vtkMapper's subclasses.
One of the simplest possible visualization pipelines can be found in the raveVtkIsosurface object. It starts with the output from raveVtkMain.getPolyData(), adds a vtkMarchingCubes object, and a mapper and actor. When the object has been created, the isovalue can be set using the raveVtkIsosurface.setIsovalue() method, and this is the method that is called when the user adjusts the isovalue using the isovalue slider in the manager. This method, in turn, calls the SetValue() method of the vtkMarchingCubes instance. All objects basically operate this way: the managers issue calls to the objects due to user interactions with the different sliders, menus, and buttons, and the objects pass the calls on to some underlying VTK object. Finally, a rerendering is requested from the vtkTkRenderWidget instance in order to reflect the changes.
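Put together, the pattern might be sketched as below, building on the raveVtkObject sketch in Chapter 5. The data connection and all error handling are omitted, and the bodies are illustrative rather than the actual code:

    import vtk

    class raveVtkIsosurface(raveVtkObject):
        def __init__(self, main):
            raveVtkObject.__init__(self, main)
            self._cubes = vtk.vtkMarchingCubes()
            # ... input data from raveVtkMain is connected here ...
            self._mapper = vtk.vtkPolyDataMapper()
            self._mapper.SetInputConnection(self._cubes.GetOutputPort())
            self._actor = vtk.vtkActor()
            self._actor.SetMapper(self._mapper)

        def setIsovalue(self, isovalue):
            # Called by the manager when the isovalue slider moves; the
            # call is passed on to the underlying VTK object, and a
            # rerendering is requested.
            self._cubes.SetValue(0, isovalue)
            self._main.render()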
The publishers will keep this document online on the Internet - or its possible replace-
ment - for a considerable time from the date of publication barring exceptional circum-
stances.
The online availability of the document implies a permanent permission for anyone
to read, to download, to print out single copies for your own use and to use it
unchanged for any non-commercial research and educational purpose. Subsequent
transfers of copyright cannot revoke this permission. All other uses of the document are
conditional on the consent of the copyright owner. The publisher has taken technical
and administrative measures to assure authenticity, security and accessibility.
According to intellectual property law the author has the right to be mentioned
when his/her work is accessed as described above and to be protected against infringe-
ment.
For additional information about the Linköping University Electronic Press and its
procedures for publication and for assurance of document integrity, please refer to its
WWW home page: http://www.ep.liu.se/
© Aron Ernvik