
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Basic Principles of Remote Sensing

Shashi Kumar
Scientist ‘SD’
shashi@iirs.gov.in

8/11/2015
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 Remote Sensing is the art, science and technology of observing an object, scene, or phenomenon by instrument-based techniques.
 Remote: the observation is made at a distance, without physical contact with the object of interest.
 Sensing: the detection of energy, such as light or another form of electromagnetic energy.
 In short: measurement from a distance.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Definitions
 Remote Sensing is the science of acquiring, processing and interpreting images that record the interaction between electromagnetic energy and matter. (Sabins, 1996)

 The term Remote Sensing means the sensing of the Earth's surface from space by making use of the properties of electromagnetic waves emitted or reflected by the sensed objects, for the purpose of improving natural resources management, land use and the protection of the environment. (UN, 1999)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing Process


[Diagram: the remote sensing process: energy source (A), atmosphere (B), target (C), sensor (D), processing station (E), analysis (F), application (G).]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing Process


The process in remote sensing involves an interaction
between incident radiation and the targets of
interest.
The following seven elements are involved in this
process:
 Energy Source or Illumination (A) - the first
requirement for remote sensing is to have an energy
source which illuminates or provides electromagnetic
energy to the target of interest.
 Radiation and the Atmosphere (B) - as the energy
travels from its source to the target, it will come in
contact with and interact with the atmosphere it passes
through. This interaction may take place a second time as
the energy travels from the target to the sensor.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing Process


 Interaction with the Target (C) - once the energy
makes its way to the target through the atmosphere,
it interacts with the target depending on the
properties of both the target and the radiation.
 Recording of Energy by the Sensor (D) - after the
energy has been scattered by, or emitted from the
target, we require a sensor (remote - not in contact
with the target) to collect and record the
electromagnetic radiation.
 Transmission, Reception, and Processing (E) -
the energy recorded by the sensor has to be
transmitted, often in electronic form, to a receiving
and processing station where the data are processed
into an image (hardcopy and/or digital).
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing Process

 Interpretation and Analysis (F) - the processed


image is interpreted, visually and/or digitally, to
extract information about the target which was
illuminated.
 Application (G) - the final element of the remote
sensing process is achieved when we apply the
information we have been able to extract from the
imagery about the target in order to better understand
it, reveal some new information, or assist in solving a
particular problem.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing
 Geospatial data acquisition (GDA):
Collection, processing and analysis of data
for various purposes:
 Water management
 Land management
 Resource management, etc.

 Data: representations that can be manipulated by a computer
 Information: interpreted data
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote sensing platforms
 Satellite-based
 Airplane-based
 Ground-based
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing Sensors


 Passive sensors : collect electromagnetic
radiation in the visible and infra-red part of
the spectrum:
 Aerial photographs
 Low resolution: Landsat, ASTER, SPOT, IRS
 High Resolution: Quickbird, IKONOS

 Active sensors : generate their own radiation:


 Air-borne RADAR
 Space borne RADAR: RISAT-1, RADARSAT
 Lidar (laser scanner)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Four types of resolution

 Spatial resolution

 Spectral resolution

 Radiometric resolution

 Temporal resolution
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

IKONOS IMAGE
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

IKONOS IMAGE OF DEHRADUN


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Advantages of remote sensing


1. Global coverage
2. Synoptic view
3. Repeatability
4. Cost
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

PHYSICS OF REMOTE
SENSING
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 Remote Sensing relies on the measurement of ElectroMagnetic (EM) energy.
 The most important source of EM energy is the Sun. Some sensors detect energy emitted by the Earth itself or provide their own energy (radar).
 EM energy can be modelled by (1) waves or (2) energy-bearing particles called photons.
 The two descriptions are not really contradictory: the energy is emitted as photons, but its statistical distribution over time is described by a wave.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Electromagnetic Waves
 EM waves are produced by the motion of electric charges and consist of:
 an Electric Field (E), which varies in magnitude in a direction perpendicular to the direction in which the radiation is travelling, and
 a Magnetic Field (M), oriented at right angles to the electric field.
Both fields travel at the speed of light (c).
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Wavelength & frequency


 Wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests.
 Wavelength is usually represented by the Greek letter lambda (λ).
 Frequency refers to the number of cycles of a wave passing a fixed point per unit of time.
 Frequency is normally measured in hertz (Hz), equivalent to one cycle per second, and various multiples of hertz.
Wavelength and frequency are related by the following formula:

ν = c / λ

where ν = frequency, λ = wavelength and c = speed of light.
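To see the relation numerically, here is a minimal Python sketch (the function name and the example wavelength are illustrative, not from the slides):

```python
# Minimal sketch: converting wavelength to frequency with nu = c / lambda.
C = 3.0e8  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Return the frequency (Hz) of an EM wave of the given wavelength (m)."""
    return C / wavelength_m

# Example: a 0.55 um (green) wavelength corresponds to ~5.45e14 Hz.
print(frequency_hz(0.55e-6))
```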
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Electromagnetic Spectrum
 The electromagnetic spectrum ranges from the shorter
wavelengths (including gamma and x-rays) to the longer
wavelengths (including microwaves and broadcast radio
waves).

 There are several regions of the electromagnetic spectrum


which are useful for remote sensing.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Visible range

The visible wavelengths cover a range from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the shortest is violet.

Violet: 0.4 - 0.446 µm
Blue: 0.446 - 0.500 µm
Green: 0.500 - 0.578 µm
Yellow: 0.578 - 0.592 µm
Orange: 0.592 - 0.620 µm
Red: 0.620 - 0.7 µm
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Applications of Visible Bands

► Visible Blue Band (0.45-0.52 µm)
► Visible Green Band (0.52-0.60 µm)
► Visible Red Band (0.63-0.69 µm)
► Panchromatic Bands (0.50-0.90 µm)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Visible Blue Band (0.45-0.52 Micrometers)

► Greatest water penetration
► Greatest atmospheric scattering
► Greatest absorption
► Used for: water depth, water characteristics, detection of subsurface features, soil and vegetation discrimination
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Visible Green Band (0.52-0.60 Micrometers)

► Vegetation discrimination
► Urban Infrastructure
► Less affected by atmospheric scattering
► Chlorophyll Concentration
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Visible Red Band (0.63-0.69 Micrometers)

► Chlorophyll absorption band of healthy green vegetation
► Vegetation type
► Plant condition
► Least affected by atmospheric scattering
► Less water penetration, but good near-surface information, i.e. water quality, sediment and chlorophyll
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Panchromatic Bands
(0.50-0.90 Micrometers)

► Wide range of sensitivity: visible to near-IR
► Higher spatial resolution
► Can be combined with other multispectral bands
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Infrared range
 The infrared region covers the wavelength range from approximately 0.7 µm to 100 µm.
 It can be divided into two categories based on radiation properties: the reflected IR and the emitted or thermal IR.
 The reflected IR covers wavelengths from approximately 0.7 µm to 3.0 µm.
 The thermal IR covers wavelengths from approximately 3.0 µm to 100 µm.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Thermal IR Remote Sensing


 Thermal infrared radiation refers to electromagnetic waves with a wavelength between 3 and 20 micrometers.

 Most remote sensing applications use the 3 to 5 and 8 to 14 micrometer ranges, the atmospheric windows between the absorption bands.

 The main difference between thermal infrared and near infrared is that thermal infrared is emitted energy, whereas near infrared is reflected energy, similar to visible light.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Microwave Remote Sensing


 The portion of the spectrum of more recent interest to remote sensing is the microwave region, from about 1 mm to 1 m.

 Longer-wavelength microwave radiation can penetrate through cloud, fog, haze, etc., as the longer wavelengths are not susceptible to the atmospheric scattering that affects shorter optical wavelengths.

 Active microwave systems have night-vision capability.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

[Images: L-band ALOS PALSAR data for Dehradun and for Dudhwa National Park.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

RISAT-1 hybrid-pol data of Dehradun:
Pink and white: even (double) bounce scattering from urban features
Green: volume scattering from forest due to multiple reflections
Dark blue: surface scattering
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Planck’s law
It describes the amount of electromagnetic energy with a certain wavelength radiated by a black body in thermal equilibrium (i.e. the spectral radiance of a black body):

Mλ = c1 λ^(-5) [exp(c2 / (λT)) - 1]^(-1)

where c1 (3.74x10^-16 W m^2) and c2 (1.44x10^-2 m K) are constants, λ is the wavelength and T is the absolute temperature.
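A minimal Python sketch of this formula, using the constants quoted on the slide (function and variable names are illustrative):

```python
import math

# Black-body spectral exitance (Planck's law), with the slide's constants:
# c1 = 3.74e-16 W m^2, c2 = 1.44e-2 m K.
C1 = 3.74e-16  # W m^2
C2 = 1.44e-2   # m K

def spectral_exitance(wavelength_m: float, temp_k: float) -> float:
    """M_lambda in W m^-3 (W per m^2 of surface per m of wavelength)."""
    return C1 * wavelength_m**-5 / math.expm1(C2 / (wavelength_m * temp_k))

# Example: the Sun (~6000 K) radiates strongly near 0.5 um; the Earth
# (~300 K) near 10 um.
print(spectral_exitance(0.5e-6, 6000))
print(spectral_exitance(10e-6, 300))
```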
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Stefan’s Boltzman’s Law

 The total energy M within all the wavelength can be


found out by integrating the Planck’s equation from
=0 to  =
 Total Energy radiated by blackbody is

M=σT4
Where
M = total radiant exitance from the surface of a
material (W m-2 )  = 5.67 x 10-8 W/m2 (oK)4 (Stefan
Boltzman Constant) T = Absolute Temperature (K) of a
radiating body
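A one-line Python sketch of the law (the example temperature is illustrative):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_k: float) -> float:
    """Total radiant exitance M = sigma * T^4 of a black body, in W m^-2."""
    return SIGMA * temp_k**4

# Example: a surface at 300 K emits about 459 W m^-2.
print(radiant_exitance(300.0))
```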
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Wien's Displacement Law

 The spectral distribution of energy also varies with temperature.
 The dominant wavelength, at which a black-body radiation curve reaches its maximum, is related to temperature by Wien's law:

λmax = A / T

where λmax is the wavelength of maximum spectral exitance (µm), A = 2898 µm K and T is the absolute temperature (K).

The hotter the object, the shorter the wavelength of the maximum intensity emitted.
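A small Python sketch of the displacement law (the example temperatures are illustrative):

```python
WIEN_CONSTANT = 2898.0  # um K

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength of maximum black-body emission, in micrometres."""
    return WIEN_CONSTANT / temp_k

# Example: Sun (~6000 K) -> ~0.48 um (visible);
# Earth (~300 K) -> ~9.7 um (thermal IR).
print(peak_wavelength_um(6000.0), peak_wavelength_um(300.0))
```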
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Specular vs Diffuse Reflection

 Specular, or mirror-like, reflection occurs when the surface is smooth.
 Diffuse reflection occurs when the surface is rough.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Criterion of Roughness
The Rayleigh criterion considers a surface to be smooth if
h < λ / (8 sin θ)
where h = height of surface irregularities, λ = wavelength and θ = incidence angle,
and defines the surface as rough if
h > λ / (8 sin θ).
Peake and Oliver's modified Rayleigh criterion defines smooth, rough and intermediate surfaces:
h < λ / (25 sin θ) - smooth
h > λ / (4.4 sin θ) - rough
λ / (25 sin θ) < h < λ / (4.4 sin θ) - intermediate surfaces
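A Python sketch applying the two Peake-Oliver thresholds; the example wavelengths are typical C- and L-band radar values, chosen here purely for illustration:

```python
import math

def surface_class(h_m: float, wavelength_m: float, incidence_deg: float) -> str:
    """Classify a surface with the Peake-Oliver (modified Rayleigh) criterion.

    h_m: height of surface irregularities (m); incidence_deg: the angle
    used in the slide's sin(theta) term.
    """
    s = math.sin(math.radians(incidence_deg))
    if h_m < wavelength_m / (25.0 * s):
        return "smooth"
    if h_m > wavelength_m / (4.4 * s):
        return "rough"
    return "intermediate"

# Example: 5 cm irregularities look rough at C-band (5.6 cm) but only
# intermediate at L-band (23.5 cm), at a 40 degree angle.
print(surface_class(0.05, 0.056, 40.0), surface_class(0.05, 0.235, 40.0))
```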
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

[Figure: Peake and Oliver's modified Rayleigh criterion.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Reflectance
 Spectral reflectance, ρ(λ), is the ratio of reflected energy to incident energy as a function of wavelength.
 The reflectance characteristics of the Earth's surface features are expressed by spectral reflectance, which is given by:

ρ(λ) = ( R(λ) / I(λ) ) x 100

where ρ(λ) = spectral reflectance at a particular wavelength, R(λ) = energy of that wavelength reflected from the object and I(λ) = energy of that wavelength incident upon the object.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Reflectance Curve


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Atmospheric Scattering
Scattering occurs when gas molecules and particles in the atmosphere redirect the EM waves from their original path.

 Rayleigh scattering: atmospheric particles smaller than the wavelength of the incoming radiation
 Mie scattering: atmospheric particles about the same size as the wavelength of the incoming radiation
 Non-selective scattering: atmospheric particles larger than the wavelength of the incoming radiation
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Atmospheric Windows
 Gases absorb electromagnetic energy in very specific
regions of the spectrum
 Those areas of the spectrum which are not severely
influenced by atmospheric absorption and thus, are useful
to remote sensors, are called atmospheric windows
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Atmospheric Windows

Name: Wavelength Range
Ultraviolet-Visible: 0.30 - 0.75 µm
Near-IR: 0.77 - 0.90 µm
Short-wave IR: 1.00 - 1.12, 1.19 - 1.34, 1.55 - 1.75, 2.05 - 2.40 µm
Thermal IR: 3.50 - 4.16, 4.50 - 5.00, 8.00 - 9.20, 10.20 - 12.40, 17.0 - 22.0 µm
Microwave: 2.06 - 2.22, 7.50 - 11.50 and > 20.00 mm
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Contact Details of the Faculty:

Email- shashi@iirs.gov.in
Tel- 0135-252-4119
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Earth Observation
Sensors and Platforms

Vinay Kumar
Scientist, PRSD
vinaykumar@iirs.gov.in
8/13/2015
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Remote Sensing Process


[Diagram: energy source → target → sensor → satellite communication → ground receiving & processing station → analysis → application.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Sensors and platforms are used to create image data of the Earth.

Sensor: a device that records EM energy.
Platform: the carrier on which a sensor is mounted.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

History of remote sensing


 1827 - first photograph
 1858 - first aerial photograph from a hot air balloon
 1861-1865 - Balloon photography used in American Civil War
 1888 – ‘rocket’ cameras
 1903 - pigeon-mounted camera patented
 1906 - photograph from a kite
 1914-1945 - Plane mounted Cameras WWI, WWII
 1956 - U2 spy planes
 1957 - Sputnik-1
 1960 - 1st meteorological satellite ‘TIROS-1’ launched
 1967 - NASA ‘Earth Resource Technology Satellite’ programme
 1972 - ERTS (Landsat) 1 launched...
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 Ground based
 Airborne
 Spaceborne
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

• Used to record detailed information


about the surface which is compared
with information collected from
aircraft or satellite sensors.

• In some cases, this can be used to


better characterize the target which
is being imaged by the other sensors,
making it possible to better
understand the information in the
imagery.

• Sensors may be placed on a ladder,


scaffolding, tall building, crane, etc.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

• Airborne platforms are primarily fixed-wing aircraft, although helicopters are occasionally used.

• They collect very detailed images and facilitate the collection of data over any portion of the Earth's surface at any time.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 Space remote sensing is conducted from the Space Shuttle or, more commonly, from satellites.
 Satellites are objects which revolve around another object, in this case the Earth.
 For example, the Moon is a natural satellite, whereas man-made satellites include platforms launched for remote sensing, communication and telemetry (location and navigation) purposes.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 The path followed by a satellite is called its orbit.
 The satellite moves according to Kepler's laws:

1st law: The path followed by each planet is an ellipse with the Sun at one focus.
2nd law: The line joining the planet to the Sun sweeps out equal areas in equal times.
3rd law: The square of the period of the planet is proportional to the cube of the semi-major axis: T² ∝ a³.
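Kepler's third law gives the orbital period directly. A Python sketch, assuming a circular orbit and standard values for Earth's gravitational parameter and radius (these constants are not given in the slides):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3 s^-2
R_EARTH = 6371e3           # mean Earth radius, m

def orbital_period_minutes(altitude_m: float) -> float:
    """Period of a circular orbit, T = 2*pi*sqrt(a^3 / mu), in minutes."""
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2.0 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60.0

# Example: ~705 km (Landsat-like) -> ~99 minutes;
# ~35,786 km -> ~24 hours (geostationary).
print(orbital_period_minutes(705e3))
print(orbital_period_minutes(35786e3) / 60.0)  # in hours
```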
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Orbital characteristics:
 Altitude
 Inclination angle
 Period
 Repeat cycle
 Swath
 Ascending pass & descending pass
 Perigee
 Apogee

[Diagram: satellite orbit and its inclination relative to the equatorial plane of the Earth.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Sun-synchronous (near-polar) orbit:
• Altitude ~700-800 km
• Orbit inclination ~98.7º
• Orbital period ~90 minutes
• Sun-synchronous, near-polar, near-circular
• The orbit is fixed in space (basically north-south); the Earth rotates beneath it (west-east)
• Crosses the equator (N-S) at ~10:30 a.m. local time
• The orbital plane is near-polar and the altitude is such that the satellite passes each place at the same local sun time
• Covers the entire globe - LANDSAT, SPOT, NOAA, IRS, etc.

Geostationary orbit:
• Altitude ~36,000 km
• Orbit inclination ~0°
• Period of orbit = 24 hours
• A W-E satellite orbit around the Earth
• Global coverage requires several geostationary satellites at different longitudes
• Good for repetitive observations, poor for spatially detailed data
• Large distortions at high latitudes
• Mainly used for communication and meteorological applications - GOES, METEOSAT, INSAT, etc.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

The satellite's orbit (north-south) and the rotation of the Earth (west to east) work together to allow complete coverage of the Earth's surface once the satellite has completed one full cycle of orbits.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Passive systems: optical remote sensing.
Active systems: microwave remote sensing.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Passive sensors:
1. Photographic camera
2. Optical scanners: (a) across-track, (b) along-track
3. Thermal scanner
4. Spectro-radiometers

Active sensors:
1. RADAR (Radio Detection and Ranging)
2. LiDAR (Light Detection and Ranging)
3. Laser distance meter
4. Laser water-depth meter
5. Microwave altimeter
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

• Whisk-broom scanning
• Scans the Earth in a series of lines.
• The lines are oriented perpendicular to the direction of motion of the sensor platform (i.e. across the swath).
• Data are collected within an arc below the system, typically of some 90º to 120º.
• The Multispectral Scanner (MSS) and Thematic Mapper (TM) of LANDSAT, and the Advanced Very High Resolution Radiometer (AVHRR) of NOAA, are examples of whisk-broom scanners.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 Push-broom scanning
 Scans the Earth in a series of lines.
 Uses the forward motion of the platform to record successive scan lines, perpendicular to the flight direction, and build up a two-dimensional image.
 Linear arrays normally consist of numerous charge-coupled devices (CCDs) positioned end to end.
 The Linear Imaging Self-Scanning sensor (LISS) and Wide Field Sensor (WiFS) of the IRS series, and the High Resolution Visible (HRV) of SPOT-1, are examples of push-broom scanners.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Resolution
Ability of the system to render the information at the
smallest discretely separable quantity in terms of
distance (spatial), wavelength band of EMR (spectral),
time (temporal) and radiation (radiometric)

The Four Resolutions of Remote Sensing


• Spectral
• Spatial
• Temporal
• Radiometric
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Resolution
• Spectral resolution describes the ability of a sensor to define fine wavelength intervals.
• It refers to the number of bands in the spectrum in which the instrument can take measurements.
• Higher spectral resolution = better ability to exploit differences in spectral signatures.
• Sensor classes by spectral resolution: panchromatic, multispectral, hyperspectral.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

True Color and False Color Image


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spatial Resolution
 The physical dimension on the Earth that is recorded: pixel = detector size.
 It refers to the amount of detail that can be detected by a sensor.
 Detailed mapping of land use practices requires a much greater spatial resolution.

Instantaneous Field of View (IFOV): the solid angle through which a detector is sensitive to radiation.

IFOV = D/F radians
GRE = IFOV x H

where GRE = ground resolution element, D = detector dimension, F = focal length and H = flying height.

[Diagram: CCD linear array behind a lens; the IFOV projects to a ground pixel along the flight direction.]
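A minimal Python sketch of the GRE computation (the detector size, focal length and altitude in the example are illustrative values):

```python
def ground_resolution_element(detector_m: float, focal_m: float,
                              height_m: float) -> float:
    """GRE = IFOV * H, with IFOV = D / F in radians."""
    ifov_rad = detector_m / focal_m
    return ifov_rad * height_m

# Example: a 10 um detector behind a 0.5 m focal length at 700 km
# altitude gives a ~14 m ground resolution element.
print(ground_resolution_element(10e-6, 0.5, 700e3))
```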
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Desirable Spatial Resolution

Meteorology: cloud patterns, movement - 1-2 km; water vapour analysis - 8 km
Oceanography: ocean colour monitoring (chlorophyll, sediment map, yellow substance, sea surface temperature mapping) - 300-1100 m
Land use: crop monitoring, forest mapping, hydrology, etc. - 20-30 m
Cartography, urban planning: 2-6 m
Military surveillance: ≤ 1 m
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Radiometric Resolution
• It describes the actual information content in an image.

• Sensitivity to the magnitude of the electromagnetic energy


determines the radiometric resolution.

• The radiometric resolution of an imaging system describes


its ability to discriminate very slight differences in energy.

• The finer the radiometric resolution of a sensor, the more


sensitive it is to detecting small differences in reflected or
emitted energy.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Radiometric Resolution
2^(number of bits) = number of grey levels

bits | grey levels | range (b-w)
1 | 2 | 0-1
2 | 4 | 0-3
3 | 8 | 0-7
4 | 16 | 0-15
5 | 32 | 0-31
6 | 64 | 0-63
7 | 128 | 0-127
8 | 256 | 0-255
9 | 512 | 0-511
10 | 1024 | 0-1023

[Images: the same scene rendered with 2, 16 and 256 colours.]
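The table follows directly from the 2^bits relation; a tiny Python sketch reproduces it:

```python
def grey_levels(bits: int) -> tuple[int, int]:
    """Number of grey levels and the top of the 0..(2^bits - 1) DN range."""
    levels = 2 ** bits
    return levels, levels - 1

# Example: an 8-bit sensor -> 256 levels, DN range 0-255;
# a 10-bit sensor -> 1024 levels, DN range 0-1023.
for b in (1, 8, 10):
    print(b, *grey_levels(b))
```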
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Temporal Resolution
Represents the frequency with which a satellite can revisit an area of interest and acquire a new image. It depends on the instrument's field of view and the satellite's orbit.

Application demands:
Meteorology → hourly (to monitor clouds)
Oceanography → 2-3 day revisit
Stereo viewing → 0-1 day revisit
Vegetation monitoring → 5 day revisit
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Various Earth Observation Satellites


France (distribution: SPOT Imaging): SPOT1-86 10 m; SPOT2-90 10 m; SPOT3-93/96; SPOT4-98 10 m; SPOT5-02 3 m + HRS 10 m; SPOT6-2012; SPOT7-2014
ESA (distribution: miscellaneous): ERS1-92/00 radar; ERS2-95 radar; ENVISAT-2001 radar
Middle East - Israel (distribution: ImageSat): EROS A/1-00 2 m; EROS B/1-02 1 m
USA (distribution: SI-EOSAT, Earthwatch, Orbimage, USGS): LANDSAT5-85 30 m; LANDSAT6-93; LANDSAT7-99 15 m; Landsat8-2013; EARLYBIRD-98 1 m; IKONOS1-99 1 m; IKONOS2-99 1 m; QUICKBIRD-01 0.6 m; ORBVIEW-01 1 m; ORBVIEW-02 1 m
Canada (distribution: RADARSAT): RADARSAT-95; RADARSAT-03
India (distribution: NRSC-EOSAT): IRS1C-95 6 m; IRS1D-97 6 m; IRS P6-2003 5.8 m; CARTOSAT1 2.5 m; CARTOSAT2 80 cm
Japan (distribution: JAXA): ALOS 2.5 m
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N
Platform | Launched | Altitude | Equator Crossing | Adjacent Orbits | Repeat Coverage | Sensors
Landsat 1 | 1972 | 912 km | 8:50 a.m. | 1 day | 18 days | RBV, MSS
Landsat 2 | 1975 | 912 km | 9:08 a.m. | 1 day | 18 days | RBV, MSS
Landsat 3 | 1978 | 912 km | 9:31 a.m. | 1 day | 18 days | RBV, MSS
Landsat 4 | 1982 | 705 km | 9:45 a.m. | 7 days | 16 days | MSS, TM
Landsat 5 | 1984 | 705 km | 9:45 a.m. | 7 days | 16 days | MSS, TM
Landsat 6 | 1993 | 705 km | 10:00 a.m. | 7 days | 16 days | MSS, ETM
Landsat 7 | 1999 | 705 km | 10:00 a.m. | 7 days | 16 days | ETM+
Landsat 8 | 2013 | 705 km | 10:00 a.m. | 7 days | 16 days | OLI & TIRS
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N
Landsat sensors

Return Beam Vidicon (RBV): spatial resolution 80 m (Landsat 1, 2), 30 m (Landsat 3); swath width 185 km
 RBV1 0.475-0.575 µm (green)
 RBV2 0.580-0.680 µm (red)
 RBV3 0.690-0.830 µm (near IR)

Multispectral Scanner (MSS): spatial resolution 80 m; swath width 185 km
 MSS4 0.5-0.6 µm (green)
 MSS5 0.6-0.7 µm (red)
 MSS6 0.7-0.8 µm (near IR)
 MSS7 0.8-1.1 µm (near IR)

Thematic Mapper (TM): spatial resolution 30 m (120 m for TM6); swath width 185 km
 TM1 0.45-0.52 µm (blue)
 TM2 0.52-0.60 µm (green)
 TM3 0.63-0.69 µm (red)
 TM4 0.76-0.90 µm (near IR)
 TM5 1.55-1.75 µm (mid IR)
 TM7 2.08-2.35 µm (mid IR)
 TM6 10.4-12.5 µm (thermal IR)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N
Landsat sensors

Enhanced Thematic Mapper+ (ETM+), swath width 185 km:
 ETM+1 0.45-0.52 µm (blue), 30 m
 ETM+2 0.53-0.61 µm (green), 30 m
 ETM+3 0.63-0.69 µm (red), 30 m
 ETM+4 0.75-0.90 µm (near IR), 30 m
 ETM+5 1.55-1.75 µm (mid IR), 30 m
 ETM+7 2.09-2.35 µm (mid IR), 30 m
 ETM+6 10.4-12.5 µm (thermal IR), 60 m
 ETM+8 0.520-0.900 µm (pan), 15 m

Landsat 8: Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS):
 Band 1 0.43-0.45 µm (coastal blue), OLI, 30 m
 Band 2 0.45-0.51 µm (blue), OLI, 30 m
 Band 3 0.53-0.59 µm (green), OLI, 30 m
 Band 4 0.64-0.67 µm (red), OLI, 30 m
 Band 5 0.85-0.88 µm (NIR), OLI, 30 m
 Band 6 1.57-1.65 µm (SWIR-1), OLI, 30 m
 Band 7 2.11-2.29 µm (SWIR-2), OLI, 30 m
 Band 8 0.50-0.68 µm (panchromatic), OLI, 15 m
 Band 9 1.36-1.38 µm (cirrus), OLI, 30 m
 Band 10 10.60-11.19 µm (TIR-1), TIRS, 100 m
 Band 11 11.50-12.51 µm (TIR-2), TIRS, 100 m
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

SPOT Satellites

Platform | Launched | Altitude | Equator Crossing | Adjacent Orbits | Repeat Coverage | Sensors
SPOT 1 | 1986 | 822 km | 10:30 a.m. | 1-5 days | 26 days | HRV
SPOT 2 | 1990 | | | | |
SPOT 3 | 1993 | | | | |
SPOT 4 | 1998 | 822 km | 10:30 a.m. | 1-5 days | 26 days | HRVIR, Vegetation
SPOT 5 | 2002 | 822 km | 10:30 a.m. | 1-5 days | 26 days | HRG, Vegetation

SPOT-6 (2012) and SPOT-7 (2014):
 PAN: 1.5 m, 11-bit, 60 km swath, 0.45-0.90 µm
 MSS: 6 m, 11-bit, 60 km swath
  Band-1 0.455-0.525 µm (blue)
  Band-2 0.53-0.59 µm (green)
  Band-3 0.625-0.695 µm (red)
  Band-4 0.76-0.890 µm (near IR)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

SPOT Sensors

High Resolution Visible (HRV), swath width 60/117 km:
 "P" mode: 10 m, PAN 0.51-0.73 µm
 "XS" mode: 20 m
  HRV1 0.50-0.59 µm (green)
  HRV2 0.61-0.68 µm (red)
  HRV3 0.79-0.89 µm (near IR)

High Resolution Visible Infrared (HRV-IR), swath width 60/117 km:
 "M" mode: 10 m, MONO 0.61-0.68 µm
 "Xi" mode: 20 m
  HRV-IR1 0.50-0.59 µm (green)
  HRV-IR2 0.61-0.68 µm (red)
  HRV-IR3 0.79-0.89 µm (near IR)
  HRV-IR4 1.58-1.75 µm (mid IR)

SPOT Sensors

High Resolution Geometric (HRG), swath width 60/117 km:
 "P" mode: 5.0 m; "Supermode": 2.5 m; PAN 0.48-0.71 µm
 "Xi" mode: 10-20 m
  HRG1 0.50-0.59 µm (green)
  HRG2 0.61-0.68 µm (red)
  HRG3 0.79-0.89 µm (near IR)
  HRG4 1.58-1.75 µm (mid IR)

Vegetation Monitoring Instrument (VMI): 1,000 m, swath width 2,250 km
 VMI0 0.45-0.52 µm (blue)
 VMI1 0.61-0.68 µm (red)
 VMI2 0.78-0.89 µm (near IR)
 VMI3 1.58-1.75 µm (mid IR)
IRS Satellite Series

Platform | Launched | Altitude | Equator Crossing | Repeat Coverage | Sensors
IRS-1A, 1B | 1A: 1988, 1B: 1991 | 904 km | 10:30 a.m. | 22 days | LISS-I, LISS-II
IRS-1C, 1D | 1995, 1997 | 817 km | 10:30 a.m. | 24 days (LISS-III), 5 days (PAN), 5 days (WiFS) | LISS-III, PAN, WiFS
IRS-P4 (Oceansat) | 1999 | 720 km | 12 noon | 2 days | OCM, MSMR
IRS-P6 (Resourcesat-1) | 2003 | 817 km | 10:30 a.m. | 5-24 days | LISS-IV, LISS-III, AWiFS
IRS-P5 (Cartosat-1) | 2005 | ~618 km | 10:30 a.m. | 126 days | PAN
IRS-P8 (Cartosat-2A) | 2008 | 635 km | 10:30 a.m. | 4 days | PAN
RISAT-2 | 2009 | 550 km | - | - | SAR-X
Oceansat-2 | 2009 | 720 km | 12 noon ± 10 minutes | 2 days | OCM, SCAT
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

IRS Satellite Series

CartoSat-2B (launched 12/07/10): PAN, 80 cm, 0.5-0.85 µm, swath 9.6 km, revisit 4-5 days
Resourcesat-2 (launched 20/04/11), spectral bands same as Resourcesat-1, revisit 24 days:
 LISS-4: 5.8 m, 10-bit, swath 70 km
 LISS-3: 23.5 m, 10-bit, swath 141 km
 AWiFS: 56 m, 12-bit, swath 740 km
Megha-Tropiques (launched 12/10/11): MADRAS, SAPHIR, ScaRaB, ROSA
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

IRS Sensors

Linear Imaging Self-Scanning System I (LISS-I): 72 m, 7-bit, swath 148 km
 LISS-I 1 0.45-0.52 µm (blue)
 LISS-I 2 0.52-0.59 µm (green)
 LISS-I 3 0.62-0.68 µm (red)
 LISS-I 4 0.77-0.86 µm (near IR)

Linear Imaging Self-Scanning System II (LISS-II): 36 m, 7-bit, swath 74 km
 LISS-II 1 0.45-0.52 µm (blue)
 LISS-II 2 0.52-0.59 µm (green)
 LISS-II 3 0.62-0.68 µm (red)
 LISS-II 4 0.77-0.86 µm (near IR)

Linear Imaging Self-Scanning System III (LISS-III): 23 m, 7-bit, swath 142 km (band 5: 50 m, swath 148 km)
 LISS-III 2 0.52-0.59 µm (green)
 LISS-III 3 0.62-0.68 µm (red)
 LISS-III 4 0.77-0.86 µm (near IR)
 LISS-III 5 1.55-1.70 µm (mid IR)

PAN: 6 m, 7-bit, swath 70 km, PAN I 0.5-0.75 µm
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

IRS Sensors (continued)

Linear Imaging Self-Scanning System IV (LISS-IV): 5.8 m, 7-bit, swath 24-70 km
 LISS IV-2 0.52-0.59 µm (green)
 LISS IV-3 0.62-0.68 µm (red)
 LISS IV-4 0.77-0.86 µm (near IR)

Wide Field Sensor (WiFS): 188 m, 7-bit, swath 774 km
 WiFS-1 0.62-0.68 µm (red)
 WiFS-2 0.77-0.86 µm (near IR)

Advanced Wide Field Sensor (AWiFS): 56 m, 10-bit, swath 370-740 km
 AWiFS-1 0.52-0.59 µm (green)
 AWiFS-2 0.62-0.68 µm (red)
 AWiFS-3 0.77-0.86 µm (near IR)
 AWiFS-4 1.55-1.70 µm (mid IR)

PAN Fore/Aft (Cartosat-1): 2.5 m, 10-bit, swath 30/27 km, PAN-1 0.50-0.75 µm
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N
High Spatial Resolution Satellites

IKONOS:
 PAN: 1 m, 11-bit, swath 13 km, 0.45-0.90 µm
 Multispectral: 4 m, 11-bit, swath 13 km
  IKONOS-1 0.45-0.52 µm (blue)
  IKONOS-2 0.52-0.60 µm (green)
  IKONOS-3 0.63-0.69 µm (red)
  IKONOS-4 0.76-0.90 µm (near IR)

QuickBird:
 PAN: 0.61 m (nadir), 0.72 m (off-nadir), 11-bit, swath 16.5 km, 0.45-0.90 µm
 Multispectral: 2.44 m (nadir), 2.88 m (off-nadir), 11-bit, swath 16.5 km
  Band-1 0.45-0.52 µm (blue)
  Band-2 0.52-0.60 µm (green)
  Band-3 0.63-0.69 µm (red)
  Band-4 0.76-0.90 µm (near IR)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

High Spatial Resolution Satellites

Worldview-1 (2007): PAN 0.50 m GSD at nadir (0.55 m at 20° off-nadir), 480-830 nm; swath 17.6 km; altitude 496 km; revisit 1.7 days at 1 m GSD or less, 5.9 days at 20° off-nadir or less (0.51 m GSD)

GeoEye-1 (2008): PAN 0.41 m (old) / 0.46 m (new), 450-800 nm; multispectral 1.56 m (old) / 1.84 m (new): Blue 450-510 nm, Green 510-580 nm, Red 655-690 nm, NIR 780-920 nm; swath 15 km; altitude 681 km (old) / 770 km (new); revisit 3 days

Worldview-2 (2009): PAN 0.46 m, 450-800 nm; multispectral 1.84 m: Coastal 400-450 nm, Blue 450-510 nm, Green 510-580 nm, Yellow 585-625 nm, Red 630-690 nm, Red Edge 705-745 nm, NIR1 770-895 nm, NIR2 860-1040 nm; swath 16.4 km; altitude 770 km; revisit 1.1 days at 1 m GSD or less, 3.7 days at 20° off-nadir or less (0.52 m GSD)

Pleiades-1A (2011) and Pleiades-1B (2012): PAN 0.50 m, 480-830 nm; multispectral 2.0 m: Blue 430-550 nm, Green 490-610 nm, Red 600-720 nm, NIR 750-950 nm; swath 20 km; altitude 694 km; daily revisit

Skysat-1 (2013) and Skysat-2 (2014): PAN 0.9 m, 400-900 nm; multispectral 2.0 m: Blue 450-515 nm, Green 515-595 nm, Red 605-695 nm, NIR 740-900 nm; swath 8 km; altitude 450 km
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

KOMPSAT (Korea Multi-Purpose Satellite) / Arirang

KOMPSAT-1 (1999): EOC (Electro-Optical Camera), 6.6 m, 400-900 nm; swath 17 km; altitude 685 km

KOMPSAT-2 (2006): PAN 1 m, 500-900 nm; MSC (Multispectral Camera) 4 m: 450-520 nm (blue), 520-600 nm (green), 630-690 nm (red), 760-900 nm (NIR); swath 15 km; altitude 685 km

KOMPSAT-3 (2012): AEISS (Advanced Earth Imaging Sensor System): PAN 0.7 m, 450-900 nm; multispectral 2.8 m: 450-520 nm (blue), 520-600 nm (green), 630-690 nm (red), 760-900 nm (NIR); swath 15 km; altitude 685 km
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

High Spatial Resolution Satellites

Worldview-3 (13 Aug 2014), swath 13.1 km, altitude 617 km:
 PAN: 0.31 m, 450-800 nm
 Multispectral: 1.34 m; Coastal 400-450 nm, Blue 450-510 nm, Green 510-580 nm, Yellow 585-625 nm, Red 630-690 nm, Red Edge 705-745 nm, NIR1 770-895 nm, NIR2 860-1040 nm
 SWIR bands: 3.7 m; SWIR-1 1195-1225 nm, SWIR-2 1550-1590 nm, SWIR-3 1640-1680 nm, SWIR-4 1710-1750 nm, SWIR-5 2145-2185 nm, SWIR-6 2185-2225 nm, SWIR-7 2235-2285 nm, SWIR-8 2295-2365 nm
 CAVIS bands: 30 m; Desert Clouds 405-420 nm, Aerosol-1 459-509 nm, Green 525-585 nm, Aerosol-2 635-685 nm, Water-1 845-885 nm, Water-2 897-927 nm, Water-3 930-965 nm, NDVI-SWIR 1220-1252 nm, Cirrus 1365-1405 nm, Snow 1620-1680 nm, Aerosol-3 2105-2245 nm, Aerosol-4 2105-2245 nm
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Meteorological Satellites
 Designed specifically for weather prediction and monitoring.

 Offer the advantage of global coverage at very high temporal resolution.

 Examples:
 The NOAA series (operated by the U.S. and named after the National Oceanic and Atmospheric Administration) fly near-polar, sun-synchronous orbits.
 The GOES and INSAT series satellites are in geostationary orbits.
 India's INSAT series satellites serve both telecommunication and meteorological purposes.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 http://earthexplorer.usgs.gov/
 http://www.nrsc.gov.in/
 http://www.spaceimaging.com
 http://www.digitalglobe.com
 http://edcimswww.cr.usgs.gov/pub/imswelcome/
 http://www.spotimage.fr/home
 http://bhuvan-noeda.nrsc.gov.in/download/download/download.php
 http://glcf.umiacs.umd.edu/data/
 http://www.usgs.gov/pubprod/
 https://cross.restec.or.jp/cross-ex/topControl.action?language=en-US
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Contact :

Email- vinaykumar@iirs.gov.in
Tel-01352524112
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

SPECTRAL SIGNATURES AND IMAGE INTERPRETATION

HINA PANDE
Photogrammetry & Remote Sensing Department
hina@iirs.gov.in
8/13/2015
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

[Diagram: the remote sensing process, elements A-G, as in the first lecture.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

The Electromagnetic Spectrum


• The electromagnetic spectrum
ranges from the shorter
wavelengths (including gamma and
x-rays) to the longer wavelengths
(including microwaves and
broadcast radio waves).
• There are several regions of the
electromagnetic spectrum which
are useful for remote sensing.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

IMAGE INTERPRETATION
 Analysis of remote sensing imagery involves the
identification of various targets in an image.

 Targets may be defined in terms of the way they


reflect or emit radiation.

 This radiation is measured and recorded by a sensor,


and ultimately is depicted as an image product such
as an air photo or a satellite image.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

• Act of examining images to identify objects and judge


their significance.

• Information extraction process from the images.

• An interpreter is a specialist trained in study of


photography or imagery, in addition to his own
discipline.

• Involves a considerable amount of subjective judgment.


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

• Image is a pictorial representation of an object or a scene.

• Image can be analog or digital.

• A digital image is made up of square or rectangular areas


called pixels.

• Each pixel has an associated pixel value, which depends on the amount of reflected energy from the ground.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

What makes interpretation of imagery more difficult than the


everyday visual interpretation of our surroundings?
• We lose our sense of depth when viewing a two-
dimensional image, unless we can view it
stereoscopically so as to simulate the third dimension of
height.
• Viewing objects from directly above also provides a very
different perspective than what we are familiar with.
• Combining an unfamiliar perspective with a very different
scale and lack of recognizable detail can make even the
most familiar object unrecognizable in an image.
• Finally, we are used to seeing only the visible
wavelengths, and the imaging of wavelengths outside of
this window is more difficult for us to comprehend.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

• Spectral resolution = part of the EM spectrum measured.

• Radiometric resolution = smallest differences in energy


that can be measured.

• Spatial resolution = smallest unit area measured.

• Revisit time (temporal resolution) = time between two


successive image acquisitions over the same area.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Advantages of Using Images over


ground observation
 Synoptic view
 Time freezing ability
 Permanent record
 Spectral resolution
 Spatial resolution
 Cost and time effective
 Stereoscopic view
 Brings out relationship between objects
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Signature
 Identity is whatever makes an entity recognizable.
 A signature is that which gives an object or piece of
information its identity.

[Photos: George Bush, Elizabeth Taylor, Shahrukh Khan, as examples of recognizable identities.]


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Signature

 The characteristic feature which forms the key enabling an object to be identified.
 Spectral, spatial, temporal and polarization variations which facilitate discrimination of features on remotely sensed data.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

What is a spectral reflectance curve ?


A spectral reflectance curve is a graph of the spectral
reflectance of an object as a function of wavelength and
is very useful for choosing the wavelength regions for
remotely sensed data acquisition for a certain
application.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Significance of spectral signature in


remote sensing

 Spectral responses measured by RS sensors over


various features.
 Spectral reflectance & spectral emittance curves.
 Variability of spectral signature: useful for evaluation of
condition, not for spectral identification of earth features.
 Temporal and spatial effects on spectral response
patterns.
 Change detection depends on temporal effects.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Signature for Vegetation

 A general characteristic of vegetation is its green


colour caused by the pigment chlorophyll.
 Chlorophyll reflects green energy more than red and
blue energy, which gives plants green colour.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Fig: Spot image

Fig: IKONOS image


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 The major differences in leaf reflectance between species are dependent upon leaf thickness.
 Leaf thickness affects both pigment content and physiological structure.

[Figure: reflectance (%) vs wavelength (0.5-2.5 µm) for citrus, tomato, sorghum and cotton; thick leaf vs thin leaf.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 Leaf reflectance is reduced as a result of absorption by three major water absorption bands that occur near wavelengths of 1.4 µm, 1.9 µm and 2.7 µm, and two minor water absorption bands that occur near 0.96 µm and 1.1 µm.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Needle-leaf (coniferous) canopies reflect significantly less near-infrared radiation than broad-leaf vegetation.

[Photos: coniferous forest vs deciduous forest.]


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Immature leaves contain less chlorophyll and fewer air voids than older leaves, so they reflect more visible light and less infrared radiation.

[Photos: mature plant vs immature plant.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Reflectance is also affected by the health of vegetation.

[Figure: reflectance (%) vs wavelength (0.4-1.1 µm) for healthy plants vs infected plants.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Signature for Soil

The five characteristics of a soil that determine its


reflectance properties are, in order of importance:
• Moisture content
• Organic content
• Structure
• Iron oxide content
• Texture
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Soil Moisture:
 A wet soil generally appears darker.
 Increasing soil moisture content lowers reflectance but does not change the shape of the curve.

[Figure: percent reflectance vs wavelength (nm) for dry soil vs wet soil, showing the water and hydroxyl absorption bands.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Organic content:
 A soil with 5% or more organic matter usually appears black in colour.
 Less decomposed organic materials have higher reflectance in the near-IR.
 Highly decomposed organic materials show very low reflectance throughout the reflective region of the solar spectrum.

[Figure: representative reflectance spectra (wavelength in µm) for organic soils with (a) minimally (fibric), (b) partially (hemic) and (c) fully (sapric) decomposed organic fibres.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

SOIL - IRON CONTENT

 The presence of iron, especially as iron oxide, affects the spectral reflectance.
 Reflectance in the green region decreases with increased iron content, but increases in the red region.
 Iron-dominated soils have strong absorption in the mid-IR (> 1.3 µm).

[Figure: representative reflectance spectra (wavelength in µm) of surface samples of 5 mineral soils: (a) high organic content, moderately fine texture; (b) low organic, low iron content; (c) low organic, medium iron content; (d) high organic content, moderately coarse texture; (e) high iron content, fine texture.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Soil structure
 A clay soil tends to have a strong structure, which leads to a rough surface on ploughing; clay soils also tend to have a high moisture content and as a result have a fairly low, diffuse reflectance.
 Sandy soils tend to have a low moisture content and as a result have fairly high and often specular reflectance properties.

[Photos: clayey soil vs sandy soil.]
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Signature for Water


 Reflection of Light -
Wavelengths
 Water Depths – Shallow ,
Deep
 Suspended material
 Chlorophyll Content
 Surface Roughness
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Spectral Reflectance of Water


 The majority of the radiant flux incident upon water is not reflected but is either absorbed or transmitted.
 In the visible wavelengths of EMR, little light is absorbed, a small amount (usually below 5%) is reflected, and the rest is transmitted.
 Water absorbs NIR and MIR strongly, leaving little radiation to be either reflected or transmitted. This results in a sharp contrast between water and land boundaries.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Factors governing spectral reflectance of snow
 GRAIN SIZE (HENCE AGE)
 Reflectance falls at all wavelengths as grain size increases
 SNOW PACK THICKNESS
 Reflectance of snow decreases as it ages
 LIQUID WATER CONTENT
 Even slightly melting snow reduces reflectance
 CONTAMINANTS PRESENT
 Contaminants (soot, particles, etc.) reduce snow reflectance in the visible region
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 The lines in the figure represent


average reflectance curves
compiled by measuring large
sample features.

 Observe how distinctive the curves


are for each feature.

 The configuration of these curves is


an indicator of the type and
condition of the features to which
they apply.

 Although the reflectance of


individual features will vary
considerably above and below the
average, these curves demonstrate
some fundamental points
concerning spectral reflectance
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Band (.45 to .515m) Band (.525 to .605 m) Band (.63 to .690 m)

Band (.75 to .90 m) Band (1.55 to 1.75 m) Band (2.09 to 2.35m)
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

True Color
Composite (3,2,1)

False Color
Composite (4,3,2)
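A sketch of how such composites are built from individual band arrays with NumPy; the `bands` mapping in the commented usage is a hypothetical placeholder for however the band rasters were loaded:

```python
import numpy as np

def composite(red_band, green_band, blue_band):
    """Stack three single-band 2-D arrays into an RGB image, scaled to 0-1
    by the global maximum of the stack."""
    rgb = np.dstack([red_band, green_band, blue_band]).astype(np.float32)
    return rgb / rgb.max()

# Hypothetical usage, with bands[i] holding the 2-D array for sensor band i:
# true colour puts bands 3, 2, 1 on R, G, B; false colour puts 4 (NIR), 3, 2.
# true_color = composite(bands[3], bands[2], bands[1])
# false_color = composite(bands[4], bands[3], bands[2])
```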
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Landsat ETM (IR R G)


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Band | Wavelength (µm) | Principal applications
1 | 0.45-0.52 (blue) | Penetration of clear water: bathymetry; mapping of coastal waters; chlorophyll absorption; distinction between deciduous and coniferous vegetation.
2 | 0.52-0.60 (green) | Records the green reflectance peak of vegetation; assesses plant vigor; reflectance from turbid water.
3 | 0.63-0.69 (red) | Operates in the chlorophyll absorption region; best for detecting roads and bare soil.
4 | 0.76-0.90 (near-infrared) | Used to estimate biomass. Although it separates water bodies from vegetation and discriminates soil moisture, it is not as effective as band 3 for road identification.
5 | 1.55-1.75 (mid-infrared) | Considered the best single band overall. It discriminates roads, bare soil and water, provides good contrast between different types of vegetation, has excellent atmospheric and haze penetration, and discriminates snow from clouds.
6 | 2.08-2.35 (mid-infrared) | Useful for discriminating mineral and rock types and for interpreting vegetation cover and moisture.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Methods of Image Interpretation


 Visual

1. Visual image interpretation on a hardcopy


image/photograph

2. Visual image interpretation on a digital image

 Digital image processing


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Types of interpretation

 Qualitative

 Quantitative
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Basic Principle of Image Interpretation


 Image is a pictorial representation of pattern of
landscape.
 Pattern indicates type of objects and their physical,
biological, and cultural relationships
 Similar objects under similar conditions reflect similarly.
 A systematic examination of photos and supporting
material.
 Interpretation is made of physical nature of the object.
 Information extracted is proportional to knowledge, skill
and experience of analyst; the methods and equipment
used.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Factors governing interpretability

 Training, Experience
 Nature of object or phenomenon
 Quality of photographs
 Equipment and method of interpretation
 Interpretation keys, guides, manuals and other aids
 Prior knowledge of the area.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Methodology depends on………


 Kind of information to be interpreted
 Accuracy of the results to be obtained
 The reference level of the person executing the
interpretation
 Kind and type of imagery or photographs available
 Instruments available
 Scale and other requirements of the final map
 External knowledge available and any other
sensory surveys that have been or will be made
in the near future in the same area.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

ACTIVITIES OF IMAGE
INTERPRETATION
 Detection
 Recognition
 Analysis
 Deduction
 Classification
 Idealization
 Convergence of evidence
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Data Selection Criteria


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

ELEMENTS OF IMAGE
INTERPRETATION
 Recognizing targets is the key to interpretation and
information extraction.
 Observing the differences between targets and their
backgrounds involves comparing different targets
based on any, or all, of the visual elements of tone,
shape, size, pattern, texture, shadow, and association.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Tone
 Tone refers to the
relative brightness or
colour of objects in an
image.
 Generally, tone is the
fundamental element
for distinguishing
between different
targets or features.
 Variations in tone also allow the elements of shape, texture, and pattern of objects to be distinguished.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Shape
 Shape refers to the
general form, structure,
or outline of individual
objects.
 Shape can be a very
distinctive clue for
interpretation.
 Straight edge shapes
typically represent urban or
agricultural (field) targets,
while natural features, such
as forest edges, are
generally more irregular in
shape, except where man
has created a road or clear
cuts.
 Farm or crop land irrigated by rotating sprinkler systems
would appear as circular shapes
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Size
 Size of objects in an image is a
function of scale.
 It is important to assess the size
of a target relative to other
objects in a scene, as well as the
absolute size, to aid in the
interpretation of that target.
 A quick approximation of target
size can direct interpretation to
an appropriate result more
quickly.
For example, if an interpreter had to distinguish
zones of land use, and had identified an area with a
number of buildings in it, large buildings such as
factories or warehouses would suggest commercial
property, whereas small buildings would indicate
residential use.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Pattern
 Pattern refers to the spatial
arrangement of visibly
discernible objects.

 Typically an orderly repetition of


similar tones and textures will
produce a distinctive and
ultimately recognizable pattern.

Orchards with evenly spaced trees, and urban streets with


regularly spaced houses are good examples of pattern.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Texture
 Texture refers to the
arrangement and frequency of
tonal variation in particular
areas of an image.

 Texture is one of the most


important elements for
distinguishing features in radar
imagery.

Rough textures would consist of a mottled tone where the


grey levels change abruptly in a small area, whereas
smooth textures would have very little tonal variation.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Shadow
 Shadow may provide an idea of
the profile and relative height of a
target or targets which could make
identification easier.

 However, shadows can also


reduce or eliminate interpretation in
their area of influence, since targets
within shadows are much less (or
not at all) discernible from their
surroundings.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Association
 Association takes into account
the relationship between other
recognizable objects or features
in proximity to the target of
interest.
 The identification of features that
one would expect to associate
with other features may provide
information to facilitate
identification.
Commercial properties may be associated with proximity to
major transportation routes, whereas residential areas
would be associated with schools, playgrounds, and sports
fields.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Identify the following features: race track, river, roads, bridges, residential area, dam.


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Methods of analysis and reference levels

 Monocular and stereo analysis


 Multiple images
 Multi band images
 Multi date images
 Multi stage images
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Image interpretation for Multispectral


imagery

 Resolution
 Stereoscopic ability
 Individual Band Interpretation
 Temporal data
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Sensors in photographic Image


Interpretation
 Black and white panchromatic
 Black and white infrared
 Colour
 Colour infrared/ false colour
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

B/W Aerial Photo

In the upper right corner there is a hydrofoil ship. The other big ship is a tanker delivering fresh water to the island.

Panchromatic films: sensitive from 0.3 µm to 0.9 µm.
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Colour and false colour (or colour infrared) Images:


For a normal colour photograph, the layers are sensitive to
blue, green, and red light - the same as our eyes.
Accordingly, these photos appear to us the same way that
our eyes see the environment. The colors resemble those
which would appear to us as "normal" (i.e. trees appear
green, etc.).
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Mid-Infrared Image of Stromboli Island.


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Landsat image
Boston
I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

Fig : Landsat image


I N D I A N I N S T I T U T E O F R E M O T E S E N S I N G, D E H R A D U N

 The circular
features indicate
sprinkler irrigation
systems.

 Red colour indicates crops and dark tones indicate fallow land.

Fig: Saudi Arabia, Sensor: IRS-1C LISS III



This multispectral image shows the Marmagoa and Tiswadi areas of Goa state. The image also shows the sedimentation in the Zuari and Mandovi rivers, and the red patches represent densely vegetated areas. The Dabolim airport near the town of Vasco da Gama is also visible in the lower middle part of the image.

Fig: Goa, India
Sensor: IRS-1D LISS III

This image shows the east coast of India and the Sunderbans. Waters in the shallow areas near the coast are seen in light blue colour. The mangroves are seen in bright red colour in the wetland areas. The river Hooghly dispersing sediments into the sea can be seen clearly.

Sunderbans, West Bengal, India
Sensor: IRS-1C WiFS

The Gulf of Oman is seen here. The mountains and rocky terrain of the area are seen through the WiFS sensor.

Part of Oman
Sensor: IRS-1D WiFS

This image shows part of Dhaka city. Features like the stadium and the city airport are clearly seen.

Dhaka, Bangladesh
Sensor: IRS-1C LISS III + PAN

This image shows part of Rome. The runways of the 'h'-shaped airport can also be seen.

Rome, Italy
Sensor: IRS-1C LISS III + PAN

Roads, rivers,
water bodies,
topography and
urban areas can
all be
distinguished.

SPOT multispectral image

This one-meter resolution satellite image of the Pentagon was collected on Sept. 7, 2001 by Space Imaging's IKONOS satellite, only four days before the terrorist attack.

This satellite image of the Pentagon was collected at 11:46 a.m. EDT on
Sept. 12, 2001 by Space Imaging's
IKONOS satellite. The image shows
extensive damage to the western side
and interior rings of the multi-ringed
building. Also visible are the
emergency and rescue vehicles
parked around the helipad.

Tricastin Nuclear Facility, France - June 2002


Pan-sharpened multispectral image; resolution 70 cm.

Manual vs Digital

Manual interpretation:
 Manual interpretation and analysis dates back to the early beginnings of remote sensing, for air photo interpretation.
 Manual interpretation requires little, if any, specialized equipment.
 Manual interpretation is often limited to analyzing only a single channel of data or a single image at a time.
 Manual interpretation is a subjective process, meaning that the results will vary with different interpreters.

Digital analysis:
 Digital processing and analysis is more recent, with the advent of digital recording of remote sensing data and the development of computers. Digital analysis requires specialized, and often expensive, equipment.
 The computer environment is more amenable to handling complex images of several or many channels or from several dates.
 Digital analysis is based on the manipulation of digital numbers in a computer and is thus more objective, generally resulting in more consistent results.

However, determining the validity and accuracy of the results from digital processing can be difficult.

 It is important to reiterate that visual and digital analyses of remote sensing imagery are not mutually exclusive.
 In most cases, a mix of both methods is usually employed when analyzing imagery.
 The ultimate decision on the utility and relevance of the information extracted at the end of the analysis process must still be made by humans.

The material for this presentation has been compiled from various sources: books, tutorials, lecture notes and several resources on the WWW, with contributions from Ms. Shefali Agrawal, Ms. Minakshi Kumar and Ms. Poonam S. Tiwari.

Contact Details of the Faculty:

Email-
Tel-
BASICS OF REMOTE SENSING

Definition
Remote sensing means acquiring information about a phenomenon, object or surface while at a distance from it. The name is attributed to the recent technology in which satellites and spacecraft are used for collecting information about the earth's surface. This was an outcome of developments in various technological fields from 1960 onward.
Principle of Remote Sensing
Detection and discrimination of objects or surface features means detecting and recording the radiant energy reflected or emitted by objects or surface material. Different objects return different amounts and kinds of energy in different bands of the electromagnetic spectrum, depending on the energy incident upon them. This unique property depends on the properties of the material (structural, chemical and physical), surface roughness, angle of incidence, intensity, and wavelength of the radiant energy.
Remote sensing is basically a multi-disciplinary science which includes a combination of various disciplines such as optics, spectroscopy, photography, computers, electronics and telecommunication, satellite launching etc. All these technologies are integrated to act as one complete system, known as a remote sensing system. There are a number of stages in a remote sensing system, working as links in a complete chain, and each of them is important for successful operation.
Stages in Remote Sensing
1. Emission of electromagnetic radiation, or EMR (sun/self- emission)
2. Transmission of energy from the source to the surface of the earth, as well
as absorption and scattering
3. Interaction of EMR with the earth's surface: reflection and emission
4. Transmission of energy from the surface to the remote sensor
5. Sensor data output
6. Data transmission, processing and analysis

Figure 1: Remote Sensing process
What We See
At temperatures above absolute zero, all objects radiate electromagnetic energy by virtue of their atomic and molecular oscillations. The total amount of emitted radiation increases with the body's absolute temperature and peaks at progressively shorter wavelengths. The sun, a major source of energy, radiation and illumination, has a sharp power peak around 0.5 µm, which allows capturing reflected light with conventional (and some not-so-conventional) cameras and films.
The basic strategy for sensing electromagnetic radiation is clear. Everything in
nature has its own unique distribution of reflected, emitted and absorbed
radiation. These spectral characteristics, if ingeniously exploited, can be used to
distinguish one thing from another or to obtain information about shape, size and
other physical and chemical properties. In so far as we know the spectral
characteristics, we can pick an appropriate detector to make the desired
measurement, remembering that for a given collector's diameter we get our
greatest spatial resolution where wavelengths are shortest and energies greatest,
and that these energies decrease at longer wavelengths and distances.
Modern Remote Sensing Technology versus Conventional Aerial Photography
The use of different and extended portions of the electromagnetic spectrum,
development in sensor technology, different platforms for remote sensing
(spacecraft, in addition to aircraft), emphasis on the use of spectral information as
compared to spatial information, advancement in image processing and
enhancement techniques, and automated image analysis in addition to manual interpretation are the points of comparison between conventional aerial photography and modern remote sensing systems.
During the early half of the twentieth century, aerial photos were used in military surveys and topographical mapping. The main advantage of aerial photos has been their high spatial resolution with fine details, and therefore they are still used for mapping at large scale, such as in route surveys, town planning, construction project surveying, cadastral mapping etc. Modern remote sensing systems provide satellite images suitable for medium-scale mapping used in natural resources surveys and monitoring, such as forestry, geology, watershed management etc. However, future-generation satellites are going to provide much higher-resolution images for more versatile applications.
Remote Sensing in India
The Indian remote sensing programme started in the late seventies with the Indian-built Bhaskara-I and Bhaskara-II satellites launched from the Soviet Union in June 1978 and November 1981 respectively. Indigenous launch capability has also been developed to launch a 1000 kg remote sensing satellite into polar orbit. The Department of Space (DOS) / Indian Space Research Organisation (ISRO) has been actively engaged in the development of state-of-the-art remote sensing capabilities along with other related activities. ISRO Satellite Centre (ISAC), Bangalore is responsible for the design, development and management of remote sensing satellites. Space Applications Centre (SAC), Ahmedabad is engaged in the development of sensors (cameras and scanners) and data processing software, analysis of remote sensing data, and related research. National Remote Sensing Agency (NRSA), Hyderabad is responsible for the acquisition, processing and dissemination of remote sensing data, analysis of data for different applications, and training of users. The National Natural Resources Management System (NNRMS) is responsible for the execution of application projects through the establishment of a number of Regional Remote Sensing Service Centres (RRSSCs) throughout the country. Many user agencies, government departments, state governments and academic institutes have also established remote sensing infrastructure for various applications.

Electromagnetic Radiation (EMR) and the Electromagnetic Spectrum


EMR is a dynamic form of energy that propagates as wave motion at a velocity of c = 3 × 10¹⁰ cm/sec. The parameters that characterize a wave motion are wavelength (λ), frequency (ν) and velocity (c). The relationship between them is

c = λν

Figure 2: Electromagnetic wave. It has two components, electric field E and magnetic field M, both perpendicular to the direction of propagation.

Electromagnetic energy radiates in accordance with basic wave theory, which describes EM energy as traveling in a harmonic, sinusoidal fashion at the velocity of light. Although many characteristics of EM energy are easily described by wave theory, another theory, known as particle theory, offers insight into how electromagnetic energy interacts with matter. It suggests that EMR is composed of many discrete units called photons or quanta. The energy of a quantum is

Q = hc/λ = hν

where Q is the energy of a quantum, h is Planck's constant and ν is the frequency.
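As a quick numerical sketch of these two relations (not part of the original notes; the wavelength value is illustrative), the snippet below computes the frequency and photon energy for a mid-visible wavelength:

```python
# Sketch: frequency and photon energy from wavelength, using c = lambda * nu
# and Q = h * c / lambda = h * nu. C and H are standard physical constants.

C = 3.0e8        # speed of light in vacuum, m/s
H = 6.626e-34    # Planck's constant, J s

def frequency(wavelength_m):
    """Frequency (Hz) of EMR with the given wavelength (m): nu = c / lambda."""
    return C / wavelength_m

def photon_energy(wavelength_m):
    """Energy (J) of one quantum: Q = h * nu."""
    return H * frequency(wavelength_m)

green = 0.55e-6  # 0.55 micrometres, mid-visible (illustrative)
print(frequency(green))      # ~5.5e14 Hz
print(photon_energy(green))  # ~3.6e-19 J
```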

Figure 3: Electromagnetic spectrum

Table 1: Principal Divisions of the Electromagnetic Spectrum

Gamma rays
X-rays
Ultraviolet (UV) region, 0.30 µm - 0.38 µm (1 µm = 10⁻⁶ m): This region is beyond the violet portion of the visible wavelengths, hence its name. Some earth-surface materials, primarily rocks and minerals, fluoresce, or emit visible light, under UV radiation. However, UV radiation is largely scattered by the earth's atmosphere and hence is not used in the field of remote sensing.
Visible spectrum, 0.4 µm - 0.7 µm: This is the light which our eyes can detect, and the only portion of the spectrum that can be associated with the concept of colour (violet 0.4 - 0.446 µm; blue 0.446 - 0.5 µm; green 0.5 - 0.578 µm; yellow 0.578 - 0.592 µm; orange 0.592 - 0.62 µm; red 0.62 - 0.7 µm). Blue, green and red are the three primary colours of the visible spectrum. They are defined as such because no single primary colour can be created from the other two, but all other colours can be formed by combining the three in various proportions. The colour of an object is defined by the colour of the light it reflects.
Infrared (IR) spectrum, 0.7 µm - 100 µm: Wavelengths longer than the red portion of the visible spectrum are designated as the infrared spectrum. The British astronomer William Herschel discovered this region in 1800. The infrared region can be divided into two categories based on radiation properties: reflected IR (0.7 µm - 3.0 µm), which is used for remote sensing, and thermal IR (3 µm - 100 µm), which is the radiation emitted from the earth's surface in the form of heat.
Microwave region, 1 mm - 30 cm: These are the longest wavelengths used in remote sensing. The shortest wavelengths in this range have properties similar to the thermal infrared region. The main advantage of this region is its ability to penetrate through clouds.
Radio waves, > 30 cm: The longest-wavelength portion of the spectrum, mostly used for commercial broadcasting and meteorology.

Electromagnetic Radiation Quantities: Nomenclature, Definitions and Units

Radiant Energy (Q): Radiant energy is the energy carried by EMR. Radiant energy causes the detector element of the sensor to respond to EMR in some appropriate manner. The unit of radiant energy Q is the joule (J).

Radiant Flux (Φ): The time rate of flow of radiant energy. The unit of radiant flux is the joule per second, or watt (W).
Irradiance (E): Radiant flux intercepted by a plane surface, per unit area of the surface. The direction of the flux is not specified; it arrives at the surface from all directions within a hemisphere over the surface. The unit of irradiance E is W m⁻² (watt per square metre).

Figure 4: Concept of irradiance


Exitance (emittance) (M): Radiant flux leaving a surface, per unit area of the surface. The flux may leave the surface in any or all directions within a hemisphere over the surface.
Solid Angle (Ω): The cone angle subtended by a portion of a spherical surface at the center of the sphere. It is equal to the area of the spherical surface divided by the square of the radius of the sphere (Ω = A/r²). Its unit is the steradian (sr).

Figure 5: Concept of solid angle in angular measurement

Radiant Intensity (I): The radiant intensity of a point source in a given direction is the radiant flux per unit solid angle leaving the source in that direction. Its unit is W/sr (watt per steradian).

Figure 6: Concept of radiant intensity

Radiance (L): The radiant flux per unit solid angle leaving an extended source in a given direction, per unit projected area of the source in that direction. The concept of radiance is intended to correspond to the concept of brightness. The projected area in a direction which makes an angle θ with the normal to a surface of area A is A cos θ.
The relationship between radiant intensity and radiance is

I = L · A · cos θ

The unit of radiance is W m⁻² sr⁻¹.

Figure 7: Concept of radiance

Lambertian Surface: A plane source or surface for which the radiance L does not change as a function of viewing angle is called Lambertian (perfectly diffuse). For a Lambertian surface:
a) The irradiance in the retinal image does not change with viewing angle for a Lambertian panel.
b) The exitance and radiance are related by M = π · L (where π = 3.1415927).
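A short numerical sketch of these relationships (the radiance, area and angle below are illustrative values, not from the notes):

```python
import math

# Sketch: radiant intensity from radiance for a small flat source,
# I = L * A * cos(theta), and exitance of a Lambertian surface, M = pi * L.

L = 100.0                 # radiance, W m^-2 sr^-1 (illustrative)
A = 0.5                   # source area, m^2 (illustrative)
theta = math.radians(30)  # viewing angle from the surface normal

I = L * A * math.cos(theta)  # radiant intensity, W/sr
M = math.pi * L              # Lambertian exitance, W/m^2

print(f"I = {I:.1f} W/sr")   # ~43.3 W/sr
print(f"M = {M:.1f} W/m^2")  # ~314.2 W/m^2
```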
Thermal Emission of Radiation
All objects at all temperatures emit electromagnetic radiation at all wavelengths.
The thermal emission of radiation is due to the conversion of heat energy, which is
the kinetic energy of the random motion of the particles of the matter, into
electromagnetic energy.
Thermal emission of radiation depends upon two parameters:
 Absolute temperature (T)
 Emissivity (ε)
The total thermal radiation from a body increases with the fourth power of T. The absolute temperature, in kelvin (K), is given by T = 273 + temperature in degrees centigrade.
The emissivity factor (ε) is a characteristic of the material, a measure of its capability to emit radiation due to thermal energy conversion. The emissivity of a substance is related to its absorptive ability: good absorbers are good radiators, whereas poor absorbers are poor radiators. For an ideal thermal emitter, called a blackbody, ε = 1.

Radiation Laws

Planck's Law

The spectral exitance for a blackbody is given by Planck's law:

Mλ = C₁ λ⁻⁵ [exp(C₂ / λT) − 1]⁻¹

where C₁ and C₂ are constants, λ is the wavelength and T is the absolute temperature.

The spectral exitance of a blackbody is not the same at all wavelengths. It is low for very short and very long wavelengths and reaches its maximum value at some wavelength in between, depending on the temperature of the blackbody. A blackbody at a higher temperature emits more radiation than a blackbody at a lower temperature at all wavelengths.

The Stefan-Boltzmann Law:

The total radiation emitted by a blackbody over the entire electromagnetic spectrum is obtained by integrating the area under Planck's distribution curve. It is given by the Stefan-Boltzmann law:

M (blackbody) = σ T⁴

where σ = 5.67 × 10⁻⁸ W m⁻² K⁻⁴.

Wien's Displacement Law

The wavelength at which the spectral exitance has its maximum value is given by λm (in microns) = 2898 / T.

For a blackbody at T = 300 K, the peak in spectral emission occurs at λm ≈ 10 µm. Hence, objects at ambient temperature emit in the infrared region of the spectrum. As the temperature of the object is raised, the peak emission shifts toward shorter wavelengths. Radiation in the infrared portion of the spectrum is often referred to as heat rays, as it produces the sensation of heat. At T = 6000 K, about the temperature of the sun, the peak emission occurs in the visible region of the spectrum (λm ≈ 0.5 µm). The production of light by heating, and the shift of intensity toward shorter wavelengths with increasing temperature, is illustrated by passing a current through a blackened platinum wire in a dark room: as the magnitude of the current is increased, the wire becomes hot, glows red initially, and then the colour of the emitted radiation changes to orange, then to yellow, and ultimately to white.
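The three radiation laws can be checked numerically. The sketch below (my own illustration, with constants in SI units) evaluates Planck's spectral exitance, the Stefan-Boltzmann total exitance and the Wien peak for a body at ambient (300 K) and solar (6000 K) temperature:

```python
import math

# Sketch: the blackbody radiation laws in SI units.
H = 6.626e-34    # Planck's constant, J s
C = 3.0e8        # speed of light, m/s
K = 1.381e-23    # Boltzmann's constant, J/K
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_exitance(wavelength, temperature):
    """Planck's law: spectral exitance in W m^-2 per metre of wavelength."""
    c1 = 2 * math.pi * H * C**2   # first radiation constant
    c2 = H * C / K                # second radiation constant
    return c1 * wavelength**-5 / (math.exp(c2 / (wavelength * temperature)) - 1)

def total_exitance(temperature):
    """Stefan-Boltzmann law: M = sigma * T^4, in W/m^2."""
    return SIGMA * temperature**4

def wien_peak_um(temperature):
    """Wien's displacement law: peak wavelength in micrometres, 2898 / T."""
    return 2898.0 / temperature

for T in (300, 6000):  # ambient temperature vs. roughly the sun
    peak_m = wien_peak_um(T) * 1e-6
    print(T, total_exitance(T), wien_peak_um(T), planck_exitance(peak_m, T))
# 300 K:  M ~ 459 W/m^2,   peak ~ 9.7 um (thermal infrared)
# 6000 K: M ~ 7.3e7 W/m^2, peak ~ 0.48 um (visible)
```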

Spectral Emissivity and Kirchhoff's Law

The three radiation laws mentioned above hold good for blackbody radiation only. All other substances are characterized by their spectral emissivity ε(λ), defined as the ratio of the spectral exitance of the material to the spectral exitance of a blackbody at the same temperature:

ε(λ) = M(material, T) / M(blackbody, T)

Knowing the spectral emissivity of a body, its spectral exitance, total exitance and wavelength of peak emission can be determined.

Kirchhoff's law states that the spectral emissivity of a material is equal to its spectral absorptivity, i.e. ε(λ) = α(λ). This implies that if a body is capable of emitting certain radiation, it will absorb that radiation when exposed to it.

The emissivity characteristics of materials can be summarized as follows.

 Blackbody: ε = 1 at all wavelengths.

 Grey body: 0 < ε < 1 (does not depend upon wavelength).

 Imperfect blackbody (perfect reflector): ε = 0.

 All other bodies: ε = ε(λ) is a function of wavelength.

The relationship between reflectance, absorptivity and transmittance was given by ρ(λ) + α(λ) + τ(λ) = 1. With Kirchhoff's law it can be written as ρ(λ) + ε(λ) + τ(λ) = 1. For opaque substances, τ(λ) = 0; hence, emissivity and reflectance are related by ε(λ) = 1 − ρ(λ).
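A minimal sketch of this bookkeeping for an opaque surface (the reflectance values are made up for illustration):

```python
# Sketch: Kirchhoff's law bookkeeping for an opaque surface. With
# transmittance tau(lambda) = 0, emissivity epsilon(lambda) = 1 - rho(lambda).

reflectance = {10.0: 0.03, 11.0: 0.05, 12.0: 0.08}  # rho(lambda), illustrative

for wavelength_um, rho in reflectance.items():
    tau = 0.0                      # opaque substance: no transmission
    emissivity = 1.0 - rho - tau   # from rho + epsilon + tau = 1
    print(f"{wavelength_um} um: emissivity = {emissivity:.2f}")
```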

Characteristics of Solar Radiant Energy
The sun is the strongest and most important source of radiant energy for remote sensing. The solar spectrum extends approximately from 0.3 µm to 3.0 µm. The maximum irradiance occurs at 0.47 µm. The visible band from 0.40 µm to 0.76 µm receives about 46 per cent of the total solar energy.
The rate at which the total solar radiant energy flows across a unit area normal to the direction of propagation, located at the mean distance of the earth from the sun, is called the solar constant. The value of this constant is 1353 W/m², with an error of ±21 W/m². The solar constant can be calculated from the blackbody temperature of the sun (T = 5800 K) and the mean angular radius of the sun as seen from the earth (4.6 × 10⁻³ radians).
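The calculation mentioned in the last sentence can be sketched in a few lines: the exitance at the solar surface, σT⁴, is diluted by the factor (angular radius)², the square of the ratio of the sun's radius to the sun-earth distance (a standard geometric argument, not spelled out in the notes):

```python
# Sketch: estimating the solar constant from the sun's blackbody temperature
# and its mean angular radius as seen from the earth (values from the text).

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
T_SUN = 5800.0       # blackbody temperature of the sun, K
ANG_RADIUS = 4.6e-3  # mean angular radius of the sun from the earth, radians

# Surface exitance sigma*T^4, reduced by (R_sun / distance)^2 = (angular radius)^2.
solar_constant = SIGMA * T_SUN**4 * ANG_RADIUS**2
print(f"{solar_constant:.0f} W/m^2")  # ~1358 W/m^2, close to the quoted 1353
```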
Interaction of EMR with the Earth's Surface
Radiation from the sun, when incident upon the earth's surface, is either reflected
by the surface, transmitted into the surface or absorbed and emitted by the
surface. The EMR, on interaction, experiences a number of changes in magnitude,
direction, wavelength, polarization and phase. These changes are detected by the
remote sensor and enable the interpreter to obtain useful information about the
object of interest. The remotely sensed data contain both spatial information
(size, shape and orientation) and spectral information (tone, color and spectral
signature).
From the viewpoint of interaction mechanisms, the optical (visible and infrared) wavelengths from 0.3 µm to 16 µm can be divided into three regions. The spectral band from 0.3 µm to 3 µm is known as the reflective region; in this band, the radiation sensed by the sensor is solar radiation reflected by the earth's surface. The band corresponding to the atmospheric window between 8 µm and 14 µm is known as the thermal infrared band; the energy available in this band for remote sensing is due to thermal emission from the earth's surface. Both reflection and self-emission are important in the intermediate band from 3 µm to 5.5 µm.
In the microwave region of the spectrum, the sensor is radar, which is an active
sensor, as it provides its own source of EMR. The EMR produced by the radar is
transmitted to the earth's surface and the EMR reflected (back scattered) from
the surface is recorded and analyzed. The microwave region can also be monitored with passive sensors, called microwave radiometers, which record the radiation emitted by the terrain in the microwave region.
Reflection
Of all the interactions in the reflective region, surface reflections are the most
useful and revealing in remote sensing applications. Reflection occurs when a ray
of light is redirected as it strikes a non-transparent surface. The reflection
intensity depends on the surface refractive index, absorption coefficient and the
angles of incidence and reflection.

Figure 8: Different types of scattering surfaces: (a) perfect specular reflector, (b) near-perfect specular reflector, (c) Lambertian, (d) quasi-Lambertian, (e) complex.

Transmission
Transmission of radiation occurs when radiation passes through a substance
without significant attenuation. For a given thickness, or depth of a substance, the
ability of a medium to transmit energy is measured as transmittance (τ):

τ = transmitted radiation / incident radiation

Figure 9: Interaction of energy with the earth's surface.

Spectral Signature
Spectral reflectance, ρ(λ), is the ratio of reflected energy to incident energy as a
function of wavelength. Various materials of the earth's surface have different
spectral reflectance characteristics. Spectral reflectance is responsible for the
color or tone in a photographic image of an object. Trees appear green because
they reflect more of the green wavelength. The values of the spectral reflectance
of objects averaged over different, well-defined wavelength intervals comprise
the spectral signature of the objects or features by which they can be
distinguished. To obtain the necessary ground truth for the interpretation of
multispectral imagery, the spectral characteristics of various natural objects have
been extensively measured and recorded.
Spectral reflectance is wavelength-dependent: it has different values at different wavelengths for a given terrain feature. The reflectance characteristics of the earth's surface features are expressed by the spectral reflectance, which is given by

ρ(λ) = ( ER(λ) / EI(λ) ) × 100

where
ρ(λ) = spectral reflectance (reflectivity) at a particular wavelength,
ER(λ) = energy of wavelength λ reflected from the object,
EI(λ) = energy of wavelength λ incident upon the object.
The plot between ρ(λ) and λ is called a spectral reflectance curve. This varies with variations in the chemical composition and physical condition of the feature, which results in a range of values. The spectral response patterns are averaged to get a generalized form, called the generalized spectral response pattern for the object concerned. Spectral signature is the term used for a unique spectral response pattern that is characteristic of a terrain feature. Figure 10 shows typical reflectance curves for three basic types of earth surface features: healthy vegetation, dry bare soil (gray-brown and loamy) and clear lake water.

Figure 10: Typical spectral reflectance curves for vegetation, soil and water

Reflectance Characteristics of Earth's Cover Types

The spectral characteristics of the three main earth surface features are discussed below.

Vegetation:

The spectral characteristics of vegetation vary with wavelength. A compound in leaves called chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. The internal structure of healthy leaves acts as a diffuse reflector of near-infrared wavelengths. Measuring and monitoring near-infrared reflectance is one way that scientists determine how healthy particular vegetation may be.

Water:

The majority of the radiation incident upon water is not reflected but is either absorbed or transmitted. Longer visible wavelengths and near-infrared radiation are absorbed more by water than the shorter visible wavelengths. Thus water looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near-infrared wavelengths. The factors that affect the variability in reflectance of a water body are the depth of the water, the materials within the water and the surface roughness of the water.

Soil:

The majority of radiation incident on a soil surface is either reflected or absorbed, and little is transmitted. The characteristics of soil that determine its reflectance properties are its moisture content, organic matter content, texture, structure and iron-oxide content. The soil curve shows fewer peak-and-valley variations. The presence of moisture in soil decreases its reflectance.

By measuring the energy that is reflected by targets on the earth's surface over a variety of different wavelengths, we can build up a spectral signature for that object. By comparing the response patterns of different features we may be able to distinguish between them, which we might not be able to do if we compared them at only one wavelength. For example, water and vegetation reflect somewhat similarly in the visible wavelengths but not in the infrared.
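As a sketch of this idea (all energy values below are invented for illustration), the snippet applies the reflectance formula ρ(λ) = (ER(λ)/EI(λ)) × 100 to red and near-infrared measurements and uses the red/NIR contrast to separate vegetation-like from water-like responses:

```python
# Sketch: spectral reflectance rho(lambda) = (E_R / E_I) * 100, and using the
# red vs. near-infrared contrast to tell vegetation from water.
# All energy values are illustrative, not real measurements.

def reflectance_percent(reflected, incident):
    return 100.0 * reflected / incident

samples = {
    # name: (E_R red, E_I red, E_R NIR, E_I NIR)
    "vegetation": (8.0, 100.0, 45.0, 100.0),  # low red, high NIR
    "water":      (6.0, 100.0, 1.0, 100.0),   # low red, very low NIR
}

for name, (er_red, ei_red, er_nir, ei_nir) in samples.items():
    red = reflectance_percent(er_red, ei_red)
    nir = reflectance_percent(er_nir, ei_nir)
    # Vegetation reflects strongly in the NIR; water absorbs it almost fully.
    label = "vegetation-like" if nir > red else "water-like"
    print(f"{name}: red = {red:.0f}%, NIR = {nir:.0f}% -> {label}")
```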

Interactions with the Atmosphere
The sun is the source of radiation, and electromagnetic radiation (EMR) from the
sun that is reflected by the earth and detected by the satellite or aircraft-borne
sensor must pass through the atmosphere twice, once on its journey from the sun
to the earth and once after being reflected by the surface of the earth back to the
sensor. Interactions of the direct solar radiation and reflected radiation from the
target with the atmospheric constituents interfere with the process of remote
sensing and are called "atmospheric effects".
The interaction of EMR with the atmosphere is important to remote sensing for
two main reasons. First, information carried by EMR reflected/emitted by the
earth's surface is modified while traversing through the atmosphere. Second, the
interaction of EMR with the atmosphere can be used to obtain useful information
about the atmosphere itself.
The atmospheric constituents scatter and absorb the radiation, modulating the radiation reflected from the target by attenuating it, changing its spatial distribution, and introducing into the field of view radiation from sunlight scattered in the atmosphere and some of the energy reflected from nearby ground areas. Both scattering and absorption vary in their effect from one part of the spectrum to the other.
The solar energy is subjected to modification by several physical processes as it passes through the atmosphere, viz.
1. Scattering
2. Absorption
3. Refraction
Atmospheric Scattering.
Scattering is the redirection of EMR by particles suspended in the atmosphere or
by large molecules of atmospheric gases. Scattering not only reduces the image
contrast but also changes the spectral signature of ground objects as seen by the
sensor. The amount of scattering depends upon the size of the particles, their
abundance, the wavelength of radiation, depth of the atmosphere through which
the energy is traveling, and the concentration of the particles. The concentration of particulate matter varies both in time and over the seasons; thus the effects of scattering will be uneven spatially and will vary from time to time.

Figure 11: Scattering (source: CCRS web tutorial)

Theoretically scattering can be divided into three categories depending upon the
wavelength of radiation being scattered and the size of the particles causing the
scattering. The three different types of scattering from particles of different sizes
are summarized below:

Scattering process: wavelength dependence; approximate particle size; kinds of particles

Selective:
i. Rayleigh: λ⁻⁴; < 0.1 µm; air molecules
ii. Mie: λ⁰ to λ⁻⁴; 0.1 to 10 µm; smoke, haze

iii. Non-selective: λ⁰; > 10 µm; dust, fog, clouds

Rayleigh Scattering

Rayleigh scattering predominates where electromagnetic radiation interacts with particles that are smaller than the wavelength of the incoming light. The effect of Rayleigh scattering is inversely proportional to the fourth power of the wavelength: shorter wavelengths are scattered more than longer wavelengths. In the absence of these particles and scattering, the sky would appear black. In the context of remote sensing, Rayleigh scattering is the most important type of scattering; it causes a distortion of the spectral characteristics of the reflected light when compared to measurements taken on the ground.
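For example (a quick sketch of the λ⁻⁴ dependence), blue light at 0.4 µm is scattered roughly an order of magnitude more strongly than red light at 0.7 µm, which is why the clear sky appears blue:

```python
# Sketch: relative strength of Rayleigh scattering, proportional to lambda^-4.

blue, red = 0.4, 0.7  # wavelengths in micrometres

ratio = (red / blue) ** 4  # scattering(blue) / scattering(red)
print(f"Blue light is scattered ~{ratio:.1f}x more than red light")  # ~9.4x
```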

Mie Scattering

Mie scattering occurs when the wavelength of the incoming radiation is similar in size to the atmospheric particles. It is caused by aerosols: a mixture of gases, water vapour and dust. It is generally restricted to the lower atmosphere, where the larger particles are abundant, and dominates under overcast cloud conditions. It influences the entire spectral region from the ultraviolet to the near-infrared.

Non-selective Scattering

This type of scattering occurs when the particle size is much larger than the wavelength of the incoming radiation. Particles responsible for this effect are water droplets and larger dust particles. The scattering is independent of wavelength: all wavelengths are scattered equally. The most common example of non-selective scattering is the white appearance of clouds; since clouds consist of water droplets and all wavelengths are scattered in equal amounts, clouds appear white.

The occurrence of this scattering mechanism gives a clue to the existence of large particulate matter in the atmosphere above the scene of interest, which is itself useful information. The effects of the Rayleigh component of scattering can be eliminated by using minus-blue filters. However, the effects of heavy haze, when all wavelengths are scattered uniformly, cannot be eliminated by haze filters. The effects of haze are less pronounced in the thermal infrared region. Microwave radiation is completely immune to haze and can even penetrate clouds.

Atmospheric Absorption
The gas molecules present in the atmosphere strongly absorb EMR passing through the atmosphere in certain spectral bands. Three gases are responsible for most of the absorption of solar radiation: ozone, carbon dioxide and water vapour. Ozone absorbs the high-energy, short-wavelength portion of the ultraviolet spectrum (λ < 0.24 µm), thereby preventing the transmission of this radiation to the lower atmosphere. Carbon dioxide is important in remote sensing as it effectively absorbs radiation in the mid- and far-infrared regions of the spectrum; it absorbs strongly in the region from about 13 to 17.5 µm. The two most important regions of water vapour absorption are in the bands 5.5 - 7.0 µm and above 27 µm. Absorption reduces the amount of light that reaches our eye, making the scene look relatively duller.

Figure 12: Absorption (source: CCRS web tutorial)

Atmospheric Windows
The general atmospheric transmittance across the whole spectrum of wavelengths is shown in Figure 13. The atmosphere selectively transmits energy of certain wavelengths; the spectral bands for which the atmosphere is relatively transparent are known as atmospheric windows. Atmospheric windows are present in the visible part (0.4 µm - 0.76 µm) and the infrared regions of the EM spectrum. In the visible part, transmission is mainly affected by ozone absorption and by molecular scattering. The atmosphere is transparent again beyond about λ = 1 mm, the region used for microwave remote sensing.

Figure 13: Atmospheric windows (atmospheric transmittance, from 0 to 1, plotted against wavelength from 0.3 µm to 10 m, with the UV, VIS, NIR, MIR, TIR and microwave regions marked and shaded areas showing the blocking effect of the atmosphere)

Refraction
The phenomenon of refraction, that is, the bending of light at the contact between two media, also occurs in the atmosphere as light passes through atmospheric layers of varied clarity, humidity and temperature. These variations influence the density of atmospheric layers, which in turn causes the bending of light rays as they pass from one layer to another. The most common phenomena are the mirage-like apparitions sometimes visible in the distance on hot summer days.
Remote Sensing Systems
The common remote sensing systems are of two types: imaging (image-forming) and non-imaging (non-image-forming).
Image-forming systems are again of two types: framing type and scanning type. In the framing type, an entire frame of the image is acquired instantaneously as the basic image unit, e.g. in a frame camera used in photography. In the scanning type, the information is acquired sequentially from the surface in bits of picture elements, or pixels, point by point and line by line, which may be arranged after acquisition into a frame format.
Non-imaging sensors are used to record a spectral quantity or parameter as a function of time or distance (such as gamma radiation, magnetic field or temperature measurements). They are mostly used for ground observation and in the study of the atmosphere and meteorology. These sensors do not form an image and, as such, are not used in operational remote sensing, but they give detailed information on the spectral characteristics of the target.
A remote sensing system needs the radiant energy reflected or emitted by the object or target to reach the sensor/detector of the recording system. The response of the detector to the incident energy is recorded as data or an image, which is analyzed to derive information about the object.
Remote sensing can be either passive or active. ACTIVE systems have their own source of energy (such as RADAR), whereas PASSIVE systems depend upon an external source of illumination (such as the SUN) or on self-emission for remote sensing.
The most important component of a remote sensing system is the sensor/detector, which records the variation of radiant energy reflected or emitted by objects or surface material. Different types of sensors are available, sensitive to different parts of the electromagnetic spectrum. These sensors fall into two broad categories: image-forming and non-image-forming sensors.
The function of the recording system is to convert the energy detected by the sensor into a form which can be perceived. For example, in photography, measurement or detection is done by the camera lens and recording by the film. Since the photographic system uses the visible part of the spectrum, the recording can be easily perceived. But in the case of an optical-mechanical scanner, which can collect energy beyond the visible part of the spectrum, there is a need to convert the measured energy into a perceivable form. This is done by dividing the incoming energy with beam splitters and filters into different wavelength bands and then converting the energy in each wavelength band into an electrical signal. The electrical signal is processed to give radiometric data for each band, which are recorded on magnetic tapes.
Regions of electromagnetic spectrum, which are of primary concern in remote
sensing, are
1. Optical wavelengths (visible, near IR, middle IR): 0.3 µm - 16 µm.
Sensors which operate in this region are:
Aerial cameras: 0.38 µm to 0.9 µm
Thermal scanners: 3 µm to 5 µm and 8 µm to 16 µm
Multispectral scanners: 0.3 µm to 1.1 µm
Vidicon / RBV: 0.3 µm to 1.1 µm

2. Microwave wavelengths: 1 mm to 1 m.
Sensors which operate at these wavelengths/frequencies are mostly active systems like RADAR.

PLATFORMS AND SENSORS

Introduction
Remote sensing is defined as the science which deals with obtaining information about objects on the earth's surface by analysis of data received from a remote platform.
In the present context, information flows from an object to a receiver (sensor) in the
form of radiation transmitted through the atmosphere. The interaction between the
radiation and the object of interest conveys information required on the nature of the
object. In order for a sensor to collect and record energy reflected or emitted from a
target or surface, it must reside on a stable platform away from the target or surface
being observed.
Platforms
A platform is a stage on which a camera or sensor is mounted to acquire information about a target under investigation. Based on altitude above the earth's surface, platforms may be classified as
(1) Ground borne
(2) Air borne
(3) Space borne
Ground-based platforms
Ground-based remote sensing systems for earth resources studies are mainly used for collecting ground truth or for laboratory simulation studies.
Air-borne platforms
Aircraft are generally used to acquire aerial photographs for photo-interpretation and photogrammetric purposes. Scanners are tested for their utility and performance from these platforms before they are flown onboard satellite missions.
Space-borne platforms
Platforms in space are not affected by the earth's atmosphere. The closed path of a satellite around the earth is called its orbit. These platforms move freely in their orbits around the earth, and the entire earth, or any part of it, can be covered at specified intervals. The coverage mainly depends on the orbit of the satellite. It is through these space-borne platforms that we get the enormous amount of remote sensing data.

Types of Satellite orbits
Satellite orbits are designed according to the capability and objective of the sensors they
carry. Depending on their altitude, orientation and rotation relative to the earth,
satellites can be categorized as:
(1) Geostationary (2) Polar orbiting and Sun-synchronous
Geostationary satellites
An equatorial, west-to-east satellite orbiting the earth at an altitude of about 36,000 km, the altitude at which it makes one revolution in 24 hours, synchronous with the earth's rotation. These platforms cover the same place and give continuous near-hemispheric coverage over the same area day and night. These satellites are put into an equatorial plane, orbiting from west to east. Their coverage is limited to 70°N to 70°S latitudes, and one satellite can view one-third of the globe (Fig. 1). They are mainly used for communication and meteorological applications, e.g. GOES, METEOSAT, INTELSAT and INSAT satellites.

Fig. 1: Geostationary orbit (source: CCRS website)

Sun-synchronous satellites
An earth satellite orbit in which the orbital plane is near-polar and the altitude is such that the satellite passes over all places on earth having the same latitude, twice in each orbit, at the same local sun time (Fig. 2). This ensures similar illumination conditions when acquiring images over a particular area over a series of days.

Fig. 2: Sun-synchronous orbit (source: CCRS website)

As the satellite orbits the Earth from pole to pole, its east-west position would not
change if the Earth did not rotate. However, as seen from the Earth, it seems that the
satellite is shifting westward because the Earth is rotating (from west to east) beneath
it. This apparent movement allows the satellite swath to cover a new area with each
pass (Fig. 3). The satellite's orbit and the rotation of the Earth work together to allow
complete coverage of the Earth's surface, after it has completed one complete cycle of
orbits (Fig. 4). Through these satellites the entire globe is covered on regular basis and
gives repetitive coverage on periodic basis. All the remote sensing resource satellites
may be grouped in this category. Few of these satellites are LANDSAT series, SPOT
series, IRS series, NOAA, SEASAT, TIROS, HCMM, SKYLAB, SPACE SHUTTLE etc.

Fig. 3: Area coverage on each consecutive pass

Fig. 4: Complete coverage of the earth's surface by sun-synchronous satellites (source: CCRS website)

Satellite orbital characteristics


Altitude: It is the distance (in km) from the satellite to the mean surface level of the earth. The satellite altitude influences the spatial resolution to a large extent.
Inclination angle: The angle (in degrees) between the orbit and the equator. The
inclination angle of the orbit determines the field of view of the sensor and which
latitudes can be observed. If the inclination angle is 60° then the satellite flies over the
earth between the latitudes 60° South and 60° North, it cannot observe parts of the
earth above 60° latitude.
Period: It is the time (in minutes) required to complete one full orbit. A polar satellite orbiting at an altitude of 800 km has a period of about 100 minutes (see the sketch after these definitions).
Repeat Cycle: It is the time (in days) between two successive identical orbits.
Swath: As a satellite revolves around the earth, the sensor sees a certain portion of the earth's surface; this area is known as the swath. The swath of satellite images is very large, between tens and hundreds of kilometres wide.

Ascending pass and Descending pass: Near-polar satellites travel northward on one side of the earth (ascending pass) and southward on the second half of the orbit (descending pass). The ascending pass is on the shadowed side while the descending pass is on the sunlit side. Optical sensors image the surface on a descending pass, while active sensors, and sensors of emitted thermal and microwave radiation, can also image the surface on an ascending pass.
Perigee: It is the point in the orbit where an earth satellite is closest to the earth.
Apogee: It is the point in the orbit where an earth satellite is farthest from the earth.
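The periods quoted above can be checked with Kepler's third law for a circular orbit, T = 2π√(a³/GM), a standard orbital-mechanics relation not stated in the notes; the sketch below applies it to an 800 km polar orbit and to the geostationary altitude:

```python
import math

# Sketch: circular-orbit period from Kepler's third law, T = 2*pi*sqrt(a^3 / GM).
GM = 3.986e14      # Earth's gravitational parameter, m^3 s^-2
R_EARTH = 6.371e6  # mean Earth radius, m

def period_minutes(altitude_km):
    a = R_EARTH + altitude_km * 1e3   # radius of the circular orbit, m
    return 2 * math.pi * math.sqrt(a**3 / GM) / 60.0

print(period_minutes(800))    # ~101 min for a typical polar orbiter
print(period_minutes(35786))  # ~1436 min (~24 h) at geostationary altitude
```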
Remote Sensing Sensors
A sensor is a device that gathers energy (EMR or other), converts it into a signal and
presents it in a form suitable for obtaining information about the target under
investigation. These may be active or passive depending on the source of energy.
Sensors used for remote sensing can be broadly classified as those operating in Optical
Infrared (OIR) region and those operating in the microwave region. OIR and microwave
sensors can further be subdivided into passive and active.
Active sensors use their own source of energy. The earth's surface is illuminated with energy emitted by the sensor's own source; a part of it is reflected by the surface in the direction of the sensor and is received to gather the information. Passive sensors receive solar electromagnetic energy reflected from the surface, or energy emitted by the surface itself. These sensors do not have their own source of energy and cannot be used at night, except for thermal sensors. Again, sensors (active or passive) can be either imaging, like a camera or scanner, which acquire images of the area, or non-imaging, like non-scanning radiometers or atmospheric sounders.
Instantaneous field of view (IFOV)
It is defined as the solid angle through which a detector is sensitive to radiation (unit: mrad); that is, the angular subtense, at a given instant, of the limiting detector aperture at the second principal point of the system. IFOV can be expressed as both an angular and a linear quantity:

IFOV = D/F radians
GRE = (D/F) × H metres

where D = detector dimension, F = focal length, and H = flying height.
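A quick numerical sketch of these two formulas (the detector size, focal length and flying height below are illustrative values, not taken from any particular sensor):

```python
# Sketch: IFOV and ground resolution element (GRE) from detector geometry.
# IFOV = D / F (radians); GRE = (D / F) * H (metres). Values are illustrative.

D = 13e-6   # detector element size, m (13 micrometres)
F = 0.65    # effective focal length, m
H = 700e3   # flying height, m (a typical satellite altitude)

ifov = D / F     # instantaneous field of view, radians
gre = ifov * H   # ground resolution element, metres

print(f"IFOV = {ifov * 1e3:.3f} mrad")  # 0.020 mrad
print(f"GRE  = {gre:.1f} m")            # 14.0 m
```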

Fig. 5: IFOV (a sensor/radiometer with angular aperture β, in mrad, viewing a ground resolution element, or resolution cell, of dimension D)

Resolution
Resolution is defined as the ability of the system to render the information at the
smallest discretely separable quantity in terms of distance (spatial), wavelength band of
EMR (spectral), time (temporal) and/or radiation quantity (radiometric).
Spatial Resolution
Spatial resolution is the projection of a detector element or a slit onto the ground. In
other words, scanner's spatial resolution is the ground segment sensed at any instant. It
is also called ground resolution element (GRE).
Ground Resolution = H × IFOV
The spatial resolution at which data are acquired has two effects: the ability to identify various features, and the ability to quantify their extent. The former relates to classification accuracy and the latter to the ability to make accurate measurements. One important aspect of classification accuracy is the contribution of boundary pixels: as the resolution improves, pure centre pixels of a feature increase in number relative to boundary pixels, and thus the boundary error is reduced with improved resolution.

The accuracy of measurement of an area will depend upon the accuracy of locating the boundary. Since it is not possible to locate a boundary with an accuracy better than a fraction of a pixel, the larger the pixel size, the greater the error in the area estimate. Images in which only large features are visible are said to have coarse or low resolution; in fine-resolution images, small objects can be detected.

Fig. 6: The same scene at 10 m, 30 m and 80 m resolution

Spectral Resolution
Spectral emissivity curves characterize the reflectance and/or emittance of a feature or target over a variety of wavelengths. Different classes of features and details in an image can be distinguished by comparing their responses over distinct wavelength ranges. Broad classes, such as water and vegetation, can be separated using broad wavelength ranges (VIS, NIR), whereas specific classes, like rock types, would require comparison of fine wavelength ranges to separate them. Hence spectral resolution describes the ability of a sensor to define fine wavelength intervals, i.e. to sample the spatially segmented image in different spectral intervals, thereby allowing the spectral irradiance of the image to be determined.
The selection of spectral band location primarily depends on the feature characteristics
and atmospheric absorption.
Radiometric Resolution
This is a measure of the ability of the sensor to differentiate the smallest change in spectral reflectance/emittance between various targets. It is normally defined as the noise-equivalent reflectance change (NEΔρ) or the noise-equivalent temperature difference (NEΔT).
The radiometric resolution depends on the saturation radiance and the number of quantisation levels. Thus, a sensor whose saturation is set at 100% reflectance with 8-bit resolution will have a poorer radiometric sensitivity than a sensor whose saturation radiance is set at 20% reflectance with 7-bit digitization.
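To make that comparison concrete (a sketch; the smallest distinguishable step is taken here simply as saturation reflectance divided by the number of quantisation levels):

```python
# Sketch: smallest distinguishable reflectance step for the two sensors above,
# approximated as saturation reflectance / number of quantisation levels (2^bits).

def reflectance_step(saturation_percent, bits):
    levels = 2 ** bits
    return saturation_percent / levels

print(reflectance_step(100, 8))  # ~0.39% reflectance per level
print(reflectance_step(20, 7))   # ~0.16% reflectance per level (finer steps)
```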
Temporal Resolution
Temporal resolution is the ability to obtain spatial and spectral data at certain time intervals. It is also called the repetitivity of the satellite: the capability of the satellite to image the exact same area at the same viewing angle at different periods of time.
resolution of a sensor depends on a variety of factors, including the satellite/sensor
capabilities, the swath overlap and latitude. It is an important aspect in remote sensing
when
 persistent cloud cover offers limited clear views of the earth's surface
 short-lived phenomena need to be imaged (floods, oil slicks etc.)
 multi-temporal comparisons are required (agricultural applications)
 the changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat/maize)

Multispectral scanning principle

Cameras and their use for aerial photography are the simplest and oldest of the sensors used for remote sensing of the earth's surface. Cameras are framing systems (Figure 7a), which acquire a near-instantaneous "snapshot" of an area of the earth's surface. Camera systems are passive optical sensors that use a lens (or system of lenses, collectively referred to as the optics) to form an image at the focal plane, the "aerial image plane", at which the image is sharply defined.

Many electronic (as opposed to photographic) remote sensors acquire data using
scanning systems, which employ a sensor with a narrow field of view that sweeps over
the terrain to build up and produce a two-dimensional image of the surface. Scanning
systems can be used on both aircraft and satellite platforms and have essentially the
same operating principles. A scanning system used to collect data over a variety of
different wavelength ranges is called a multispectral scanner (MSS), and is the most
commonly used scanning system. There are two main modes or methods of scanning
employed to acquire multispectral image data - across-track scanning, and along-track
scanning.

Figure 7: Principle of imaging sensor systems: (a) framing system (camera for aerial photography, analogue recording), (b) whiskbroom scanner (digital recording, scanning mirror and "point" detector), (c) pushbroom scanner (digital recording, imaging line-array detector). (source: http://cgi.girs.wageningen-ur.nl/igi-new)

Across-Track Multispectral Scanning


A scanning sensor system builds up a two-dimensional image of the terrain for a swath beneath the aircraft. There are two different ways in which this can be done: across-track (whiskbroom) scanning or along-track (pushbroom) scanning.
Fig. 7b illustrates the operation of an across-track, or whiskbroom, scanner. Using a rotating or oscillating mirror, such systems scan the terrain along scan lines that are at right angles to the flight line. This allows the scanner to repeatedly measure the energy from one side of the platform to the other. Data are collected within an arc below the system, typically of some 90° to 120°. Successive scan lines are covered as the flight moves forward, yielding a series of contiguous, or just touching, narrow strips of observation composing a two-dimensional image (very similar to the individual lines used to produce a picture on a television screen). The incoming energy is separated into several spectral components that are sensed independently. The non-thermal wavelength component is directed from the grating through a prism (or diffraction grating) that splits the energy into a continuum of UV, visible and near-IR wavelengths. At the same time, the dichroic grating disperses the thermal component of the incoming signal into its constituent wavelengths. By placing an array of electro-optical detectors at the proper geometric positions behind the grating and the prism, the incoming beam is "pulled apart" into multiple narrow bands, or channels, each of which

is measured independently. Each detector is designed to have its peak spectral
sensitivity in a specific wavelength band.
The electrical signals generated by each of the detectors of the MSS are amplified by the
system electronics and recorded by a multi-channel tape recorder. Usually, on-board signal conversion is used to record the data digitally for subsequent computer processing on the ground. Subsets of the data can also be viewed in flight on a monitor to verify flight-line coverage and to provide a real-time interpretation capability for the scene being recorded.
Along-Track Multispectral Scanning
As with across-track systems, along track or push broom scanners record multispectral
image data along a swath beneath an aircraft. Also similar is the use of the forward
motion of the aircraft to build up a two-dimensional image by recording successive scan
lines that are oriented at right angles to the flight direction. However, there is a distinct
difference between along-track and across-track systems in the manner in which each
scan line is recorded. In an along-track system there is no scanning mirror. Instead, a
linear array of detectors is used to "scan" in the direction parallel to the flight line
(Figure 7c). Linear arrays normally consist of numerous charge-coupled devices (CCDs)
positioned end to end. As illustrated in Figure 7c each detector element is dedicated to
sensing the energy in a single ground resolution cell along any given scan line. The data
for each scan line are electronically compiled by sampling each element along the array
(eliminating the need for a scanning mirror).
The size of the detectors comprising a linear array determines the size of each ground
resolution cell. Hence, CCDs are designed to be very small and a single array may
contain over 10,000 individual detectors. Each spectral band, or channel, of sensing
requires its own linear array. Normally, the arrays are located in the focal plane of the
scanner such that all scan lines are viewed by all arrays simultaneously.
Linear array systems afford a number of advantages over mirror scanning systems. First,
linear arrays provide the opportunity for each detector to have a longer dwell time, or
residence time, to measure the energy from each ground resolution cell. This enables a
much stronger signal to be recorded and a greater range in the levels of signal that can
be sensed. This leads to better spatial and radiometric resolution. In addition, the
geometric integrity of linear array systems is greater because of the fixed relationship
among detector elements recording each scan line. The geometry along each scan line
is similar to that characterizing an aerial mapping camera. Because CCDs are solid-state
microelectronics devices, they are generally smaller in size and weight and require less
power for their operation. Having no moving parts, a linear array system has higher
reliability and longer life expectancy. (Due to such advantages, CCDs are used
extensively in satellite remote sensing systems.)
One disadvantage of pushbroom systems is the need to calibrate many more detectors. Another current limitation of commercially available CCDs is their relatively limited range of spectral sensitivity: charge-coupled detectors sensitive to wavelengths longer than the near-IR are not readily available. However, detectors capable of operating at longer wavelengths are under development.
Optical Sensors
Data products obtained from various scanner/detector/recorder combinations, in analogue or digital form, fall into this class. Scanner systems working beyond the visible and near-infrared range of the electromagnetic spectrum, in the thermal and microwave (RADAR) regions, are all non-photographic systems. Such data are collected by the sensor system on the satellite and transmitted to earth, where they are received and recorded at a ground station.
Thermal Scanners
Many multispectral (MSS) systems sense radiation in the thermal infrared as well as the visible and reflected-infrared portions of the spectrum. However, remote sensing of emitted thermal energy is different from the sensing of reflected energy. Thermal sensors use photodetectors sensitive to the direct contact of photons on their surface to detect emitted thermal radiation. The detectors are cooled to temperatures close to absolute zero in order to limit their own thermal emissions. Thermal sensors essentially measure the surface temperature and thermal properties of targets.

Thermal imagers are typically across-track scanners that detect emitted radiation in only
the thermal portion of the spectrum. Thermal sensors employ one or more internal
temperature references for comparison with the detected radiation, so they can be
related to absolute radiant temperature. The data are generally recorded on film and/or
magnetic tape and the temperature resolution of current sensors can reach 0.1 °C. For
analysis, an image of relative radiant temperatures is depicted in grey levels, with
warmer temperatures shown in light tones, and cooler temperatures in dark tones.
In a thermal image, the tone of an object is a function of its surface temperature and its
emissivity. Of these parameters, the surface temperature is the dominant factor in
producing tonal variations in the scene. All objects emit infrared radiation, and the
amount of emitted radiation is a function of surface temperature. Hot bodies appear in
lighter tones in a thermal image and cooler bodies appear darker. The emitted radiation
is collected by a thermal scanner, which works on the principle of the optical-mechanical
scanner, and cryogenically cooled detectors are employed to sense the radiation in the
8 to 14 µm wavelength region. Temperature variations of up to one degree Celsius can
be estimated from the thermal imagery.
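The dependence of tone on temperature and emissivity follows from the Stefan-Boltzmann
law, M = emissivity x sigma x T^4. The sketch below, with assumed surface temperatures
and emissivities (not figures from the text), shows why the warmer surface appears in a
lighter tone:

# Illustrative sketch (assumed values): emitted radiant exitance from the
# Stefan-Boltzmann law, showing why surface temperature dominates tonal
# variation in a thermal image.

SIGMA = 5.670e-8  # W m^-2 K^-4, Stefan-Boltzmann constant

def radiant_exitance(temp_k: float, emissivity: float = 1.0) -> float:
    """Total emitted power per unit area for a grey body."""
    return emissivity * SIGMA * temp_k ** 4

water = radiant_exitance(290.0, emissivity=0.98)   # cool, high emissivity
road = radiant_exitance(305.0, emissivity=0.95)    # warmer surface

print(f"water: {water:.1f} W/m^2, road: {road:.1f} W/m^2")
# The warmer road emits noticeably more energy and therefore appears in a
# lighter tone. A 0.1 K step changes M by roughly 4*dT/T, i.e. ~0.14 % at
# 290 K, which is the sensitivity a 0.1 deg C-resolution sensor must reach.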
Table 1 Thermal sensors

                                     HCMM             TM
Operational period                   1978-1980        1982 to present
Orbital altitude                     620 km           705 km
Image coverage                       700 by 700 km    185 by 170 km
Acquisition time, day                1:30 p.m.        10:30 a.m.
Acquisition time, night              2:30 a.m.        9:30 p.m.
Visible and reflected IR detectors
  Number of bands                    1                6
  Spectral range                     0.50 - 1.1 µm    0.4 - 2.35 µm
  Ground resolution cell             500 by 500 m     30 by 30 m
Thermal IR detector
  Spectral range                     10.5 - 12.5 µm   10.5 - 12.5 µm
  Ground resolution cell             600 by 600 m     120 by 120 m

Microwave Sensing (RADAR)

Microwave data can be obtained by both active and passive systems. Passive systems
monitor natural radiation at a particular frequency or range of frequencies. Data may be
presented numerically, as line-trace data or as imagery. Active systems (like SLAR and
SAR) transmit their own energy and monitor the returned signal.
The characteristics of radar imagery, both SAR and SLAR, and its resolution depend on
various parameters such as the frequency of the signal, look direction, slant range, the
dielectric constant of the objects, phase and antenna length. Spatial resolution in the
range and azimuth directions is governed in different manners, as the sketch below
illustrates.
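A minimal sketch of the standard side-looking radar resolution relations, with assumed
system parameters (the pulse length, antenna length, look angle and slant range below
are illustrative, not values from the text):

# Illustrative sketch (assumed parameters): how range and azimuth
# resolution are governed differently for side-looking radar.

import math

c = 3.0e8                         # m/s, speed of light
pulse_len = 0.1e-6                # s, assumed pulse duration
look_angle = math.radians(35.0)   # assumed incidence angle
wavelength = 0.235                # m (L band, ~1.275 GHz)
antenna_len = 10.0                # m, assumed physical antenna length
slant_range = 850e3               # m, assumed slant range

# Ground range resolution depends on pulse length and look geometry:
ground_range_res = (c * pulse_len) / (2 * math.sin(look_angle))

# Real-aperture (SLAR) azimuth resolution degrades with range:
azimuth_res_slar = slant_range * wavelength / antenna_len

# Synthetic-aperture (SAR) processing makes azimuth resolution roughly
# half the antenna length, independent of range:
azimuth_res_sar = antenna_len / 2

print(f"ground range resolution : {ground_range_res:.1f} m")
print(f"SLAR azimuth resolution : {azimuth_res_slar / 1000:.1f} km")
print(f"SAR azimuth resolution  : {azimuth_res_sar:.1f} m")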
RADAR (SAR) imagery has been obtained from the SEASAT and ERS satellites and the
space shuttle missions SIR-A, SIR-B and SIR-C using synthetic aperture radar, which has
all-weather capability. Such data products are useful for studies of cloud-covered
regions of the earth and in oceanography.
Table 2: Microwave Sensors

                   Seasat          SIR-C/X-SAR        ESA ERS   RADARSAT       ENVISAT    JERS-1
Sensor             SAR             SAR                SAR       SAR            ASAR       SAR
Frequency          1.275 GHz       5.3 / 1.275 GHz    5.3 GHz   5.3 GHz        5.33 GHz   1.275 GHz
Wavelength         L band 23 cm    X band 3 cm,       C band    C band         C band     L band
                                   C band 6 cm,                                           23 cm
                                   L band 23 cm
Swath width        100 km,         15 to 90 km,       100 km    45-510 km,     5-100 km,  75 km
                   centred 20°     depends on                   varies         varies
                   off nadir       antenna
                                   orientation
Ground resolution  25 x 25 m       10 to 200 m,       30 m      100 x 100 m    30 m       -
                                   varies                       to 9 x 9 m,
                                                                varies

Satellite Missions
Today more than ten E.O. satellites provide imagery that can be used in various
applications. The list also includes some failed as well as future missions. Agencies
responsible for the distribution and trading of data internationally are also listed.
Table-3 Operational Earth Observation Satellites (launch year and resolution)

EUROPE
  France (distribution: SPOT IMAGING): SPOT1-86 (10 m), SPOT2-90 (10 m),
    SPOT3-93/96, SPOT4-98 (10 m), SPOT5-02 (3 m + HRS 10 m)
  ESA (distribution: miscellaneous): ERS1-92/00 (radar), ERS2-95 (radar),
    ENVISAT-2001 (radar)
MIDDLE EAST
  Israel (distribution: Imagesat): EROS A/1-00 (2 m), EROS B/1-02 (1 m)
NORTH AMERICA
  USA (distribution: SI-EOSAT, Earthwatch, Orbimage, USGS): LANDSAT5-85 (30 m),
    LANDSAT6-93, EARLYBIRD-98, LANDSAT7-99 (15 m), IKONOS1-99 (1 m),
    IKONOS2-99 (1 m), QUICKBIRD-01 (0.6 m), ORBVIEW-01 (1 m), ORBVIEW-02 (1 m)
  Canada (distribution: RADARSAT): RADARSAT-95, RADARSAT-03 (1 m)
ASIA
  India (distribution: NRSA-EOSAT): IRS1C-95 (6 m), IRS1D-97 (6 m),
    IRS P6-2003 (5.8 m), CARTOSAT-1 & -2 (2.5 m / 80 cm)
  Japan (distribution: JAXA): ALOS-03 (2.5 m)

Landsat Series of Satellites


NASA, with the co-operation of the U.S. Department of the Interior, began a conceptual
study of the feasibility of a series of Earth Resources Technology Satellites (ERTS). ERTS-
1 was launched on July 23, 1972, and it operated until January 6, 1978. It represented
the first unmanned satellite specifically designed to acquire data about earth resources
on a systematic, repetitive, medium-resolution, multispectral basis. It was primarily
designed as an experimental system to test the feasibility of collecting earth resources
data from unmanned satellites. About 300 individual ERTS-1 experiments were
conducted in 43 US states and 36 nations. Just prior to the launch of ERTS-B on January
22, 1975, NASA officially renamed the ERTS programme the "LANDSAT" programme, and
all subsequent satellites in the series carried the Landsat designation. So far seven
Landsat satellites (Landsat-1 to -5, -7 and -8) have been launched successfully, while
Landsat-6 suffered a launch failure. Table 4 highlights the characteristics of the Landsat
series satellites.
There have been four different types of sensors included in various combinations on
these missions. These are Return Beam Vidicon camera (RBV) systems, Multispectral
Scanner (MSS) systems, Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM).
After more than two decades of success, the Landsat programme experienced its first
unsuccessful mission with the launch failure of Landsat-6 on October 5, 1993. The
sensor included on board was the Enhanced Thematic Mapper (ETM). To provide
continuity with Landsat-4 and -5, the ETM incorporated the same seven spectral bands
and the same spatial resolutions as the TM. The ETM's major improvement over the TM
was the addition of an eighth, panchromatic band operating in the 0.50 to 0.90 µm range
with a spatial resolution of 15 m. Landsat-7 carries the Enhanced Thematic Mapper Plus
(ETM+), a further improved version of this sensor.
Table-4 Characteristics of Landsat-1 to -8 Missions

Satellite   Launched           Decommissioned   RBV bands                MSS bands       TM bands
Landsat-1   July 23, 1972      Jan. 6, 1978     1, 2, 3 (simultaneous    4, 5, 6, 7      -
                                                images)
Landsat-2   Jan. 22, 1975      Feb. 25, 1982    1, 2, 3 (simultaneous    4, 5, 6, 7      -
                                                images)
Landsat-3   March 5, 1978      Mar. 31, 1983    A, B, C, D (one band,    4, 5, 6, 7, 8*  -
                                                side-by-side images)
Landsat-4   July 16, 1982      -                -                        -               1, 2, 3, 4, 5, 6, 7
Landsat-5   March 1, 1984      Same as Landsat-4
Landsat-6   Oct. 5, 1993       LAUNCH FAILURE
Landsat-7   April 15, 1999     -                -                        -               1, 2, 3, 4, 5, 6, 7, 8
Landsat-8   February 11, 2013  OLI bands 1-11

* Band 8 failed soon after the launch.
Table-5 Orbital characteristics of Landsat series satellites

Features                   Landsat 4 & 5                Landsat 7
Altitude                   705 km                       705 km
Orbital period             98.9 min                     98.9 min
Temporal resolution        16 days                      16 days
Equatorial crossing time   10.00 am (local sun time)    10.00 am (local time +/- 5 min)
Sensors                    TM                           ETM+
Swath                      183 x 170 km                 183 x 170 km
Resolution                 30 m multispectral           30 m multispectral
                           120 m thermal                60 m thermal
                                                        15 m panchromatic

Sensors
(i) Multispectral Scanner (MSS) used in Landsat series satellites
The multispectral scanner (an optical-mechanical scanner) onboard the Landsat series of
satellites of the U.S.A. (Landsat-1 to -5) gives line-scan imagery, using an oscillating
mirror to continuously scan the earth's surface perpendicular to the spacecraft velocity.
Six lines are scanned simultaneously in each of the four spectral bands for each mirror
sweep. Spacecraft motion provides the along-track progression of the scan lines.
Radiation is sensed simultaneously by an array of six detectors in each of four spectral
bands from 0.5 to 1.1 µm. The detectors' outputs are sampled, encoded and formatted
into a continuous digital data stream.
(ii) Return Beam Vidicon (RBV) used in Landsat series satellites
The Return Beam Vidicon onboard Landsat 1, 2 & 3 is a camera system which operates by
shuttering three independent cameras (two in the case of Landsat-3) simultaneously, each
sensing a different spectral band in the range of 0.48 to 0.83 µm. The ground scene
viewed (185 km x 185 km) is stored on the photosensitive surface of the camera tube and,
after shuttering, the image is scanned by an electron beam to produce a video signal
output. In order to produce overlapping images along the direction of spacecraft motion,
the cameras are re-shuttered every 25 seconds.
(iii) Thematic Mapper (TM) used in Landsat series satellites
Landsat 4 & 5 have onboard a new payload called the "Thematic Mapper", with 7 spectral
bands and a ground resolution of 30 metres. This is in addition to the MSS payload, which
is identical to those carried onboard Landsat 1 & 2 and replaces the RBV payload. TM is
also an optical-mechanical scanner, similar to MSS; however, being a second-generation
line scanning sensor, it ensures better performance in terms of (i) improved pointing
accuracy and stability, (ii) higher resolution, (iii) new and more numerous spectral bands,
(iv) 16-day repetitive coverage, (v) high scanning efficiency using bi-directional scanning
and (vi) increased quantization levels. For achieving the bi-directional scanning, a scan
line corrector (SLC) is introduced between the telescope and the focal plane. The SLC
ensures parallel lines of scanning in the forward and reverse directions.
Table-6 Sensor characteristics of Landsat series satellites

Sensor   Spectral resolution (µm)    Spatial          Scan width  Time      Orbital   Operation period
system                               resolution (m)   (km)        interval  altitude
MSS      Band 4: 0.5 - 0.6           79 x 79          185         18 days   918 km    Landsat 1: 23/07/1972 - 06/01/1978
         Band 5: 0.6 - 0.7           79 x 79                                          Landsat 2: 22/01/1975 - 25/02/1982
         Band 6: 0.7 - 0.8           79 x 79                                          Landsat 3: 05/03/1978 - 30/11/1982
         Band 7: 0.8 - 1.1           79 x 79
MSS      As Landsat 3                                                                 Landsat 4: 16/07/1982 - 02/1983
TM       Band 1: 0.45 - 0.52         30 x 30          185         16 days   710 km    Landsat 5: 01/03/1984 -
         Band 2: 0.52 - 0.60         30 x 30
         Band 3: 0.63 - 0.69         30 x 30
         Band 4: 0.76 - 0.90         30 x 30
         Band 5: 1.55 - 1.75         30 x 30
         Band 6: 10.40 - 12.50       120 x 120
         Band 7: 2.08 - 2.35         30 x 30
ETM+     As Landsat 4-5              30 x 30          185         16 days   705 km    Landsat 7: 15/04/1999 -
         Band 6: 10.40 - 12.50       60 x 60
         Panchromatic: 0.50 - 0.90   15 x 15

Landsat 8

Operational Land Imager (OLI)

Spectral Band Wavelength Resolution
Band 1 - Coastal / Aerosol 0.433 - 0.453 µm 30 m
Band 2 - Blue 0.450 - 0.515 µm 30 m
Band 3 - Green 0.525 - 0.600 µm 30 m
Band 4 - Red 0.630 - 0.680 µm 30 m
Band 5 - Near Infrared 0.845 - 0.885 µm 30 m
Band 6 - Short Wavelength Infrared 1.560 - 1.660 µm 30 m
Band 7 - Short Wavelength Infrared 2.100 - 2.300 µm 30 m
Band 8 - Panchromatic 0.500 - 0.680 µm 15 m
Band 9 - Cirrus 1.360 - 1.390 µm 30 m

Thermal InfraRed Sensor (TIRS)

Spectral Band Wavelength Resolution

Band 10 - Long Wavelength Infrared 10.30 - 11.30 µm 100 m

Band 11 - Long Wavelength Infrared 11.50 - 12.50 µm 100 m

Table 7 : OLI and TIRS sensor bands onboard Landsat 8

SPOT Series of Satellite


The French government, in a joint programme with Sweden and Belgium, undertook the
development of the Système Pour l'Observation de la Terre (SPOT) programme. Conceived
and designed by the French Centre National d'Etudes Spatiales (CNES), SPOT has developed
into a large-scale international programme, with ground receiving stations and data
distribution outlets located in more than 30 countries. It was also the first system to have
pointable optics. This enables side-to-side off-nadir viewing, and it affords full-scene
stereoscopic imaging from two different satellite tracks, permitting repeat coverage of
the same area. SPOT-1 was retired from full-time service on December 31, 1990. The
SPOT-2 satellite was launched on January 21, 1990, SPOT-3 on September 25, 1993, and
SPOT-4 on March 26, 1998. SPOT-1, -2 and -3 have identical orbits and sensor systems,
which are described in Table 8.
SPOT-4 includes an additional 20 m-resolution band in the mid-infrared portion of the
spectrum (between 1.58 and 1.75 µm). This band is intended to improve the vegetation
monitoring and mineral discriminating capabilities of the data. Furthermore, mixed 20 m
and 10 m data sets are co-registered on board instead of during ground processing. This
is accomplished by replacing the panchromatic band of SPOT-1, -2 and -3 (0.49 to 0.73
µm) with the red band from these systems (0.61 to 0.68 µm). This band is used to produce
both 10 m black-and-white images and 20 m multispectral data. Another change in SPOT-4
is the addition of a separate wide-field-of-view sensor called the Vegetation Monitoring
Instrument (VMI).

Table-8A Orbital characteristics of SPOT series satellites

Features                           SPOT 1, 2 and 3
Altitude                           832 km
Orbital period (min.)              101
Inclination (degrees)              98.7
Equatorial crossing time           10.30 AM (local sun time)
Sensors                            HRV
Temporal resolution (repetivity)   26 days
Stereo viewing capability          5 days
Swath (km)                         60
Resolution                         20 m MLA (multispectral), 10 m PLA (panchromatic)

Table-8B: Characteristics of SPOT Satellites

SPOT-7 (launched June 30, 2014)
  PAN & MSS: 0.45 - 0.90 µm; resolution 1.5 m; swath 60 km; revisit 1 day
SPOT-6 (launched September 9, 2012)
  PAN & MSS (PAN and multispectral): 4 multispectral channels - Blue
  (0.455 - 0.525 µm), Green (0.530 - 0.590 µm), Red (0.625 - 0.695 µm),
  Near-infrared (0.760 - 0.890 µm); resolution 6 m; swath 60 km; revisit 1 day
SPOT-5 (launched May 2002)
  VMI: multispectral, 4 channels - 0.43 - 0.47 (blue), 0.61 - 0.68 (red),
  0.78 - 0.89 (NIR), 1.58 - 1.75 (SWIR) µm; resolution 1000 m; swath
  1000 km; revisit 1 day
  HRG multispectral: 4 channels - 0.5 - 0.59 (green), 0.61 - 0.68 (red) and
  0.79 - 0.89 (NIR) µm at 10 m, 1.58 - 1.75 (SWIR) µm at 20 m; swath 60 km;
  revisit 26 days
  HRG Pan: 1 channel, 0.61 - 0.68 µm; 5 m, combined to generate a 2.5-metre
  product; swath 60 km; revisit 26 days
  HRS Pan: 1 channel, 0.61 - 0.68 µm; 10 m (resampled at every 5 m along
  track); scene 600 x 120 km
SPOT-4 (launched March 24, 1998)
  VMI: multispectral, 4 channels, same as SPOT-5; swath 1000 km
  HRV multispectral: 4 channels - 0.5 - 0.59 (green), 0.61 - 0.68 (red),
  0.79 - 0.89 (NIR), 1.58 - 1.75 (SWIR) µm; resolution 20 m; swath 60 km;
  revisit 26 days
  HRV Pan: 1 channel, 0.61 - 0.68 µm; resolution 10 m; swath 60 km
SPOT-2 and SPOT-3 (launched 1990 and 1993)
  HRV multispectral: 3 channels - 0.5 - 0.59, 0.61 - 0.68 and 0.79 - 0.89 µm;
  resolution 20 m; swath 60 km; revisit 26 days
  HRV Pan: 1 channel, 0.51 - 0.73 µm; resolution 10 m; swath 60 km
SPOT-1 (launched 1986)
  HRV: same as SPOT-2 - multispectral 3 channels at 20 m and Pan at 10 m;
  swath 60 km; revisit 26 days
Sensors

High Resolution Visible (HRV) Imager used in SPOT Satellite


The French SPOT-1 spacecraft carries two nominally identical High Resolution Visible
(HRV) imagers, which can be operated independently or in various coupled modes. In
contrast to the oscillating mirror design used in the Landsat imaging system, the HRV
cameras use charge-coupled device (CCD) arrays as the sensing element, for the first
time in the space environment. Each of the two cameras can be operated in either
multispectral (20 m resolution) mode or panchromatic (10 m resolution) mode. The
swath covered is 60 km, and the cameras can be tilted up to 27° on either side of nadir.
Thus any point within a corridor 950 km wide, centred on the satellite track, can be
observed by programmed camera control. With its tiltable cameras, SPOT-1 also has
in-orbit stereo coverage capability, providing stereo image pairs almost similar to
metric-camera air photos, again for the first time in the space environment.
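The ~950 km accessible corridor can be checked with simple geometry. The sketch below
is a flat-Earth approximation using the orbit and tilt values quoted above; it is
illustrative, not a definitive derivation:

# Rough sketch (flat-Earth approximation) of the ground corridor
# accessible to SPOT's pointable HRV optics.

import math

altitude = 832.0                # km, SPOT orbit (from Table-8A)
max_tilt = math.radians(27.0)   # maximum off-nadir tilt
swath = 60.0                    # km, HRV swath at nadir

corridor = 2 * altitude * math.tan(max_tilt) + swath
print(f"accessible corridor ~= {corridor:.0f} km")
# ~908 km with this simple geometry; accounting for Earth curvature (the
# tilted line of sight meets the surface farther from the track) brings
# the figure up to the ~950 km quoted in the text.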
IRS Satellite Series
The Indian Space programme has the goal of harnessing space technology for
application in the areas of communications, broadcasting, meteorology and remote
sensing. The important milestones crossed so far include Bhaskara-1 and -2 (launched in
1979 and 1981), the experimental satellites, which carried TV cameras and microwave
radiometers. The Indian Remote Sensing Satellite was the next logical step towards
national operational satellites, which directly generate resources information in a variety
of application areas such as forestry, geology, agriculture and hydrology. IRS-1A/1B
carried the Linear Imaging Self Scanning sensors LISS-I & LISS-II. IRS-P2 was launched in
October 1994 on PSLV-D2, an indigenous launch vehicle. IRS-1C was launched on
December 28, 1995, and carried improved sensors like LISS-III, WiFS, the PAN camera, etc.
Details of the IRS series platforms are given in the following section. IRS-P3 was launched
into sun-synchronous orbit by another indigenous launch vehicle, PSLV-D3, on
21.3.1996 from the Indian launching station Sriharikota (SHAR). IRS-1D was launched on
29 September 1997 and IRS-P4 on 26 May 1999, onboard PSLV from Sriharikota.
Table-9 Orbital characteristics of IRS series satellites

Features                   IRS-1A/1B              IRS-P2
Altitude                   904 km                 817 km
Orbital period             103.2 min              101.35 min
Temporal resolution        22 days                24 days
Equatorial crossing time   10.00 AM               10.00 AM
                           (local sun time)
Sensors                    LISS-I, LISS-II*       LISS-II*
                           (* LISS-II has two cameras, A and B)
Swath                      2 x 74 km (LISS-II)    2 x 74 km
                           148 km (LISS-I)        148 km
Resolution                 72.5 m (LISS-I)        36.25 m
                           36.25 m (LISS-II)

Table-10 Orbital characteristics of IRS-1C / IRS-1D

Features                            IRS-1C                  IRS-1D
Orbit type                          Polar sun-synchronous   Polar sun-synchronous
Altitude                            817 km                  780 km (mean)
Inclination                         98.69°                  98.53°
Distance between adjacent traces    117.5 km                111.94 km
Repetivity for LISS-III             24 days                 25 days
Repetivity for WiFS                 5 days                  3 days
Revisit for PAN                     5 days                  3 days
Off-nadir coverage (±26° for PAN)   398 km                  407 km
Stereo viewing capability           5 days                  3 days

Table-11 Characteristics of IRS series satellites

Mangalyaan (Mars Orbiter Mission) - launched 5 November 2013; entered Mars
  orbit 24 September 2014
SARAL - launched 25 February 2013; Argos and AltiKa altimeters
RISAT-1 - launched 26 April 2012; C-band SAR (active radar); resolution
  1-50 m; revisit 25 days
Megha-Tropiques - launched 12 October 2011; MADRAS, SAPHIR, ScaRaB and ROSA
  (microwave radiometer and allied payloads)
RESOURCESAT-2 - launched 20 April 2011
  AWiFS: multispectral, 4 bands, 56 m, swath 740 km, revisit 5 days
  LISS-III: multispectral, 4 bands, 23.5 m, swath 141 km, revisit 24 days
  LISS-IV: multispectral, 3 bands, 5.8 m, swath 70 km, revisit 24 days
CARTOSAT-2B - launched 12 July 2010; PAN, 1 band, 1 m, swath 9.6 km,
  revisit 5 days
OCEANSAT-2 - launched 24 September 2009; OCM: multispectral, 236 m, swath
  1440 km; SCAT: swath 1400 km; revisit 2 days
RISAT-2 - launched 20 April 2009; X-band SAR, resolution 3-8 m; swath 10 km
  and 50 km (max swath 650 km)
CARTOSAT-2A - launched 28 April 2008; PAN, 1 band, 0.8 m, swath 16 km
IMS-1 - launched 28 April 2008; MX: multispectral, 4 bands, 37 m, swath
  151 km; HySI: hyperspectral imager, 64 bands, 505.6 m, swath 125 km
CARTOSAT-1 - launched 5 May 2005; PAN, 1 band, 2.5 m, swath 30 km,
  revisit 5 days
IRS-P6 (Resourcesat-1) - launched 17 October 2003
  AWiFS: multispectral, 4 bands, 56 m, swath 740 km, revisit 5 days
  LISS-III: multispectral, 4 bands, 23 m, swath 142 km, revisit 24 days
  LISS-IV: multispectral, 3 bands, 5.8 m, swath 23.9 km (MX mode) / 70 km
  (PAN mode), revisit 24 days
IRS-P4 (Oceansat) - launched 26 May 1999; OCM: multispectral, 8 bands,
  360 m, swath 1420 km, revisit 2 days; MSMR: microwave radiometer, 4
  frequencies, resolution 120, 80, 40 and 40 km, swath 1360 km
IRS-1D - launched September 1997; WiFS: multispectral, 2 bands, 189 m,
  swath 774 km, revisit 5 days; LISS-III: multispectral, 3 bands at 23 m
  (swath 142 km) plus 1 SWIR band at 70 m (swath 148 km), revisit 24-25
  days; PAN: 1 band, 6 m, swath 70 km
IRS-1C - launched 1995; WiFS: multispectral, 2 bands, 189 m, swath 810 km,
  revisit 5 days; LISS-III: multispectral, 3 bands at 23.6 m (swath 142 km)
  plus 1 SWIR band at 70.8 m (swath 148 km), revisit 24-25 days; PAN:
  1 band, 5.8 m, swath 70 km
IRS-1B - launched 1991; LISS-I: multispectral, 4 bands, 72.5 m, swath
  148 km; LISS-II: multispectral, 4 bands, 36.25 m, swath 74 km; revisit
  22 days
IRS-1A - launched 1988; LISS-I: multispectral, 4 bands, 72.5 m, swath
  148 km; LISS-II: multispectral, 4 bands, 36.25 m, swath 74 km

Sensors
(i) Linear Imaging Self Scanning (LISS) Camera used in IRS-1A & B
The Indian Remote Sensing Satellite IRS-1A, fully designed and fabricated by the Indian
Space Research Organisation (ISRO), was launched on 17 March 1988 by a Russian
launcher. It has four spectral bands in the range of 0.45 to 0.86 µm (0.45 to 0.52 µm,
0.52 to 0.59 µm, 0.62 to 0.68 µm and 0.77 to 0.86 µm) in the visible and near-infrared
range, with two different spatial resolutions of 72.5 m and 36.25 m from one LISS-I and
two LISS-II cameras respectively. It provides repetitive coverage every 22 days. Like the
LANDSAT and SPOT missions, which are designed for global coverage, IRS is also in a
sun-synchronous polar orbit, at about 900 km altitude, and covers a swath of 148 km on
the ground. Like SPOT, it uses linear array (CCD) detectors.
(ii) Linear Imaging Self Scanning-3 Camera (LISS-3)
This camera is configured to provide imagery in three bands in the visible and near-
infrared as well as in a shortwave infrared band. The resolution and swath for the
visible/near-infrared bands are 23.5 m and 142 km, respectively. The detector is a
6000-element CCD-based linear array with a pixel dimension of 10 µm by 7 µm. The
detector is placed at the focus of a refractive optical system consisting of eight lens
elements, which provides a focal length of 360 mm.
The processing of the analogue output video signal is similar to that of PAN. For this
camera, a 7-bit digitization is used, which gives an intensity variation of 128 levels.
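These figures tie together with simple optics: the ground sample distance follows from
the detector pitch, focal length and orbital altitude. A minimal sketch (the 817 km
altitude is taken from the IRS-1C orbit given earlier):

# Sketch using figures from the text: the LISS-3 ground resolution cell
# follows from detector pitch, focal length and altitude, and the 7-bit
# digitization fixes the number of grey levels.

detector_pitch = 10e-6    # m (10 µm across-track, from the text)
focal_length = 0.360      # m
altitude = 817e3          # m, IRS-1C orbit

gsd = detector_pitch * altitude / focal_length
levels = 2 ** 7           # 7-bit quantization

print(f"ground sample distance ~= {gsd:.1f} m")   # ~22.7 m, close to 23.5 m
print(f"grey levels            = {levels}")       # 128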
Table - 12 Characteristics of LISS-3

Band 2                                           0.52-0.59 µm
Band 3                                           0.62-0.68 µm
Band 4                                           0.77-0.86 µm
Band 5                                           1.55-1.70 µm
Geometric resolution                             23.5 m for bands 2, 3, 4
                                                 70.5 m for band 5
Equivalent focal length (bands 2, 3, 4/band 5)   347.5 mm / 301.2 mm
Swath                                            141 km for bands 2, 3, 4
                                                 148 km for band 5
Radiometric resolution                           7 bits
                                                 (10 bits in Resourcesat-2)
Band-to-band registration                        ±0.25 pixel

(iii) Panchromatic camera (PAN)


The PAN camera is configured to provide imagery of the earth in the visible spectrum,
in a panchromatic band (0.5-0.75 µm), with a geometric resolution better than 10 m
and a swath of 70 km. The camera uses an off-axis reflective optics system consisting
of three mirrors to provide the required focal length. A CCD with a 7 µm pixel size is
used as the detector element.
The total swath of 70 km is covered by using three linear-array charge-coupled
detectors, each covering a swath of about 24 km. The central detector is offset from
the other two by a distance in the focal plane corresponding to 8.6 km on the ground.
The other two detectors each cover a swath of 24 km adjacent to the central CCD, and
are aligned with an accuracy of 30 arc-seconds. The overlap of the central swath with
each side swath is 600 m on the ground. Each of the detectors provides four analogue
outputs, which are independently processed by video chains, converted to digital form
and provided to a data handling system for formatting. For PAN data, consistent with
the expected signal-to-noise ratio, a 6-bit digitization is used, which gives 64
radiometric grey levels. The swath arithmetic is sketched below.
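The combined swath is simply the sum of the three sub-swaths less the two overlaps, as
this small arithmetic sketch (using the figures from the text) confirms:

# Arithmetic sketch of how the three PAN CCDs combine to the ~70 km swath.

n_detectors = 3
sub_swath = 24.0     # km covered by each linear array
overlap = 0.6        # km overlap of the central swath with each side swath

total_swath = n_detectors * sub_swath - 2 * overlap
print(f"combined PAN swath ~= {total_swath:.1f} km")   # 70.8 km, i.e. ~70 km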

The PAN payload, with its capability to tilt ±26°, can view (revisit) any particular scene
once in 5 days, if required. Additionally, this provision can be used for acquiring stereo
pairs of imagery. The tilting capability is achieved by steering the camera as a whole
by the required angle, using a steering mechanism to which the PAN camera lugs are fixed.
Table - 13 Characteristics of PAN camera

Geometric resolution from altitude of 817 km   5.8 m
Effective focal length for optics              980 mm
Swath                                          70 km
Field-of-view for optics                       ±2.5° (across track)
                                               ±0.3° (along track)
Spectral band                                  0.5-0.75 µm

(iv) Wide Field Sensor (WiFS)


This camera operates in two bands, B3: 0.62 µm to 0.68 µm (red) and B4: 0.77 µm to
0.86 µm (NIR). Each band uses a 2048-element CCD with an element size of 13 µm by
13 µm. A wide-angle refractive optics system with 8 lens elements is used, with a focal
length of about 56 mm. This payload covers a ground swath of about 770 km with a
resolution of 188 m. This ground swath, with the selected 817 km orbit, provides the
required repetivity for the intended applications.
To cover the wide swath, two separate detector assemblies are used for each band; thus
the entire swath in each band is covered by two detectors, each covering half of the
swath. The signal processing chain is similar to LISS-3, wherein the analogue video
signal is converted to 7 bits and given to the data handling system for formatting.
Table - 14 Characteristics of WiFS

Band 3 0.62-0.68 µm
Band 4 0.77-0.86 µm
Resolution 188.3 m
Swath 810 km
Radiometric resolution 7 bits
Band-to-band registration ±0.25 pixel

(v) High Resolution Linear Imaging Self-Scanner (LISS-IV)

The LISS-IV sensor onboard IRS-P6 operates in three spectral bands in the visible and
near-infrared (VNIR), or in PAN mode, with 5.8 m spatial resolution.
The LISS-IV sensor can be operated in either of two modes (in IRS-P6):
 In multispectral mode (Mx), LISS-IV covers a swath of 23 km (selectable out of the
70 km total swath) in all three bands.
 In mono mode (Mono), the full swath of 70 km is covered in any one single band,
selectable by ground command (nominally B3, the red band).

The changes in RESOURCESAT-2 compared to RESOURCESAT-1 are: enhancement of the
LISS-IV multispectral swath from 23 km to 70 km, and improved radiometric resolution,
from 7 bits to 10 bits for LISS-III and LISS-IV and from 10 bits to 12 bits for AWiFS.

Table - 15 Characteristics of LISS-IV

IGFOV              5.8 m at nadir
Spectral bands     B2: 0.52-0.59
                   B3: 0.62-0.68
                   B4: 0.77-0.86
Swath              23.9 km (multispectral mode in P6; 70 km in Resourcesat-2)
Integration time   0.877714 ms
Quantization       10 bits
No. of gains       Single gain (dynamic range obtained by sliding 7 bits out of 10 bits)

(vi) Advanced Wide Field Sensor (AWiFS)


The Advanced Wide Field Sensor (AWiFS), with twin cameras, has a 56 m nadir
resolution with a 740 km combined swath and a five-day revisit time. To cover such a
wide swath, the AWiFS camera is split into two separate electro-optic modules (AWiFS-A
and AWiFS-B) tilted by 11.94 degrees with respect to each other.

Table - 16 Characteristics of AWiFS

IGFOV 56m (nadir)


70m (at field edge)
Spectral Bands B2: 0.52-0.59
B3: 0.62-0.68
B4: 0.77-0.86
B5: 1.55-1.70
Swath 370 km each head
740 km (combined)
Integration Time 9.96 msec
Quantization 10 bits in P6
12 bits in Resourcesat 2
No. of gains Single gain

(vii) Ocean Colour Monitor (OCM)


OCM is a solid state camera operating in eight narrow spectral bands. The camera is
used to collect data on chlorophyll concentration, detect and monitor phytoplankton
blooms and obtain data on atmospheric aerosols and suspended sediments in the
water.

(viii) Multi-frequency Scanning Microwave Radiometer (MSMR)
MSMR, which operates at four microwave frequencies in both vertical and horizontal
polarisation, is used to collect data on sea surface temperature, wind speed, cloud water
content and water vapour content in the atmosphere above the ocean.

High Spatial Resolution satellites


IKONOS
The IKONOS-2 satellite was launched in September 1999 and has been delivering
commercial data since early 2000. IKONOS is the first of the next generation of high
spatial resolution satellites. IKONOS records four channels of multispectral data at 4 m
resolution and one panchromatic channel at 1 m resolution. This makes IKONOS the
first commercial satellite to deliver near-photographic-quality imagery of anywhere in
the world from space.

Table-17 Spectral Characteristics of IKONOS data

Band           Wavelength                 Resolution
Panchromatic   0.45 - 0.90 µm (visible)   1 m
1              0.45 - 0.52 µm (blue)      4 m
2              0.52 - 0.60 µm (green)     4 m
3              0.63 - 0.69 µm (red)       4 m
4              0.76 - 0.90 µm (near IR)   4 m

Radiometric resolution: data are collected as 11 bits per pixel (2048 grey tones).
The timings of collecting/receiving IKONOS data and the satellite orbit characteristics
vary considerably depending on the accuracy of the product, its extent and area. The
applications for these data are numerous: in particular, they are used for large-scale
mapping, for creating precise height models (e.g. for micro-cellular radio), and for every
application requiring the utmost detail in areas that are inaccessible for aerial
photography.

Meteorological Satellites
Designed specifically to assist in weather prediction and monitoring, meteorological
satellites, or meteosats, generally incorporate sensors that have very coarse spatial
resolution compared to land oriented systems. On the other hand, meteosats afford the
advantages of global coverage at very high temporal resolution. Accordingly, meteosat
data have been shown to be useful in natural resource applications where frequent,
large area mapping is required and fine detail is not. Apart from the advantage of
depicting large areas at high temporal resolution, the coarse spatial resolution of
meteosats also greatly reduces the volume of data to be processed for a particular
application.
Numerous countries have launched various types of meteosats, with a range of orbit
and sensing system designs, e.g. the NOAA series (operated by the U.S. and named after
the National Oceanic and Atmospheric Administration), which have near-polar, sun-
synchronous orbits. In contrast, the GOES and INSAT series satellites are in
geostationary orbits. India has launched the INSAT series satellites, which are combined
telecommunication and meteorological satellites.
INSAT Series
INSAT satellites are basically communication satellites, used for telecommunication and
broadcasting, which also carry a meteorological sensor for weather monitoring. These
satellites are used in day-to-day weather forecasting, cyclone monitoring, etc. The
sensor is the Very High Resolution Radiometer (VHRR). Among this series, the most
powerful satellite is INSAT-2C, launched from French Guiana in December 1995 and
weighing 2070 kg, placed in geostationary orbit. This satellite heralded a new era in
telecommunication by introducing mobile telephony. The details are given below.
Table-19 Orbital characteristics of INSAT series satellites

Altitude 36000 km
Nature Geostationary
Repetitive coverage 3 hr.
Sensor VHRR
Resolution 2.75 km
Spectral bands 0.55 - 0.75 µm
10.5 - 12.5 µm.
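As a cross-check on the 36,000 km altitude, the standard two-body relation for a
24-hour (sidereal-day) orbit can be evaluated; this sketch is illustrative and not
taken from the text:

# Sketch (standard two-body formula): checking that a geostationary orbit
# sits near the ~36,000 km altitude quoted for INSAT.

import math

MU = 3.986004e14    # m^3/s^2, Earth's gravitational parameter
R_EARTH = 6371e3    # m, mean Earth radius
T = 86164.1         # s, one sidereal day

semi_major = (MU * T ** 2 / (4 * math.pi ** 2)) ** (1.0 / 3.0)
altitude = (semi_major - R_EARTH) / 1000.0
print(f"geostationary altitude ~= {altitude:.0f} km")   # ~35,800 km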

FUTURE INDIAN SATELLITE MISSIONS


Encouraged by the successful operation of the present IRS missions, many more
missions have been planned for realization in the next few years. These missions will
have suitable sensors for applications in cartography, crop and vegetation monitoring,
oceanography and atmospheric studies.
RISAT – 1
Radar Imaging Satellite (RISAT) is a microwave remote sensing satellite carrying a
Synthetic Aperture Radar (SAR). The satellite weighing around 1850 kg is in the final
stages of development for launch by PSLV-XL during third quarter of 2010 into a 536 km
orbit with 25 days repetitivity with an added advantage of 12 days inner cycle for Coarse
Resolution ScanSAR mode.

Megha-Tropiques
ISRO and French National Space Centre (CNES) signed a Memorandum of Understanding
(MOU) in 2004-05 for the development and implementation of Megha-Tropiques
(Megha meaning cloud in Sanskrit and Tropiques meaning tropics in French). The launch
of Megha-Tropiques is planned during the fourth quarter of 2010.
Megha-Tropiques is aimed at understanding the life cycle of convective systems and
their role in the associated energy and moisture budget of the atmosphere in the
tropical regions. The satellite will carry an imaging radiometer, the Microwave Analysis
and Detection of Rain and Atmospheric Structures (MADRAS), a six-channel humidity
sounder (SAPHIR), a four-channel Scanner for Radiation Budget Measurement (SCARAB)
and a GPS Radio Occultation System (GPS-ROS).

SARAL
The Satellite for ARGOS and ALTIKA (SARAL) is a joint ISRO-CNES mission, planned to
be launched during 2011. The CNES-provided payload consists of ALTIKA, a Ka-band
radar altimeter operating at 35.75 GHz. A dual-frequency, total-power type microwave
radiometer (23.8 and 37 GHz) is embedded in the altimeter to correct for tropospheric
effects on the altimeter measurement. Doppler Orbitography and Radio-positioning
Integrated by Satellite (DORIS) on board enables precise determination of the orbit. A
Laser Retroreflector Array (LRA) helps to calibrate the precise orbit determination
system and the altimeter system several times throughout the mission.

ASTROSAT
ASTROSAT is the first dedicated Indian astronomy satellite mission, which will enable
multi-wavelength observations of celestial bodies and cosmic sources in the X-ray and
UV spectral bands simultaneously. The scientific payloads cover the visible (3500-6000
Å), UV (1300-3000 Å), and soft and hard X-ray regimes (0.5-8 keV; 3-80 keV). The
uniqueness of ASTROSAT lies in its wide spectral coverage extending over the visible,
UV, soft and hard X-ray regions.

GROUND RECEIVING STATION PRODUCTS

The GRS products can be either standard or value-added/special products. Standard
products are generated after applying radiometric and geometric corrections.
Special/value-added products are generated by further processing the standard
products through mosaicing/merging/extraction/enhancement of the data.

The raw data recorded at the earth station are corrected to various levels of processing
at the data processing systems:

 Level 0: uncorrected (raw data)
 Level 1: radiometrically corrected, and geometrically corrected only for earth
rotation (browse product)
 Level 2: both radiometrically and geometrically corrected (standard product)
 Level 3: special processing such as merging, enhancement etc. after Level 2
corrections (special product)
 Precision product
 Value-added product, e.g. vegetation index map, digital terrain model

Radiometric distortions
 Non Uniform Response of the Detectors
 Specific Detector element Failure
 Data loss during data communication or Archival/Retrieval
 Narrow dynamic range
 Image to Image Variations

Geometric distortions
 Scene related
 Sensor related
 Space Craft related
 Multi Image Mosaicing
 Map Projection
 Geocoded Correction – True North Rotation

Data dissemination
The data are recorded on Digital Linear Tapes (DLTs) or CD-ROMs, DVDs depending on
the mission and archived for providing data products to users as and when orders are
received.
Satellite data products are available on photographic and digital media. Photographic
products can be supplied as films or prints. Digital products can be supplied in the form
of CD-ROMs or DVDs, or can be downloaded through FTP services.
Generally, single-band data, such as PAN data or one band of data from multispectral
sensors, are provided in B/W. Similarly, photographic colour products, called False
Colour Composites (FCC), can be provided for multispectral data; a minimal sketch of
how an FCC is composed follows below. The output scale for prints can vary from
1:1 M to 1:5000.
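As an illustration of how a standard FCC is composed, the minimal sketch below maps
co-registered NIR, red and green bands (here synthetic numpy arrays standing in for
real band data) to the display's red, green and blue channels, so that vegetation
appears red:

# Minimal sketch (assumed numpy arrays, not a specific product format) of a
# standard False Colour Composite.

import numpy as np

def false_colour_composite(nir: np.ndarray, red: np.ndarray,
                           green: np.ndarray) -> np.ndarray:
    """Stack three co-registered bands into an 8-bit RGB image."""
    def stretch(band):
        # simple 2-98 percentile linear contrast stretch to 0-255
        lo, hi = np.percentile(band, (2, 98))
        return np.clip((band - lo) / (hi - lo) * 255, 0, 255)
    return np.dstack([stretch(nir), stretch(red), stretch(green)]).astype(np.uint8)

# Example with synthetic data standing in for e.g. LISS-III bands 4, 3, 2:
rng = np.random.default_rng(0)
nir, red, green = (rng.integers(0, 128, (100, 100)).astype(float) for _ in range(3))
fcc = false_colour_composite(nir, red, green)
print(fcc.shape, fcc.dtype)   # (100, 100, 3) uint8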

Digital Data Product formats


Digital Data are supplied in the following formats
 LGSOWG Superstructure Format (all satellites/sensors except NOAA,AQUA and
TERRA)
 Fast Format (all satellites/sensors except NOAA, AQUA and TERRA)
 GeoTIFF- Gray Scale (from IRS-1C onwards except NOAA, AQUA and TERRA)
 GeoTIFF - RGB single band, FCC or NCC (from IRS-1C onwards except NOAA, AQUA
and TERRA)
 HDF (AQUA and TERRA and OCEANSAT-2)
The digital data format document is provided along with the digital data.

Satellite Referencing Scheme:


A referencing scheme, unique to each satellite mission, is a means of conveniently
identifying the geographic location of points on the earth. The scheme is designated by
Paths and Rows. The Path-Row concept is based on the nominal orbital characteristics.

PATH
An orbit is the course of motion taken by the satellite in space and the ground trace of
the orbit is called a 'Path'.
e.g. IRS-1C (source: NRSC)
In a 24-day cycle, the satellite completes 341 orbits, with an orbital period of 101.35
minutes. In this way, the satellite completes approximately 14 orbits per day. Though
the number of orbits and the number of paths are the same, the designated path number
in the referencing scheme and the orbit number are not the same.

On day one (D1), the satellite covers orbit numbers 1 to 14, which as per the referencing
scheme will be path numbers 1, 318, 294, 270, 246, 222, 198, 174, 150, 126, 102, 78, 54
and 30, assuming that the cycle starts with path 1.

So orbit 1 corresponds to path 1, orbit 2 to path 318, orbit 3 to path 294, etc. Path
number 1 is assigned to the track which is at 29.7° West longitude. The gap between
successive paths is 1.055°. All subsequent orbits fall westward. Due to the limitation of
antenna drive speed, it is difficult to track the satellite around zenith: above 86°
elevation, if a pass occurs, the data may be lost for a few seconds. To reduce this to a
minimum, path 1 is positioned in such a manner that the data reception station lies
exactly between two nominal paths, namely 99 and 100. A sketch of this path numbering
follows.
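The orbit-to-path mapping can be captured in a few lines. The sketch below reproduces
the day-one sequence quoted above; the decrement of 24 paths per orbit, modulo 341,
is inferred from that sequence rather than stated in the text:

# Sketch of the IRS-1C path numbering: 341 paths in a 24-day cycle, with
# each successive orbit landing 24 path numbers lower (modulo 341),
# assuming the cycle starts with path 1.

TOTAL_PATHS = 341

def path_for_orbit(orbit: int) -> int:
    """Designated path number for orbit 1, 2, 3, ... within one cycle."""
    return (0 - 24 * (orbit - 1)) % TOTAL_PATHS + 1

print([path_for_orbit(n) for n in range(1, 15)])
# -> [1, 318, 294, 270, 246, 222, 198, 174, 150, 126, 102, 78, 54, 30],
# matching the day-one sequence given in the text.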

ROW
The lines joining the corresponding scene centers of different paths are parallel to the
equator and are called ‘Rows’.
Along a path, the continuous stream of data is segmented into a number of scenes
which are framed in such a manner that its centre lies on the equator which is taken as
the reference line for segmentation.
e.g. LISS-III (source: NRSC)
A LISS-III scene, consisting of 6000 lines, is framed such that the centre of the scene
lies on the equator. The next scene is defined such that its centre lies exactly 5,703
lines from the equator; the centre of the next scene is then defined 5,703 lines further
northwards, and so on. This is continued up to 81° North latitude. The uniformly
separated scene centres are such that the same rows of different paths fall at the same
latitude. Row number 1 falls around 81° North latitude, row number 41 is near 40°
North, and the row number of the scene lying on the equator is 75. The Indian region is
covered by row numbers 30 to 90 and path numbers 65 to 130.
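A small sketch of this framing, assuming the along-track line spacing equals the 23.5 m
LISS-III resolution (an approximation; the text does not state the line spacing
explicitly):

# Sketch of the LISS-III row framing: scene centres are spaced 5,703 lines
# apart along the track, and row 75 lies on the equator with row numbers
# decreasing northwards.

lines_between_centres = 5703
pixel_size_m = 23.5     # LISS-III resolution, from the text (assumed to
                        # equal the along-track line spacing)

centre_spacing_km = lines_between_centres * pixel_size_m / 1000.0
print(f"scene-centre spacing along track ~= {centre_spacing_km:.0f} km")  # ~134 km

def nominal_row(scenes_north_of_equator: int) -> int:
    """Row number for a scene centre n scene-spacings north of the equator."""
    return 75 - scenes_north_of_equator

print(nominal_row(0))    # 75 -> scene centred on the equator
print(nominal_row(74))   # 1  -> northernmost row, around 81 deg N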

Scene Definition
The camera scans the ground track line by line continuously. The satellite motion along
the track provides continuous imaging of the ground. This continuous stream of data is
segmented to convenient sizes. These segments are called scenes.

Use of Referencing Scheme


The Path-Row referencing scheme eliminates the use of latitudes and longitudes and
facilitates convenient and unique identification of a geographic location. It is useful in
preparing accession and product catalogues. The actual scene may be displaced slightly
from the nominal scene defined in the referencing scheme, due to orbit and attitude
variations during operation. If a user's area of interest lies in the border/overlapping
region of a scene, the user may have to order the overlapping scenes in addition to the
normal scene.

INTRODUCTION TO IMAGE INTERPRETATION

Aerial photographs, as well as imagery obtained by remote sensing using aircraft or
spacecraft as platforms, have applicability in various fields. By studying the qualitative
and quantitative aspects of images recorded by various sensor systems, such as aerial
photographs (black-and-white, black-and-white infrared, colour and colour infrared),
multiband photographs and satellite data (both pictorial and digital), including thermal
and radar imagery, an interpreter well experienced in his field can derive a lot of
information.

Image Interpretation

Image interpretation is defined as the act of examining images to identify objects and
judge their significance. An interpreter studies remotely sensed data and attempts,
through a logical process, to detect, identify, measure and evaluate the significance of
environmental and cultural objects, patterns and spatial relationships. It is an
information extraction process.

Anyone who looks at a photograph or an image in order to recognize what it shows is
an interpreter. A soil scientist, a geologist or a hydrogeologist, a forester or a planner,
trained in image interpretation, can recognize the vertical view presented by the ground
objects on an aerial photograph or a satellite image, which enables him or her to detect
many small or subtle features that an amateur would either overlook or misinterpret. An
interpreter is, therefore, a specialist trained in the study of photography or imagery, in
addition to his or her own discipline. The present discussion mainly pertains to the
techniques of visual interpretation, the application of various instruments and the
extraction of information.

Aerial photographs, as well as imagery obtained by remote sensing employing
electromagnetic energy as the means of detecting and measuring target/object
characteristics, have applicability to various fields for four basic reasons.

First - They represent a large area of the earth from a perspective view and provide a
format that facilitates the study of objects and their relationships.
Second - Certain types of imagery and aerial photographs can provide a 3-D view.
Third - Characteristics of objects not visible to the human eye can be transformed
into images.
Fourth - They provide the observer with a permanent record/representation of
objects at any moment of time. In addition, the data are real-time, repetitive
and, when in digital form, computer compatible for quick analysis.
BASIC PRINCIPLES OF IMAGE INTERPRETATION

Images and their interpretability

 An image taken from the air or space is a pictorial presentation of the pattern of a
landscape.
 The pattern is composed of indicators of objects and events that relate to the physical,
biological and cultural components of the landscape.
 Similar conditions, in similar circumstances and surroundings, reflect similar patterns,
and unlike conditions reflect unlike patterns.
 The type and amount of information that can be extracted is proportional to the
knowledge, skill and experience of the analyst, the methods used for interpretation
and the analyst's awareness of any limitations.

Factors Governing the Quality of an image

In addition to the inherent characteristics of an object itself, the following factors
influence image quality:

 Sensor characteristics (film types, digital systems)
 Season of the year and time of day
 Atmospheric effects
 Resolution of the imaging system and scale
 Image motion
 Stereoscopic parallax

Factors Governing Interpretability

1. Visual and mental acuity of the interpreter
2. Equipment and technique of interpretation
3. Interpretation keys, guides, manuals and other aids

Visibility of Objects

The objects on aerial photographs or imagery are represented in the form of photo
images: in tones of grey in B/W photography, and in different colours/hues in
colour/false-colour photography. The visibility of objects in the images varies due to:

a) the inherent characteristics of the objects;
b) the quality of the aerial photography or imagery.
Inherent Characteristics of Objects

In any photographic image forming process, the negative is composed of tiny silver
deposits formed by the action of light on photosensitive film during exposure. The amount
of light received by the various sections of the film depends on the reflection of
electromagnetic radiation (EMR) from various objects. This light, after passing through the
optical system, gives rise to different tones and textures.

In visual interpretation, an interpreter is primarily concerned with recognizing changes
in tonal values, thereby differentiating an object of a certain reflective characteristic
from another. However, he must be aware that the same object, under different
moisture or illumination conditions, and depending on the wavelength of the incident
energy, may reflect a different amount of light. For this reason, a general key based on
the tone characteristics of objects cannot be prepared. In such cases, other
characteristics of objects, such as their shape, size and pattern, help in their
recognition.

Quality of Aerial Photography/Imagery:


The quality of image interpretation depends on the quality of the basic material on
which the images are formed. Normally, in visual interpretation, these images are
formed on the photograph and represented in tones of grey or in colours of various
hues, chroma and values. A study of the factors affecting image quality and the
characteristics of images is essential from an interpreter's point of view.

The Tonal or Colour Contrast Between an Image and Its Background

Photographic tone contrast is the difference in brightness between an image and its
background. Similarly, in colour photography colour contrast is the result of all hue values
and chroma differences between the image and its background. The tonal contrast can be
sufficiently increased with proper filters.

Image Sharpness Characteristics

Sharpness is the abruptness with which tone or colour contrasts appear on the
photograph or imagery. Both tone and sharpness enable an interpreter to distinguish one
object from another. To a large extent, image sharpness is dependent on the focussing
ability of the optical system. Image sharpness is closely related to the resolution of the
optical system.
Stereoscopic Parallax Characteristics

Stereoscopic parallax is the displacement of the apparent position of an image with
respect to a reference point of observation. Sufficient parallax is necessary in order to
distinguish objects from their shadows. Parallax depends on the height of an object, the
flying height and the stereobase or its corollary, the forward overlap. Stereoscopic
parallax can be improved by choosing the right base/height (B/H) ratio.
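A worked sketch using the standard photogrammetric parallax height equation, with
assumed flying height and parallax measurements (none of these numbers are from the
text):

# Worked sketch of height from stereoscopic parallax:
#     h = H * dP / (Pb + dP)
# where H is the flying height above ground, Pb the absolute parallax of
# the object's base and dP the differential parallax between top and base.

flying_height = 3000.0   # m above ground, assumed
base_parallax = 90.0     # mm, assumed absolute parallax
diff_parallax = 2.1      # mm, measured between top and base of the object

height = flying_height * diff_parallax / (base_parallax + diff_parallax)
print(f"object height ~= {height:.1f} m")   # ~68 m
# A larger base/height (B/H) ratio increases the measured parallaxes,
# making small height differences easier to detect.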

The above investigation may appear oversimplified, as a number of other factors can be
mentioned which obviously affect image quality. However, for the purpose of
simplification, we may conclude that other factors influence image quality indirectly,
through their effect on tone, sharpness or parallax.
In general, if image motion and exposure times were no problem, we would obviously
use fine-grain, high-definition, slow photographic material, with an appropriate filter, in
order to get better sharpness and contrast.

ELEMENTS OF IMAGE INTERPRETATION

Image interpretation is essential for the efficient and effective use of the data. While
the above properties of aerial photographs/imagery help an interpreter to detect objects
through their tonal variations, he must also take advantage of other important
characteristics of the objects in order to recognize them. The following elements of
image interpretation, shown in the figure, are regarded as being of general significance,
irrespective of the precise nature of the imagery and the features it portrays.

Shape
Numerous components of the environment can be identified with reasonable
certainty merely by their shape. This is true of both natural features and man-made
objects.

Size
In many cases, the length, breadth, height, area and/or volume of an object can be
significant, whether for surface features (e.g. different tree species) or atmospheric
phenomena (e.g. cumulus versus cumulonimbus clouds). The approximate size of many
objects can be judged by comparison with familiar features (e.g. roads) in the same
scene.

Tone

We have seen how different objects emit or reflect different wavelengths and
intensities of radiant energy. Such differences may be recorded as variations of picture
tone, colour or density, which enable discrimination of many spatial variables: for
example, different crop types on land, or water bodies of contrasting depths or
temperatures at sea. The terms 'light', 'medium' and 'dark' are used to describe
variations in tone.

Shadow

Hidden profiles may be revealed in silhouette (e.g. the shapes of buildings or the
forms of field boundaries). Shadows are especially useful in geomorphological studies,
where micro-relief features may be easier to detect under conditions of low-angle solar
illumination than when the sun is high in the sky. Unfortunately, deep shadows in areas
of complex detail may obscure significant features, e.g. the volume and distribution of
traffic on a city street.

Pattern

Repetitive patterns of both natural and cultural features are quite common, which
is fortunate because much image interpretation is aimed at the mapping and analysis of
relatively complex features rather than the more basic units of which they may be
composed. Such features include agricultural complexes (e.g. farms and orchards) and
terrain features (e.g. alluvial river valleys and coastal plains).

Texture

Texture is an important image characteristic closely associated with tone, in the sense
that it is a quality that permits two areas of the same overall tone to be differentiated
on the basis of microtonal patterns. Common image textures include smooth, rippled,
mottled, lineated and irregular. Unfortunately, texture analysis tends to be rather
subjective, since different interpreters may use the same terms in slightly different
ways. Texture is rarely the only criterion of identification or correlation employed in
interpretation. More often it is invoked as the basis for a subdivision of categories
already established using more fundamental criteria. For example, two rock units may
have the same tone but different textures.

Site

At an advanced stage in image interpretation, the location of an object with respect to
terrain features or other objects may be helpful in refining the identification and
classification of certain picture contents. For example, some tree species are found
more commonly in one topographic situation than in others, while in industrial areas
the association of several clustered, identifiable structures may help us determine the
precise nature of the local enterprise. For example, the combination of one or two tall
chimneys, a large central building, conveyors, cooling towers and solid fuel piles points
to the correct identification of a thermal power station.

Resolution

The resolution of a sensor system may be defined as its capability to discriminate two
closely spaced objects from each other. More than most other picture characteristics,
resolution depends on aspects of the remote sensing system itself, including its nature,
design and performance, as well as the ambient conditions during the sensing
programme and the subsequent processing of the acquired data. An interpreter must
have knowledge of the resolution of the various remote sensing data products.

Stereoscopic Appearance

When the same feature is photographed from two different positions, with overlap
between successive images, an apparently solid model of the feature can be seen under
a stereoscope. Such a model is termed a stereomodel, and the three-dimensional view it
provides can aid interpretation. This valuable information cannot be obtained from a
single print.

In practice, these nine elements assume a variety of ranks of importance.
Consequently, the order in which they may be examined varies from one type of study
to another. Sometimes they can lead to the assessment of conditions not directly visible
in the images, in addition to the identification of features or conditions that are
explicitly revealed. The process by which related invisible conditions are established by
inference is termed "convergence of evidence". It is useful, for example, in assessing
the social class and/or income group occupying a particular neighbourhood, or the soil
moisture conditions in agricultural areas.

Image interpretation may be very general in its approach and objective, such as in
the case of terrain evaluation or land classification. On other occasions it is highly specific,
related to clear-cut goals in such fields as geology, forestry, transport studies and soil
erosion mapping. In no instance should the interpreter fail to take into account features
other than those for which he or she is specifically searching. Failure to give adequate
consideration to all aspects of a terrain is, perhaps, the commonest source of
interpretation error.

The interpretation of images is therefore an essentially deductive process, and the
identification of certain key features leads to the recognition of others. Once a suitable
starting point has been selected, the elements listed earlier are considered either
consciously or subconsciously. The completeness and accuracy of the results depend on
an interpreter's ability to integrate such elements in the most appropriate way to
achieve the objectives that have been set for him or her.

TECHNIQUES OF IMAGE INTERPRETATION

The development of interpretation techniques has been mainly by the empirical
method. The gap between the photo image on the one hand and the reference level,
i.e. the level of knowledge in a specific field in the human mind, on the other, is
bridged by the use of image interpretation. The techniques adopted for one discipline
may differ from those adopted for another. The sequence of activity and the search
method may have to be modified to suit specific requirements.

Image interpretation comprises at least three mental acts that may or may not be
performed simultaneously:

i) the measurement of images of objects;
ii) identification of the objects imaged;
iii) appropriate use of this information in the solution of the problem.
In visual interpretation, the methodology of interpretation for each separate
discipline will depend on :

 Kind of information to be interpreted
 Accuracy of the results to be obtained
 The reference level of the person executing the interpretation
 Kind and type of imagery or photographs available
 Instruments available
 Scale and other requirements of the final map
 External knowledge available and any other sensory surveys that have been or
will be made in the near future in the same area

From the scrutiny of the above list, it is evident that no stereotyped approach can
be prescribed for the techniques or the methodology of photo-interpretation. An
interpreter must work out the plan of operations and the techniques depending on the
project's special requirements.

In carrying out this task, an interpreter may use many more types of data than
those recorded on the images he is to interpret. Many sources, such as literature,
laboratory measurements, analysis, field work and ground and aerial photographs (or
imagery) make up this collateral material.

Activities of Image-interpretation

Image interpretation is a complex process comprising physical as well as mental
activities. It demands familiarity with so wide a variety of stimuli that even the most
accomplished interpreter is occasionally dependent on reference materials.

The reference material in the form of identification keys is a useful aid in image
interpretation. Many types of image interpretation keys are available or may be
constructed depending on the abilities of the interpreter and the purpose to be served by
the interpretation.

METHODS OF SEARCH AND SEQUENCE OF INTERPRETATION

In visual interpretation, and whenever possible, especially when examining vertical or
nearly vertical photographs, the scene is viewed stereoscopically. The sequence begins
with the detection and identification of objects, followed by measurements of the
image. The image is then considered in terms of information, usually non-pictorial, and
finally deductions are made. The interpreter should work methodically, proceeding from
general considerations to specific details and from known to unknown features.

There are two basic methods that may be used to study aerial imagery:

"Fishing expedition" - an examination of each and every object so as not to miss anything,

"Logical search" - quick scanning and selective intensive study.

Sequence of Activities

Normally the activities in an image-interpretation sequence include the following:


Detection

Detection means selectively picking out an object or element of importance for the
particular kind of interpretation in hand. It is often coupled with recognition, in which case
the object is not only seen but also recognized.

Recognition and Identification

Recognition and identification together are sometimes termed photo-reading. They are
fundamentally the same process and refer to the classification of an object within a
known category, by means of specific or local knowledge, upon the object's detection
in a photo image.

Analysis

Analysis is the process of separating or delineating sets of similar objects. In analysis,
boundary lines are drawn separating the groups, and the degree of reliability of these
lines may be indicated.

Deduction

Deduction may be directed to the separation of different groups of objects or elements
and the deduction of their significance based on converging evidence. The evidence is
derived mainly from visible objects or from invisible elements, which give only partial
information on the nature of certain correlative indications.

Classification

Classification establishes the identity of a surface or an object delineated by
analysis. It includes codifying the surfaces into a pertinent system for use in field
investigation. Classification is made in order to group surfaces or objects according to those
aspects that, from a certain point of view, bring out their most characteristic features.

Idealization

Idealization refers to the process of drawing standardized representations of
what is actually seen in the photo image. This process is helpful for the subsequent use of
photograph/imagery during field investigations and in the preparation of base maps.

These processes would be better explained by taking an example. If investigations
of dwellings are to be carried out, the first step would be to detect photo images having a
rectangular shape etc. The next step would be to recognize, say, a single storey
construction and a double storey construction. Delineation of the two groups of objects
would be done under the process of analysis in which, a boundary line may be drawn
separating the two groups. At this stage, in view of various converging evidence, it may be
deduced that one group is a single storey dwelling. In more difficult cases this would be
done in the process of classification, and a code number assigned to the groups to help
field examinations. Cartographic representation would be made under the process of
idealization.

Convergence of evidence:

Image interpretation is basically a deductive process. Features that can be
recognized and identified directly lead the image interpreter to the identification and
location of other features. Even though all aspects of an area are inextricably intertwined,
the interpreter must begin some place; he cannot consider drainage, landform, vegetation,
and manmade features simultaneously. He should begin with one feature or group of
features and then move on to the others, integrating each of the facets of the terrain as he goes.
For each terrain, the interpreter must find his own point of beginning and then consider
each of the various aspects of the terrain in logical fashion. Deductive image interpretation
requires conscious or unconscious consideration of the elements of image interpretation
listed earlier. The completeness and accuracy of image interpretation are proportional to
the interpreter's understanding of how and why images show shape, size, tone, shadow,
pattern, and texture, while an understanding of site, association, and resolution
strengthens the interpreter's ability to integrate the different features making up a terrain.
For beginners, systematic consideration of the elements of image interpretation
should precede integrated terrain interpretation.

The principle of convergence of evidence requires the interpreter first to recognize
basic features or types of features and then to consider their arrangement (pattern) in the
areal context. Several interpretations may suggest themselves. Critical examination of the
evidence usually shows that all interpretations but one are unlikely or impossible. The
greatest difficulty in interpreting images involves judging degrees of probability.

Sensors in Photographic Image Interpretation

As stated earlier, characteristics not visible to the human eye can also be recorded
and displayed by using proper sensor types. Digital data can also be transferred onto any

type of film, depending on the type of study to be carried out. Normally, four types of
film are used for visual data display, as follows:

a) Black-and-white panchromatic
b) Black-and-white infrared
c) Colour
d) Colour infrared/false colour

All of the above types are available in different grades and sensitivities that can be
preselected for a particular use. An interpreter must know the characteristics of each of
these before starting an interpretation job. The same is true for the digital data display for
multispectral, thermal and radar imagery.

INSTRUMENTS FOR VISUAL INTERPRETATION AND TRANSFER OF DATA

Interpretation Instruments

Monocular instruments: magnifiers
Stereoscopic instruments: mirror and pocket stereoscopes, interpretoscope,
zoom stereoscope, scanning mirror stereoscope

Instruments for Transfer of Data

For flat terrain: sketchmaster, stereosketchmaster, zoom transferscope,
optical pantograph or reflecting projector
For hilly terrain: stereoplotters

An orthophoto, together with its stereo-mate, can be used for interpretation and
delineation. Since the preparation of an orthophoto and its stereo-mate is a complex
process, the method is not so popular.
Conclusion

The scope of image-interpretation as a tool for analysis and data collection is
widening with the advance of remote sensing techniques. Space images have already found
their use in interpretation for the earth sciences. Because of the flexibility of its techniques
and substantial gains in accuracy, speed and economy over conventional ground methods,
the future of image-interpretation is assured. However, great endeavor is required on the
part of the interpreter to assess his or her own empirical knowledge in order to formulate
the optimum data requirements for different disciplines. This is essential for the better

development of image-interpretation and for widening the scope of application of its
techniques.

Spectral Signature of Land Cover Features

As EMR is incident on the earth's surface, the behaviour of land in a particular
locality is mainly due to the component of land exposed at the surface at that locality. The
component may be a vegetation canopy, water body, barren rock, loose soil, built-up area
or a mixture of these. Since each of these exhibits a typical spectral signature influenced
by many other parameters of its own, they are to be considered separately to understand
the nature of EMR interaction with each component. Spectral signatures of water,
vegetation and soil are discussed in detail in the following sections.

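As a minimal illustration of how spectral signatures can be used, the Python sketch
below assigns a pixel to the nearest of a few hypothetical band-mean signatures; the
reflectance values are illustrative assumptions, not measurements, since real signatures
vary with the factors discussed in the sections that follow.

import numpy as np

# Illustrative (not measured) mean reflectances in green, red and NIR bands.
signatures = {
    "water":      np.array([0.05, 0.03, 0.01]),
    "bare soil":  np.array([0.12, 0.18, 0.25]),
    "vegetation": np.array([0.10, 0.05, 0.45]),
}

def classify(pixel):
    """Assign a (green, red, NIR) pixel to the nearest signature."""
    pixel = np.asarray(pixel, dtype=float)
    return min(signatures, key=lambda name: np.linalg.norm(pixel - signatures[name]))

print(classify([0.09, 0.06, 0.40]))  # -> vegetation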
Spectral Reflectance and Spectral Signature of Soil

The majority of the flux incident on a soil surface is reflected or absorbed and little
is transmitted. The reflectance properties of the majority of soils are similar, with a
positive relationship between reflectance and wavelengths, as can be seen in Fig.1. The
five characteristics of a soil that determine its reflectance properties are, in order of
importance: its moisture content, organic content, texture, structure and iron oxide
content. These factors are all interrelated, for example the texture (the proportion of
sand, silt and clay particles) is related to both the structure (the arrangement of sand, silt
and clay particles into aggregates) and the ability of the soil to hold moisture.

Effect of soil texture, structure and soil moisture

The relationship between texture, structure and soil moisture can best be described
with reference to two contrasting soil types. A clay soil tends to have a strong structure,
which leads to a rough surface on ploughing; clay soils also tend to have high moisture
content and as a result have a fairly low diffuse reflectance. In contrast, a sandy soil tends
to have a weak structure, which leads to a fairly smooth surface on ploughing; sandy soils
also tend to have a low moisture content and as a result have fairly high and often specular
reflectance properties. In visible wavelengths the presence of soil moisture considerably
reduces the surface reflectance of soil. This occurs until the soil is saturated, at which
point further additions of moisture have no effect on reflectance.

Reflectance in near and middle infrared wavelengths is also negatively related to
soil moisture. An increase in soil moisture will result in a rapid decrease in reflectance in
water (H2O) and hydroxyl (OH) absorbing wavebands, which absorb at wavelengths centered
at approximately 0.9 µm, 1.9 µm, 2.2 µm and 2.7 µm. The effect of water and hydroxyl
absorption is more noticeable in clay soils, for these soils have much bound water and very
strong hydroxyl absorption properties, as can be seen in Fig.2.

The surface roughness (determined by the texture and structure) and the moisture
content of soil also affect the way in which the reflected visible and near infrared radiation
is polarized. This is because when sunlight is specularly reflected from a smooth wet
surface it becomes polarized to a degree that is positively related to the smoothness and
the wetness of that surface. This effect has been used to estimate soil
surface moisture from aircraft-borne sensors at altitudes of up to 300 meters.

Organic matter

Soil organic matter is dark and its presence decreases the reflectance from the soil
up to an organic matter content of around 4-5 percent. When the organic matter content
of the soil is greater than 5 percent, the soil is black and any further increases in organic
matter will have little effect on reflectance.

Iron Oxide

Iron oxide gives many soils their 'rusty' red coloration by coating or staining
individual soil particles. Iron oxide selectively reflects red light (0.6-0.7 µm). This effect is
so marked that workers have been able to use a ratio of red to green bi-directional
reflectance to locate iron ore deposits from satellite altitudes.
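A band ratio of this kind is straightforward to compute. The sketch below forms a
per-pixel red/green ratio from two hypothetical co-registered reflectance arrays; any
threshold for mapping iron-rich soils would have to be calibrated against field data.

import numpy as np

def red_green_ratio(red, green, eps=1e-6):
    """Per-pixel red/green reflectance ratio; high values suggest iron oxide."""
    red = np.asarray(red, dtype=float)
    green = np.asarray(green, dtype=float)
    return red / (green + eps)  # eps avoids division by zero

red   = np.array([[0.30, 0.12], [0.28, 0.10]])   # hypothetical red band
green = np.array([[0.10, 0.11], [0.09, 0.10]])   # hypothetical green band
print(red_green_ratio(red, green))  # iron-rich pixels show ratios well above 1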

Spectral Reflectance and Spectral Signature of Water

The majority of the radiant flux incident upon water is not reflected but is either
absorbed or transmitted. In visible wavelengths of EMR, little light is absorbed, a small
amount, usually below 5%, is reflected and the rest is transmitted. Water absorbs NIR and
MIR strongly, (Fig.3) leaving little radiation to be either reflected or transmitted. This
results in sharp contrast between any water and land boundaries.

The factors which govern the variability in reflectance of a water body are the
depth of the water, suspended material within the water and surface roughness of the
water.
In shallow water some of the radiation is reflected not by the water itself but from
the bottom of the water body. Therefore, in shallow pools and streams it is often the
underlying material that determines the water body's reflectance properties and colour in
the FCC.

Among the suspended materials, the most common are non-organic sediments,
tannin and chlorophyll. The effect of non-organic silts and clays is to increase scattering
and hence reflectance in visible wavelengths.

Water bodies that contain chlorophyll have reflectance properties that resemble, at
least in part, those of vegetation, with increased green and decreased blue and red
reflectance. However, the chlorophyll content must be high enough for these changes to
be detected.

The roughness of the water surface can also affect its reflectance properties. If the
surface is smooth then light is reflected specularly from the surface, giving very high or very
low reflectance, dependent upon the location of the sensor. If the surface is very rough
then there will be increased scattering at the surface, which in turn will increase the
reflectance.

Spectral Reflectance and Spectral Signature of Vegetation

The spectral reflectance of vegetation over the EMR spectrum depends upon:

1. Pigmentation
2. Physiological structure
3. Leaf moisture content

The hemispherical reflectance of any individual leaf is insufficient to describe the
remotely sensed bi-directional reflectance of a vegetation canopy. This is because a
vegetation canopy is not a large leaf but is composed of a mosaic of leaves, other plant
structures, background and shadow. Hence the spectral reflectance of a vegetation canopy
could vary appreciably due to the effect of the soil background, the presence of senescent
vegetation, the angular elevation of Sun and sensor, the canopy geometry and certain
episodic and phenological canopy changes. Among these some are considered for
discussion here.

Effect of Pigmentation absorption

The primary pigments are chlorophyll a, chlorophyll b, β-carotene and xanthophyll,
all of which absorb visible light for photosynthesis. Chlorophyll a and chlorophyll b, which
are the more important pigments, absorb portions of blue and red light; chlorophyll a
absorbs at wavelengths of 0.43 µm and 0.66 µm and chlorophyll b at wavelengths of
0.45 µm and 0.65 µm. The carotenoid pigments, carotene and xanthophyll, both absorb
blue to green light.

Physiological structure and reflectance in NIR

The discontinuities in the refractive indices within a leaf determine its near infrared
reflectance. These discontinuities occur between membranes and cytoplasm within the
upper half of the leaf and more importantly between individual cells and air spaces of the
spongy mesophyll within the lower half of the leaf.

The combined effects of leaf pigments and physiological structure give all healthy
green leaves their characteristic reflectance properties: low reflectance of red and blue
light, medium reflectance of green light and high reflectance of near infrared radiation
(Fig 4). The major differences in leaf reflectance between species are dependent upon leaf
thickness, which affects both pigment content and physiological structure. For example, a
thick wheat leaf will tend to transmit little and absorb much radiation whereas a flimsy
lettuce leaf will transmit much and absorb little radiation (Fig 5).

Effect of Leaf moisture

Leaf reflectance is reduced as a result of absorption by three major water
absorption bands that occur near wavelengths of 1.4 µm, 1.9 µm and 2.7 µm and two
minor water absorption bands that occur near wavelengths of 0.96 µm and 1.1 µm (Fig. 6).
The reflectance of the leaf within these water absorption bands is negatively related to
both the amount of water in the leaf and the thickness of the leaf. However, water in the
atmosphere also absorbs radiation in these water absorption bands and therefore the
majority of sensors are limited to three 'atmospheric windows' that are free of water
absorption, at wavelengths of 0.3 to 1.3 µm, 1.5 to 1.8 µm and 2.0 to 2.6 µm. Fortunately
within these wavebands, electromagnetic radiation is still sensitive to leaf moisture.
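A trivial check of whether a candidate waveband lies in one of these windows can be
coded directly from the limits quoted above (the limits are taken from the text and are
approximate):

# Approximate water-absorption-free 'atmospheric windows' from the text (in um).
WINDOWS_UM = [(0.3, 1.3), (1.5, 1.8), (2.0, 2.6)]

def in_atmospheric_window(wavelength_um):
    """True if a wavelength (um) falls inside one of the listed windows."""
    return any(lo <= wavelength_um <= hi for lo, hi in WINDOWS_UM)

print(in_atmospheric_window(1.65))  # True: usable SWIR band
print(in_atmospheric_window(1.90))  # False: strong water absorption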

The effect of the soil background

The bi-directional reflectance of the soil has a considerable effect on the
bi-directional reflectance of the vegetation canopy. The soil/waveband combinations that are unsuitable
for the remote sensing of vegetation can be identified. For example, on dark toned soils
with low red bi-directional reflectance there is little change in the red bi-directional
reflectance of the canopy with an increase in the canopy LAI as the leaves have similar
reflectance properties to the soil. On a light toned soil with a high bi-directional
reflectance, the relationship between near infrared bi-directional reflectance and LAI is
weaker than on a dark soil, as on a dark soil the contrast between leaves and soil is high in
near infrared wavelengths.
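Red and near infrared reflectance are commonly combined into a vegetation index
partly to reduce such soil-background effects; one standard choice (not specific to this
text) is the normalized difference vegetation index, sketched here with hypothetical
reflectances:

import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical pixels: dense canopy, sparse canopy over bright soil, bare soil
nir = np.array([0.50, 0.30, 0.25])
red = np.array([0.05, 0.15, 0.20])
print(ndvi(nir, red))  # higher values indicate denser green vegetation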

The effect of vegetation senescence

As vegetation senesces due to aging and the crop begins to ripen, the near infrared
reflectance of the leaf does not significantly decrease. However, the breakdown of the
plant pigments results in a rise in the reflectance of blue and red wavelengths. As a result
there is a positive relationship between bi-directional reflectance, at each wavelength, and
the LAI of senescent vegetation.

The effect of canopy geometry

The geometry of a vegetation canopy will determine the amount of shadow seen by
the sensor and will therefore influence the sensitivity of bi-directional reflectance
measurements to angular variation in sun and sensor. For example, the reflectance of a
rough tree canopy unlike a smoother grassland canopy is greatly dependent upon the solar
angle.

The effect of phenology

Seasonal change has an influence on canopy bi-directional reflectance. From
quantitative studies it is known that for a non-deciduous canopy (e.g. grassland) red bi-
directional reflectance is maximised in autumn and minimised in spring, and near infrared
bi-directional reflectance is maximised in the summer and minimised in the winter. These
relationships can be presented as hysteresis loops of bi-directional reflectance. Each
hysteresis plot contains the expected pattern, with minor variations for the vegetation of
the nature reserve and corn crop and major variations for the wheat and rice crop. The
wheat crop has a lower than expected red bi-directional reflectance in the summer;
probably due to high productivity and a higher than expected near infrared bi-directional
reflectance in autumn, probably as a result of senescent stubble left in the fields. Irrigation
status as well as Leaf Area Index (LAI) of the crop determines the bi-directional reflectance
of rice crop; for example, in the summer the wet soil background reduces the otherwise
high near infrared bi-directional reflectance of the crop.

Figure 1: Spectral reflectance curve of soil

Figure 2: Effect of soil moisture on soil spectral reflectance

Figure 3: Absorption of electromagnetic radiation by seawater

Figure 4: Spectral reflectance of a leaf

Figure 5: The reflectance, absorbance and transmittance properties of wheat and lettuce leaves

Figure 6: Effect of leaf moisture on spectral reflectance

IMAGE INTERPRETATION FOR MULTISPECTRAL SCANNER IMAGERY

Introduction

The application of MSS image interpretation has been demonstrated in many fields,
such as agriculture, botany, cartography, civil engineering, environmental monitoring,
forestry, geography, geophysics, land resource analysis, land use planning, oceanography,
and water resource analysis.

LANDSAT MSS Image Interpretation

As shown in Table 1, the image scale and area covered per frame are very different
for Landsat images than for conventional aerial photographs. For example, more than 1600
aerial photographs at a scale of 1:20,000 with no overlap are required to cover the area of
a single Landsat MSS image! Because of scale and resolution differences, Landsat images
should be considered as a complementary interpretive tool instead of a replacement for
low altitude aerial photographs. For example, the existence and/or significance of certain
geologic features trending of tens or hundreds of kilometers, and clearly evident on a
Landsat image, might escape notice on low altitude aerial photographs. On the other
hand, housing quality studies from aerial imagery would certainly be more effective using
low altitude aerial photographs rather than Landsat images, since individual houses cannot

be resolved on Landsat MSS images. In addition, most Landsat MSS images can only be
studied in two dimensions, whereas most aerial photographs are acquired in stereo.

Table 1 Comparison of Image Characteristics

Image Format                                                         Image Scale   Area Covered per Frame (km2)
Low altitude USDA-ASCS aerial photographs (230 x 230 mm)             1:20,000      21
High altitude NASA aerial photographs (RB-57 or ER-2, 230 x 230 mm)  1:120,000     760
Landsat scene (185 x 185 mm)                                         1:1,000,000   34,000
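As a quick check of the figure quoted above: 34,000 km2 per Landsat scene divided by
21 km2 per photograph gives about 1,619 photographs, consistent with the "more than
1600" estimate.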

Resolution

The effective resolution (in terms of the smallest adjacent ground features that can
be distinguished from each other) of Landsat MSS images is about 79 m (about 30 m on
Landsat-3 RBV images). However, linear features as narrow as a few meters, having a
reflectance that contrasts sharply with that of their surroundings, can often be seen on
Landsat images (for example, two-lane roads, concrete bridges crossing water bodies, etc.).
On the other hand, objects much larger than 79 m across may not be apparent if they have
a very low reflectance contrast with their surroundings, and features detected in one band
may not be detected in another.

Stereoscopic ability

As a line scanning system, the Landsat MSS produces images having one
dimensional relief displacement. Because there is displacement only in the scan direction
and not in the flight track direction, Landsat images can be viewed in stereo only in areas of
side lap on adjacent orbit passes. This side lap varies from about 85 percent near the poles
to about 14 percent at the equator. Consequently, only a limited area of the globe may be
viewed in stereo. Also, the vertical exaggeration when viewing MSS images in stereo is
quite small compared to conventional air photos. This stems from the extreme platform
altitude (900 km) of the satellite compared to the base distance between images. Whereas
stereo airphotos may have a 4X vertical exaggeration, stereo Landsat vertical exaggeration
ranges from about 1.3X at the equator to less than 0.4X at latitudes above about 70°.
Subtle as this stereo effect is, geologists in particular have found stereoviewing in Landsat
overlap areas quite valuable in studying topographic expression. However, most
interpretations of Landsat imagery are made monoscopically, either because sidelapping
imagery does not exist or because the relief displacement needed for stereoviewing is so
small. In fact, because of the high altitude and narrow field of view of the MSS, images
from the scanner contain little or no relief displacement in nonmountainous areas. When
such images are properly processed, they can be used as planimetric maps at scales as

large as 1:250,000. Recently these difficulties have been overcome in the panchromatic
imagery of SPOT and IRS-1C.

Individual Band Interpretation

The most appropriate band or combination of bands of MSS imagery should be
selected for each interpretive use. Bands 4 (green) and 5 (red) are usually best for detecting
cultural features such as urban areas, roads, new subdivisions, gravel pits, and quarries. In
such areas, band 5 is generally preferable because the better atmospheric penetration of
red wavelengths provides a higher contrast image. In areas of deep, clear water, greater
water penetration is achieved in band 4. Bands 4 and 5 are excellent for showing silty
water flowing into clear water. Bands 6 and 7 (near infrared) are best for delineating water
bodies. Since energy of near-infrared wavelengths penetrates only a short distance into
water, where it is absorbed with very little reflection, surface water features have a very
dark tone in bands 6 and 7. Wetlands with standing water or wet organic soil where little
vegetation has yet emerged also have a dark tone in bands 6 and 7, as do asphalt-surfaced
pavements and wet bare soil areas. Both bands 5 and 7 are valuable in geologic studies,
the largest single use of Landsat MSS data.

In the comparative appearance of the four Landsat MSS bands, the extent of the
urban areas is best seen in bands 4 and 5 (light toned). The major roads are best seen in
band 5 (light toned), clearly visible in band 4, undetectable in band 6, and slightly visible in
band 7 (dark toned). An airport concrete runway and taxiway are clearly visible. The
concrete pavement is clearly visible in bands 4 and 5 (light toned), very faint in band 6
(light toned), and undetectable in band 7. The asphalt pavement is very faint in bands 4
and 5 (light toned), reasonably clear in band 6 (dark toned), and best seen in band 7 (dark
toned). The major lakes and connecting river are best seen in bands 6 and 7 (dark toned).
These lakes have a natural green colour in mid-July resulting from the presence of algae in
the water. In the band 4 image, all lakes have a tone similar to the surrounding agricultural
land, which consists principally of green-leafed crops such as corn. The lakes mostly
surrounded by urban development have shorelines that can be reasonably well detected,
whereas the lakes principally surrounded by agricultural land have shorelines that are
often indistinct. The shorelines are more distinct in band 5, but still somewhat difficult to
delineate. The surface water of major lakes and the connecting river is clearly seen in both
bands 6 and 7 (dark toned). The agricultural areas have a rectangular field pattern with
different tones representing different crops. This is best seen in bands 5, 6 and 7. For
purposes of crop identification and mapping from MSS images, the most effective
procedure is to view two or more bands simultaneously in an additive colour viewer or to
interpret color composite images. Small forested areas appear dark-toned in bands 4 and
5. In regions receiving a winter snowfall, forested areas can best be mapped using
wintertime images where the ground is snow covered. On such images, the forested and
shrub land areas will appear dark toned against a background of light-toned snow.

Temporal data

Each Landsat satellite passes over the same area on the earth's surface during
daylight hours about 20 times per year. The actual number of times per year a given
ground area is imaged depends on amount of cloud cover, sun angle, and whether or not
the satellite is in operation on any specific pass. This provides the opportunity for many
areas to have Landsat images available for several dates per year. Because the appearance
of the ground in many areas with climatic change is dramatically different in different
seasons, the image interpretation process is often improved by utilizing images from two
or more dates.

Consider band 5 images of an area in the northern hemisphere acquired in September
and December: in the December image the ground is snow covered (about 200 mm deep)
and all water bodies are frozen, except for a small stretch of the river.
area can be better appreciated by viewing the December image, due in part to the low
solar elevation angle in winter that accentuates subtle relief. The snow-covered upland
areas and valley floors have a very light tone, whereas the steep, tree-covered valley sides
have a darker tone. The identification of urban, agricultural, and water areas can better be
accomplished using the September image. The identification of forested areas can be
more positively done using the December image.

Synoptic view

The synoptic view afforded by space platforms can be particularly useful for
observing short-lived phenomena. However, the use of Landsat images to capture such
ephemeral events as floods, forest fires, and volcanic activity is, to some degree, a hit-or-
miss proposition. If a satellite passes over such an event on a clear day when the imaging
system is in operation, excellent images of such events can be obtained. On the other
hand, such events can easily be missed if there are no images obtained within the duration
of the event or, as is often true during floods, extensive cloud cover obscures the earth's
surface. However, some of these events do leave lingering traces. For example, soil is
typically wet in a flooded area for at least several days after the flood waters have receded,
and this condition may be imaged even if the flood waters are not there. Also, the area
burned by a forest fire will have a dark image tone for a considerable period of time after
the actual fire has ceased.

In the red band image, the vast quantities of silt flowing from the river into the
delta can be clearly seen. However, it is difficult to delineate the boundary between land
and water in the delta area. In the near-infrared band image, the silt-laden water cannot be
distinguished from the clear water because of the lack of water penetration of near-
infrared wavelengths. However, the delineation of the boundary between land and water
is much clearer than in red band.

The black tone of the burned area contrasts sharply with the lighter tones of the
surrounding unburned forest area.

Another example is tropical deforestation in response to intense population pressures,
where an extensive area of forest land is being cleared for transmigration site
development. The dark toned area shows forested land. The light-toned "fingers" cutting
into the forested land are areas being actively cleared. The indistinct lighter toned plumes
near the cleared areas are smoke plumes from burning debris.

False Colour Composite (FCC)

Bands 4, 5 and 7 are combined, displayed in blue, green and red respectively, to
produce the colour composite image. Spectral characteristics and color signatures of
Landsat MSS color images are comparable to those of IR color aerial photographs. Typical
signatures are as follows:

Healthy vegetation        Red
Clear water               Dark blue to black
Silty water               Light blue
Red beds                  Yellow
Bare soil, fallow fields  Blue
Windblown sand            White to yellow
Cities                    Blue
Clouds and snow           White
Shadows                   Black
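Assembling such a composite digitally amounts to stacking the NIR, red and green bands
into the red, green and blue display channels. A minimal sketch with hypothetical band
arrays scaled 0-1:

import numpy as np

def false_colour_composite(nir, red, green):
    """Stack NIR->R, red->G, green->B into an RGB array (values in 0..1)."""
    rgb = np.dstack([nir, red, green]).astype(float)
    return np.clip(rgb, 0.0, 1.0)

# Hypothetical 2x2 band arrays; vegetated pixels (high NIR) render red
nir   = np.array([[0.60, 0.10], [0.50, 0.05]])
red   = np.array([[0.10, 0.05], [0.10, 0.02]])
green = np.array([[0.10, 0.06], [0.12, 0.03]])
print(false_colour_composite(nir, red, green).shape)  # (2, 2, 3)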

Land-Use and Land-Cover Interpretation on FCC

Urban areas have a grid pattern of major traffic arteries. Central commercial areas
have blue signatures caused by pavement, roofs, and an absence of vegetation. The
suburbs are pink to red, depending on density and condition of lawns, trees, and other
landscape vegetation. Small, bright red areas are parks, golf courses, cemeteries, and
other concentrations of vegetation.

Agricultural vegetation has a rectangular bright red (growing crops) and blue-
grey (fallow fields) pattern. Red circles are formed by alfalfa fields irrigated by
center-pivot irrigation sprinklers.

Rangeland has a red-brown signature in the fall season image. Forest and brush,
which cover mountainous terrain and the Transverse Ranges (lower elevations covered by
chaparral and higher elevations by pine trees), are also red-brown.

Water is represented by the ocean and scattered reservoirs. The dark blue color is
typical of the ocean much of the year, but during the winter rainy season, muddy water
from various rivers forms light-colored plumes.

The desert has a light yellow signature that is typical of arid land. In the valley there
are several light gray to very dark gray triangles, which are alluvial fans of gravel eroded
from the bedrock of the Transverse Ranges. Dry lakes have white signatures caused by silt
and clay deposits.

Major geologic features are also recognizable in the Landsat image. The fault,
which separates the valley from the Transverse Ranges, is expressed as linear scarps and
canyons.

Return-Beam Vidicon System

Return-beam vidicons (RBV) are framing systems that are essentially television
cameras. Landsat 1 and 2 carried three RBVs that recorded green, red and photographic IR
images of the same area on the ground. These images can be projected in blue, green, and
red to produce infrared color images comparable to MSS images. There were problems
with the color RBV system, and the images were inferior to MSS images; for these reasons,
only a few color RBV images were acquired. Landsat 3 deployed an extensively modified
version of RBV.

Typical RBV Images

In typical RBV images, the array of small crosses, called reseau marks, is used for
geometric control. The 1:1,000,000 scale is the same as that of the MSS image to which
these RBV frames may be compared. This comparison illustrates the advantages of the
higher spatial resolution of RBV. For example, in the urban area the grid of secondary
streets is recognizable on the RBV image but not on the MSS.

Landsat 3 collected RBV images of many areas around the world. Where RBV and
MSS images are available, it is useful to obtain both data sets in order to have the
advantages of higher spatial resolution (from RBV) plus IR color spectral information (from
MSS).

LANDSAT TM Image Interpretation

Landsat TM images are useful for image interpretation for a much wider range of
applications than Landsat MSS images. This is because the TM has both an increase in the
number of spectral bands and an improvement in spatial resolution as compared with the
MSS. The MSS images are most useful for large area analyses, such as geologic mapping.
More specific mapping, such as detailed land cover mapping, is difficult on MSS images
because so many pixels of the original data are "mixed pixels," pixels containing more than

one cover type. With the decreased IFOV of the TM data, the area containing mixed pixels
is smaller and interpretation accuracies are increased. The TM's improved spectral and
radiometric resolution also aid image interpretation. In particular, the incorporation of the
mid-IR bands (bands 5 and 7) has greatly increased the vegetation discrimination of TM
data.

There is a dramatic improvement in resolution from the MSS's ground resolution cell of
79 x 79 m to the TM's ground resolution cell of 30 x 30 m. Many indistinct light-toned
patches on the MSS image can be clearly seen as recent suburban development on the TM
image. Also, features such as agricultural field patterns that are indistinct on the MSS
image can be clearly seen on the TM image.

TM has more narrowly defined wavelength ranges for the bands roughly
comparable to MSS bands 1 to 4, and has added bands in four wavelength ranges not
covered by the MSS bands.
Table 2 Thematic Mapper spectral bands

Band  Wavelength (µm)  Characteristics
1     0.45 to 0.52     Blue-green; no MSS equivalent. Maximum penetration of water, which is
                       useful for bathymetric mapping in shallow water. Useful for distinguishing
                       soil from vegetation and deciduous from coniferous plants.
2     0.52 to 0.60     Green; coincident with MSS band 4. Matches the green reflectance peak of
                       vegetation, which is useful for assessing plant vigor.
3     0.63 to 0.69     Red; coincident with MSS band 5. Matches a chlorophyll absorption band
                       that is important for discriminating vegetation type.
4     0.76 to 0.90     Reflected IR; coincident with portions of MSS bands 6 and 7. Useful for
                       determining biomass content and for mapping shorelines.
5     1.55 to 1.75     Reflected IR. Indicates moisture content of soil and vegetation. Penetrates
                       thin clouds. Good contrast between vegetation types.
6     10.40 to 12.50   Thermal IR. Nighttime images are useful for thermal mapping and for
                       estimating soil moisture.
7     2.08 to 2.35     Reflected IR. Coincides with an absorption band caused by hydroxyl ions in
                       minerals. Ratios of bands 5 and 7 are potentially useful for mapping
                       hydrothermally altered rock associated with mineral deposits.

The thermal band senses energy emitted from objects at ambient earth temperatures
within the 8 to 14 µm wavelength range. When objects are extremely hot, such as flowing
lava, emitted energy can be sensed at wavelengths shorter than the thermal infrared
wavelengths (3 to 14 µm). Forest fires are another example of an extremely hot
phenomenon that can be sensed at wavelengths shorter than thermal infrared.
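The shift of emitted energy toward shorter wavelengths for hotter objects follows Wien's
displacement law, λ_max = b/T with b ≈ 2898 µm·K, which a few lines of Python confirm:

WIEN_B = 2898.0  # Wien's displacement constant, approximately, in um*K

def peak_wavelength_um(temperature_k):
    """Wavelength of peak blackbody emission (um) at temperature_k (K)."""
    return WIEN_B / temperature_k

print(peak_wavelength_um(300.0))   # ~9.7 um: ambient earth surface
print(peak_wavelength_um(1400.0))  # ~2.1 um: flowing lava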

Image Mapping

Thematic Mapper data have been used extensively to prepare image maps over a
range of mapping scales. Such maps have proven to be useful tools for resource
assessment in that they depict the terrain in actual detail, rather than in the line-and-
symbol format of conventional maps. Image maps are often used as map supplements to
augment conventional map coverage and to provide coverage of unmapped areas.

As we can see, there are several digital image processing procedures that may be
applied to the image mapping process. These include large-area digital mosaicking, image
enhancement procedures, merging of image data with conventional cartographic
information, and streamlining the map production and printing process using highly
automated cartographic systems. Extensive research continues in the area of image
mapping with Landsat, SPOT and IRS data; the latter two systems deploy pushbroom
scanners. Stereo coverage with the desired base-to-height (B/H) ratio is also possible.
Resolution has improved to 20 m (multispectral) and 10 m (panchromatic) in SPOT, and
23.5 m and 5.8 m in IRS-1C.

SPOT HRV & IRS Image Interpretation

The use of SPOT data for various interpretive purposes is facilitated by the system's
combination of multispectral sensing with excellent spatial resolution, geometric fidelity,
and the provision for multidate and stereo imaging.

Merging Data

An increase in the apparent resolution of SPOT and IRS multispectral images can be
achieved through the merger of multispectral and panchromatic data: a 20 m resolution
multispectral image of an agricultural area can be merged with 10 m panchromatic data
in the case of SPOT, and 23.5 m multispectral with 5.8 m panchromatic data in the case
of IRS-1C. The merged image maintains the colors of the multispectral image but has a
resolution equivalent to that of the panchromatic image. Both the spatial and spectral
resolution of the merged image approach that seen in small scale, high altitude, color
infrared aerial photographs.
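One simple way to perform such a merge is a Brovey-style transform, which rescales each
resampled multispectral band by the ratio of the panchromatic band to the band sum. This
is only a sketch of one common technique, not necessarily the procedure used by any
particular ground segment:

import numpy as np

def brovey_merge(ms_bands, pan, eps=1e-6):
    """Brovey-style merge. ms_bands is (n_bands, H, W), already resampled
    to the panchromatic grid; pan is (H, W). Returns sharpened bands."""
    ms = np.asarray(ms_bands, dtype=float)
    intensity = ms.sum(axis=0) + eps
    return ms * (np.asarray(pan, dtype=float) / intensity)

ms = np.random.rand(3, 2, 2)   # hypothetical resampled multispectral bands
pan = np.random.rand(2, 2)     # hypothetical panchromatic band
print(brovey_merge(ms, pan).shape)  # (3, 2, 2)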

Using the parallax resulting when SPOT & IRS-1C data are acquired from two
different orbit tracks, perspective views of a scene can be calculated and displayed.

Perspective views can also be produced by processing data from a single image with digital
elevation data of the same scene.

Analysis of MSS Images

MSS images are interpreted in much the same manner as small-scale photographs
or images and photographs acquired from manned satellites. However, there are some
differences and potential advantages of MSS images. Linear features caused by
topography may be enhanced or suppressed on MSS images depending on orientation of
the features relative to sun azimuth. Linear features trending normal, or at a high angle, to
the sun azimuth are enhanced by shadows and highlights. Those trending parallel with the
azimuth are suppressed and difficult to recognize, as are linear features parallel with the
MSS scan lines.

Scratches and other film defects may be mistaken for natural features, but these
defects are identified by determining whether the questionable features appear on more
than a single band of imagery. Shadows of aircraft contrails may be mistaken for tonal
linear features but are recognized by checking for the parallel white image of the contrail.
Many questionable features are explained by examining several images acquired at
different dates. With experience, an interpreter learns to recognize linear features of
cultural origin, such as roads and field boundaries.

The recommended interpretation procedure in geology is to plot lineaments as
dotted lines on the interpretation map. Field checking and reference to existing maps will
identify some lineaments as faults; for these the dots are connected by solid lines on the
interpretation map. The remaining dotted lines may represent (1) previously unrecognized
faults, (2) zones of fracturing with no displacement, or (3) lineaments unrelated to geologic
structure.

The repeated coverage of Landsat enables interpreters to select images from the
optimum season for their purpose. Winter images provide minimum sun elevations and
maximum enhancement of suitably oriented topographic features; such features are
commonly enhanced on images of snow-covered terrain because the snow eliminates or
suppresses tonal differences and minor terrain features, such as small lakes. Areas with
wet and dry seasonal climates should be interpreted from images acquired in the different
seasons. Cloud-free rainy-season images are best for most applications, but this selection
may not apply everywhere.

The significance of colors on Landsat IR color images was described earlier in the section
on MSS images. For special interpretation objectives, black-and-white images of individual
bands are useful. Tables 2 and 4 give some specific applications of TM and IRS bands.

Points to remember

1. Cloud-free MSS images are available for most of the world with no political or security
restrictions.
2. The low to intermediate sun angle enhances many subtle geologic features.
3. Long-term repetitive coverage provides images at different seasons and illumination
conditions.
4. The images are low in cost.
5. IR color composites are available for many of the scenes. With suitable equipment,
color composites may be made for any image.
6. Synoptic coverage of each scene under uniform illumination aids recognition of major
features. Mosaics extend this coverage.
7. There is negligible image distortion.
8. Images are available in a digital format suitable for computer processing.
9. Limited stereo coverage is available, except for SPOT and IRS-1C.
10. TM provides images with improved spatial resolution, extended spectral range, and
additional spectral bands.

In addition to the applications shown in this chapter, Landsat images are valuable
for resource exploration, environmental monitoring, land-use analysis, and evaluating
natural hazards.

Another major contribution of Landsat is the impetus it has given to digital image
processing. The availability of low-cost multispectral image data in digital form has
encouraged the application and development of computer methods for image processing,
which are increasing the usefulness of the data for interpreters in many disciplines.

Since the first launch in 1972, Landsat has evolved from an experiment into an
operational system. There have been steady improvements in the quality and utility of the
image data. Many users throughout the world now rely on Landsat, SPOT and IRS images
as routinely as they do on weather and communication satellites. It is essential that all
remote sensing programs continue to provide images.

References:
1. Campbell, John B., 1996. Introduction to Remote Sensing. Taylor & Francis.
2. Curran, P.J., 1985. Principles of Remote Sensing. Longman Group Limited, London. 282 pp.
3. Sabins, Floyd F. Remote Sensing: Principles and Image Interpretation.
4. Lillesand, Thomas M. & Kiefer, Ralph W., 1993. Remote Sensing and Image Interpretation, Third Edition. John Wiley & Sons.
5. Avery, Thomas Eugene. Interpretation of Aerial Photographs.
6. http://www.ccrs.nrcan.gc.ca/ccrs/learn/tutorials/

Scope of Remote Sensing:
A basic idea of remote sensing is to extend human
vision to capture information that is beyond reach.

The human brain fails to remember every minute
detail of the daily happenings and events around us.
Remote sensing can enhance our memory by serving
as a permanent record.

With remote sensing images, we can measure and
map the spatial dimensions of objects.

Furthermore, we use remotely sensed data to monitor
the dynamics of phenomena on the Earth's surface.
These include monitoring the vigor and stress of
vegetation and environmental quality, measuring the
temperature of various objects, detecting and
identifying catastrophic sites caused by fire, flood,
volcano, earthquakes etc., estimating the mass of
various components, such as biogeochemical
constituents of a forest, volume of fish schools in
water, crop production of agricultural systems, water
storage and runoff of watersheds, population in rural
and urbanized areas, and quantity and living
conditions of wildlife species.
Remote sensing is one of a suite of tools available to
land managers that provides up-to-date, detailed
information about land condition.

Remote sensing uses instruments mounted on
satellites or in planes to produce images or 'scenes'
of the Earth's surface.

Satellite-retrieved images can be used in many
applications, in both land and ocean resources
inventorying, mapping, monitoring and management.

The uniqueness of satellite remote sensing lies in its
ability to show large land areas and to detect features
at electromagnetic wavelengths which are not visible
to the human eye.

Data from satellite images can show larger areas
than aerial survey data and, as a satellite regularly
passes over the same plot of land capturing new data
each time, changes in land use and condition can
be routinely monitored.

In the Land Monitor project, satellite images are
being used to provide information on land condition
and the changes in that condition through time,
specifically salinity and the status of remnant
vegetation, to help farmers, environmental managers
and planners better manage the land.

One of the outcomes of the Land Monitor project
will be an archive of satellite images of the south-
west agricultural region.

To get additional information about land condition,
the satellite images are combined with other data
such as air photos, digital elevation models (DEMs)
and ground data.

Farmers, landcare workers and field officers, with
their detailed knowledge of the vegetation and soils
in their own paddocks or regions, can extract
information on productivity from simple displays of
the satellite images.

The information from remotely sensed images can be
used in a number of ways for a number of purposes.

It is usually combined with information from other
data sources (ancillary information), and with
information from on-the-ground observations (in situ
measurements), called 'ground truth', to get a more
complete picture of what is happening and to check
suspected features or changes.

Remote Sensing System:
Principally, remote sensing is used for
inventorying, mapping and monitoring of Earth
resources and the health of the environment.

Earth Observation Satellites continuously observe
the Earth from space, and the acquired data are
provided as satellite images and are used to study
environmental problems, to monitor disasters and
to explore resources.

The information so generated is extremely
important for resource managers and policy
makers.

The major elements of such a system, described by
Lillesand and Kiefer (1979), are:
(a) Data acquisition and
(b) Data processing and analysis
(see Figure 1).
•Data acquisition process involves the following:

•A source of energy
The first requirement for remote sensing is to have
an energy source which illuminates or provides
electromagnetic energy to the target of interest.

Energy in the form of electromagnetic radiation:
solar radiation and terrestrial radiation that
carries information about the target of interest.

•Propagation of radiation through the Earth's atmosphere
As the energy travels from its source to the target,
it will come in contact with and interact with the
atmosphere it passes through. This interaction
may take place a second time as the energy travels
from the target to the sensor.

•Interaction of radiation with matter
Once the energy makes its way to the target
through the atmosphere, it interacts with the target
depending on the properties of both the target and
the radiation.

•Sensors
After the energy has been scattered by, or emitted
from the target, we require a sensor (remote: not
in contact with the target) to collect and record the
electromagnetic radiation - active or passive.

•Platforms - airborne or spaceborne
Recording of sensor signals either in pictorial
form or electronically in numerical form on
magnetic tapes.


•Transmission of data to ground-based stations


The energy recorded by the sensor has to be
transmitted, often in electronic form, to a
receiving and processing station where the data
are processed into an image (hardcopy and/or
digital).

•Data processing and analysis involves:

•Conversion of electronic data into pictorial form or computer compatible tapes.
•Acquisition of ground truth data comprising ancillary
information and spectral signature to serve as reference data
•Interpretation of the data using interpretation devices, aids or
computers, making use of the reference data to extract the
information about the target which was illuminated.
•Ground checks or evaluation of the data.
•Generating information products for the users in the form of
maps, tables, pictures and reports to aid them in their decision-
making process in the management of various Earth resources.
Examples are land use maps, crop inventories, etc.
Remote Sensing

By:

Tailor Ravin M.
Assistant Prof, CED

Contents:
• Overview of Remote Sensing
• Electromagnetic Energy, Photons, and the
Spectrum
• Visible Wavelengths
• Infrared Sensing

OVERVIEW OF REMOTE SENSING

• We perceive our surrounding world through our five senses
• Sight and hearing do not require close contact between sensors and externals
• Thus, our eyes and ears are remote sensors
• We perform remote sensing essentially all of the time
(Virtual Science Centre)

OVERVIEW OF REMOTE SENSING

Remote Sensing from Afar

• Remote sensing implies that a sensor is not in direct contact with the objects or events being observed
• Information needs a carrier
– Electromagnetic radiation is normally used as the information carrier
• The output of a remote sensing system is usually an image representing the observed scene
(Virtual Science Centre)

OVERVIEW OF REMOTE SENSING

Remote Sensing Platforms of the Earth


• Airborne platforms:
– Aircraft
– Balloons
• Spaceborne platforms:
– Satellites
– The Space Shuttle

(Virtual Science Centre)



What is remote sensing?


The International Society for Photogrammetry and
Remote Sensing (ISPRS) defined Remote Sensing
(RS) as:
“The art, science, and technology of obtaining reliable
information about physical objects and the
environment, through the process of recording,
measuring, and interpreting imagery and digital
representation of energy patterns derived from non-contact sensor
systems". This definition considers photogrammetry as a sub-field of remote sensing:
– via cameras recording on film, which may then be scanned (aerial photos)
– via sensors, which directly output digital data (satellite imagery)

Key Milestones in Remote Sensing
1826 – Joseph Niepce takes first photograph
1858 – Gaspard Tournachon takes first aerial photograph from
a balloon
1913 – First aerial photograph collected from an airplane
1935 – Radar invented
1942 – Kodak patents color infrared film
1950s – First airborne thermal scanner
1957 – First high resolution synthetic aperture radar
1962 – Corona satellite series (camera systems) initiated by the
Intelligence community
1962 – First airborne multispectral scanner
1972 – ERTS-1 Launched – First Landsat satellite

Early photograph by J. Niepce, circa 1830

Nadar in his balloon

Nadar photograph of Paris

Balloon photo of Boston, 1860

Thaddeus Lowe's Civil War Balloons
U.S. Army of the Potomac, 1861-1865
Massachusetts man, professor and visionary; Lowe Observatory, Calif.

Platform: Balloon
Sensor: Telescope
Data System: Telegraph

Thaddeus Lowe, circa 1861-1865: remote sensing for
military purposes. Then, as now, the newest
developments are always in the military sphere.

Remote sensing early in the airplane era
U-2 Spy Plane 1954-1960
Flew at 70,000’ over USSR air defenses


SR-71 Blackbird super-sonic spy plane

CIA's Corona Program
1960-1972, >100 missions; followed after the U-2s
Started: August 1960

Platform: Spacecraft
Sensor: Camera
Data System: Film drop
Spatial Resolution: early missions @ 13 m, later missions @ 2 m
Spectral Resolution: visible and visible-near infrared (both film)
Coverage: 7.6 Bil

CIA's Corona Program: Washington Monument, 1967

Ikonos 1 m panchromatic imagery, 2000

MODIS Land Reflectance and Sea Surface Temperature
Remote Sensing Organizations
• ISPRS- International Society for Photogrammetry and Remote
Sensing
• IGARSS- International Geosciences And Remote Sensing
Symposium
• NASA -National Aeronautic and Space Administration (USA)
• ESA- European Space Agency (Europe)
• NASDA- National Space Development Agency (Japan)
• CNES- Centre National d'Etudes Spatiales (France )
• DARA- German Space Agency
• CSA - Canadian Space Agency
• NRSA- National Remote Sensing Agency of India

Remote Sensing
• Advantages
• Disadvantages


OVERVIEW OF REMOTE SENSING

Remote Sensing from Space


• Information pertains to all areas of interest, such as:
– Land
– Oceans
– Atmosphere
• Some practical applications are:
– Weather observing
– Mapping and cataloging
– Early warning
– Media coverage
– Extensions of astronomical capabilities, such as:
• Earthbound telescopes
• Spacecraft carrying visible light sensors
• Addition of radio wave, infrared, ultraviolet, x-ray, and gamma ray sensors

Energy Interactions with Earth Surface Features
• Solar radiation is
electromagnetic
energy reflected or
scattered from the
Earth
• Different materials
(water, soil, etc.)
reflect energy in
different ways
– Each material has its own spectral reflectance signature
(Virtual Science Centre)

Electromagnetic Energy
• Electromagnetic energy can be thought of as either waves or particles, known as photons.
• This energy propagates through space in the form of periodic or sinusoidal
disturbances of electric and magnetic fields, travelling at the speed of light
– In free space this is 299,792,458 meters/second (exact)
• The waves are characterized by frequency and wavelength, related by:

c = νλ

where
– c = speed of light
– ν = frequency
– λ = wavelength, usually in μm (10^-6 meters) or in nm (10^-9 meters)

(Wave Nature of Light)
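A quick numeric check of c = νλ (the band values below are chosen for illustration):

C = 299_792_458.0  # speed of light in free space, m/s

def frequency_hz(wavelength_m):
    """Frequency from wavelength via c = nu * lambda."""
    return C / wavelength_m

print(frequency_hz(0.55e-6))  # green light, ~5.5e14 Hz
print(frequency_hz(0.23))     # 23 cm L-band microwave, ~1.3e9 Hz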

The Electromagnetic Spectrum

(The Wave Nature of Light)



Multispectral Images
• One band at a time displayed as a gray scale image
• Combination of three bands for a color composite image
• Requires knowledge of spectral reflectance for composite image interpretation
[Figure: red, green and near-IR band images]
(Virtual Science Centre)

False Color Composite

• Common false color scheme for SPOT:
R = NIR band
G = red band
B = green band

(Virtual Science Centre)


Four Views of the Crab Nebula from Different Multispectral Sensing Devices
[Figure panels: X-ray, Optical, Infrared, Radio]

Electromagnetic Energy
• A photon is quantized energy, or an energy packet
• Photons can have different discrete energy values
• The energy of a quantum is given by Planck's equation:

E = hν = hc / λ

where
– E = energy of a quantum, in Joules, J
– h = Planck's constant, 6.626 × 10^-34 J·sec
– ν = frequency, in Hertz, or cycles/sec
– λ = wavelength, in meters

• Thus photons of shorter wavelengths (λ), or higher frequency waves (ν or f),
are more energetic than those of longer wavelengths, or lower frequencies
– An x-ray photon is more energetic than a light photon
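Planck's equation is equally easy to verify numerically; the wavelengths below are
illustrative:

H = 6.626e-34      # Planck's constant, J*s
C = 299_792_458.0  # speed of light, m/s

def photon_energy_j(wavelength_m):
    """Photon energy E = h * c / lambda, in Joules."""
    return H * C / wavelength_m

print(photon_energy_j(0.55e-6))  # visible green photon, ~3.6e-19 J
print(photon_energy_j(1.0e-9))   # 1 nm x-ray photon, ~2.0e-16 J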

Electromagnetic Energy
• Radio waves through gamma rays are all electromagnetic
(EM) waves
• These waves differ only in wavelength
• Visible light is only one form of electromagnetic energy
– Ultraviolet, x-rays, and gamma rays are shorter
– Infrared, microwaves, television, and radio waves are longer.
• An object of a certain size can scatter EM wavelengths on
the order of this size or smaller, but not larger wavelengths.
– Thus long wavelengths will not identify a small object
• Long wavelength radiation can only measure distances and
objects on the order of the wavelength
– Infrared light of micrometer wavelength will resolve better than decimeter
wavelength radio waves

Visible Light Bands
• This narrow band of electromagnetic radiation
extends from about 400 nm (violet) to about 700 nm
(red).
• The various color components of the visible spectrum
fall roughly within the following wavelength regions:
– Red: 610 - 700 nm
– Orange: 590 - 610 nm
– Yellow: 570 - 590 nm
– Green: 500 - 570 nm
– Blue: 450 - 500 nm
– Indigo: 430 - 450 nm
– Violet: 400 - 430 nm
(Virtual Science Centre)
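A small lookup coded from the band limits listed above (the boundaries are approximate):

# Visible colour bands from the slide above (wavelengths in nm).
COLOURS = [("violet", 400, 430), ("indigo", 430, 450), ("blue", 450, 500),
           ("green", 500, 570), ("yellow", 570, 590), ("orange", 590, 610),
           ("red", 610, 700)]

def colour_name(wavelength_nm):
    """Map a visible wavelength (nm) to its approximate colour name."""
    for name, lo, hi in COLOURS:
        if lo <= wavelength_nm < hi:
            return name
    return "outside visible range"

print(colour_name(550))  # green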

Infrared Bands
• Infrared ranges from 0.7 to 300 µm wavelength.
• This region is further divided into the following
bands:
– Near Infrared (NIR): 0.7 to 1.5 µm.
– Short Wavelength Infrared (SWIR): 1.5 to 3 µm.
– Mid Wavelength Infrared (MWIR): 3 to 8 µm.
– Long Wavelength Infrared (LWIR): 8 to 15 µm.
– Far Infrared (FIR): longer than 15 µm.
• The NIR and SWIR bands are also known as
reflected infrared, referring to the main infrared
component of the solar radiation reflected from the
earth's surface.
• The MWIR and LWIR bands are known as thermal infrared
(Virtual Science Centre)

ELECTROMAGNETIC RADIATION AND THE
ELECTROMAGNETIC SPECTRUM
EMR is a dynamic form of energy that propagates as wave
motion at a velocity of c = 3 × 10^10 cm/sec. The parameters
that characterize a wave motion are wavelength (λ), frequency
(ν) and velocity (c) (Fig. below). The relationship between the
above is c = νλ.

Figure: Electromagnetic wave. It has two components, electric field E and
magnetic field M, both perpendicular to the direction of propagation.
Electromagnetic energy radiates in accordance with the
basic wave theory. This theory describes the EM energy as
travelling in a harmonic sinusoidal fashion at the velocity
of light. Although many characteristics of EM energy are
easily described by wave theory, another theory known as
particle theory offers insight into how electromagnetic
energy interacts with matter. It suggests that EMR is
composed of many discrete units called photons/quanta.
The energy of a photon is

Q = hc / λ = hν

where Q is the energy of a quantum and h is Planck's constant.
The Electromagnetic Spectrum
(The Wave Nature of Light)

Principal Divisions of the Electromagnetic Spectrum
Major regions of the electromagnetic spectrum
Region Name | Wavelength | Comments
Gamma Ray | < 0.03 nanometers | Entirely absorbed by the Earth's atmosphere and not available for remote sensing.
X-ray | 0.03 to 30 nanometers | Entirely absorbed by the Earth's atmosphere and not available for remote sensing.
Ultraviolet | 0.03 to 0.4 micrometers | Wavelengths from 0.03 to 0.3 micrometers absorbed by ozone in the Earth's atmosphere.
Photographic Ultraviolet | 0.3 to 0.4 micrometers | Available for remote sensing the Earth. Can be imaged with photographic film.
Visible | 0.4 to 0.7 micrometers | Available for remote sensing the Earth. Can be imaged with photographic film.
Infrared | 0.7 to 100 micrometers | Available for remote sensing the Earth. Can be imaged with photographic film.
Reflected Infrared | 0.7 to 3.0 micrometers | Available for remote sensing the Earth. Near Infrared 0.7 to 0.9 micrometers can be imaged with photographic film.
Thermal Infrared | 3.0 to 14 micrometers | Available for remote sensing the Earth. Cannot be captured with photographic film; mechanical sensors are used to image this wavelength band.
Microwave or Radar | 0.1 to 100 centimeters | Longer wavelengths of this band can pass through clouds, fog, and rain. Images using this band can be made with sensors that actively emit microwaves.
Radio | > 100 centimeters | Not normally used for remote sensing the Earth.
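A small Python lookup distilled from the table; the region edges follow the table, while the availability flags and the handling of the sub-millimeter gap between 100 µm and 0.1 cm are our own simplifications:

# (name, lower bound, upper bound in micrometers, normally available for RS?)
REGIONS_UM = [
    ("Gamma Ray",       0.0,    3e-5,  False),
    ("X-ray",           3e-5,   0.03,  False),
    ("Ultraviolet",     0.03,   0.3,   False),
    ("Photographic UV", 0.3,    0.4,   True),
    ("Visible",         0.4,    0.7,   True),
    ("Infrared",        0.7,    100.0, True),
    ("Microwave/Radar", 1000.0, 1e6,   True),   # 0.1 cm to 100 cm
    ("Radio",           1e6,    float("inf"), False),
]

def region_of(wl_um):
    for name, lo, hi, available in REGIONS_UM:
        if lo <= wl_um < hi:
            return name, available
    return "unlisted (sub-millimeter gap)", False

print(region_of(0.55))  # ('Visible', True)
print(region_of(11.0))  # ('Infrared', True)
print(region_of(3e4))   # ('Microwave/Radar', True)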
Passive vs. Active Sensing
The sun as a source of energy or radiation: the sun provides a very convenient source of energy for remote sensing. The sun's energy is either reflected, as it is for visible wavelengths, or absorbed and then re-emitted, as it is for thermal infrared wavelengths. Remote sensing systems which measure energy that is naturally available are called passive sensors. Passive sensors can only be used to detect energy when the naturally occurring energy is available. For all reflected energy, this can only take place during the time when the sun is illuminating the Earth. There is no reflected energy available from the sun at night. Energy that is naturally emitted (such as thermal infrared) can be detected day or night, as long as the amount of energy is large enough to be recorded.
Active sensors, on the other hand, provide their own energy source for illumination. The sensor emits radiation which is directed toward the target to be investigated. The radiation reflected from that target is detected and measured by the sensor. Advantages of active sensors include the ability to obtain measurements at any time, regardless of the time of day or season. Active sensors can be used for examining wavelengths that are not sufficiently provided by the sun, such as microwaves, or to better control the way a target is illuminated. However, active systems require the generation of a fairly large amount of energy to adequately illuminate targets. Some examples of active sensors are a laser fluorosensor and a synthetic aperture radar (SAR).
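The operational consequence of this passage can be captured in a toy Python check; the sensor categories and the function name are illustrative assumptions, not a real API:

# Can a sensor acquire data at night? Passive reflective sensors need solar
# illumination; passive thermal sensors detect naturally emitted energy;
# active sensors (e.g. SAR, laser fluorosensor) carry their own illumination.
def can_observe_at_night(sensor_type):
    return sensor_type in ("passive_thermal", "active")

for s in ("passive_reflective", "passive_thermal", "active"):
    print(s, "->", can_observe_at_night(s))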
Energy Interactions with Earth Systems
ELECTROMAGNETIC WAVES
► EMW is a form of energy that is reflected or emitted from objects in the form of electric and magnetic waves that can travel through space.
► The electromagnetic waves that compose electromagnetic radiation can be imagined as a self-propagating transverse oscillating wave of electric and magnetic fields. The diagram shows a plane linearly polarized EMR wave propagating from left to right: the electric field is in a vertical plane and the magnetic field in a horizontal plane. The electric and magnetic fields in EMR waves are always in phase and at 90 degrees to each other.
► It can be detected only when it interacts with matter.
► There are many forms of electromagnetic energy including gamma rays, x-rays, ultraviolet radiation, visible light, infrared radiation, microwaves and radio waves.
► Electromagnetic energy can be thought of as either waves or particles, known as photons.
► This energy propagates through space in the form of periodic or sinusoidal disturbances of electric and magnetic fields.
 In free space the propagation speed is 299,792,458 meters/second (exact)
► The waves are characterized by frequency and wavelength, related by:

c = νλ

where
 c = speed of light
 ν = frequency
 λ = wavelength, usually in µm (10^-6 meters) or in nm (10^-9 meters)
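A minimal conversion sketch in Python using c = νλ (the function names are ours):

# Convert between frequency and wavelength via c = nu * lambda.
C = 299_792_458.0  # speed of light in free space, m/s (exact)

def wavelength_from_frequency(nu_hz):
    return C / nu_hz   # meters

def frequency_from_wavelength(lam_m):
    return C / lam_m   # hertz

print(wavelength_from_frequency(100e6))    # ~3.0 m (FM radio)
print(frequency_from_wavelength(0.55e-6))  # ~5.45e14 Hz (green light)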
RADIO WAVE:
the kind of energy that radio stations emit into the air; also emitted by other things, such as stars and gases in space.
MICROWAVE:
microwaves in space are used by astronomers to learn about the structure of nearby galaxies and our own Milky Way; they also cook your popcorn in just a few minutes!
INFRARED:
our skin emits infrared light, which is why we can be seen in the dark by someone using night-vision goggles; IR light also maps the dust between stars.
VISIBLE:
this is the part that our eyes see; emitted by everything from fireflies to light bulbs to stars, and also by fast-moving particles hitting other particles.
ULTRAVIOLET:
stars and other "hot" objects in space emit UV radiation; it can cause our skin to burn.
X-RAY:
your doctor uses them to look at your bones and your dentist to look at your teeth.
GAMMA-RAY:
radioactive materials (some natural and others made by man in things like nuclear power plants) can emit gamma rays.
Interactions with the Atmosphere
► Scattering, refraction, absorption
Transmission (T) occurs when radiation passes through a target. Reflection (R) occurs when radiation "bounces" off the target and is redirected. Absorption (A) occurs when radiation (energy) is absorbed into the target.
Interactions with the Atmosphere
► Scattering
► Refraction
► Absorption
The most important source of electromagnetic radiation is the sun. Before the sun's energy reaches the surface of the earth, three fundamental interactions in the atmosphere are possible: absorption, transmission, and scattering. Particles and gases in the atmosphere can affect the incoming light and radiation.
1. SCATTERING
Scattering occurs when particles or
large gas molecules present in the
atmosphere interact with and cause the
electromagnetic radiation to be
redirected from its original path. How
much scattering takes place depends
on several factors including the
wavelength of the radiation, the
abundance of particles or gases, and
the distance the radiation travels
through the atmosphere. There are
three (3) types of scattering which take
place.
Scattering
► The redirection of EM energy by particles
suspended in the atmosphere or large molecules
of atmospheric gases
► Rayleigh scattering
► Mie scattering
► Nonselective scattering
http://ww2010.atmos.uiuc.edu/(Gh)/guides/mtr/opt/mch/sct.rxml
Rayleigh scattering occurs when particles are very small
compared to the wavelength of the radiation. These could be
particles such as small specks of dust or nitrogen and oxygen
molecules. Rayleigh scattering causes shorter wavelengths of
energy to be scattered much more than longer wavelengths.
Rayleigh scattering is the dominant scattering mechanism in the
upper atmosphere. The fact that the sky appears "blue" during
the day is because of this phenomenon. As sunlight passes
through the atmosphere, the shorter wavelengths (i.e. blue) of
the visible spectrum are scattered more than the other (longer)
visible wavelengths. At sunrise and sunset the light has to
travel farther through the atmosphere than at midday and the
scattering of the shorter wavelengths is more complete; this
leaves a greater proportion of the longer wavelengths to
penetrate the atmosphere.
Rayleigh Scattering
► It occurs when atmospheric particles' diameters are much smaller than the wavelength of the radiation, d << λ
► It is common high in the atmosphere
► Radiation with shorter wavelengths is more easily scattered
► Black vs. blue vs. red skies
http://www-phys.llnl.gov/Research/scattering/RTAB.html
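The slides give the qualitative trend; quantitatively, Rayleigh scattering intensity varies as λ^-4 (the exponent is a standard result we add here, not stated on the slide). A one-line check in Python:

# Relative Rayleigh scattering of blue (450 nm) vs. red (650 nm) light,
# assuming the standard lambda^-4 dependence.
ratio = (650.0 / 450.0) ** 4
print(f"Blue is scattered about {ratio:.1f}x more strongly than red")  # ~4.3x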
Mie scattering
This occurs when the particles are just about the same
size as the wavelength of the radiation. Dust, pollen,
smoke and water vapour are common causes of Mie
scattering which tends to affect longer wavelengths than
those affected by Rayleigh scattering. Mie scattering
occurs mostly in the lower portions of the atmosphere
where larger particles are more abundant, and dominates
when cloud conditions are overcast.
Mie Scattering
► Particles' diameters are equivalent to the wavelength, d ≈ λ
► It is common in the lower atmosphere
► It is wavelength dependent
Non Selective scattering
This occurs when the particles are much larger than
the wavelength of the radiation. Water droplets and
large dust particles can cause this type of scattering.
Nonselective scattering gets its name from the fact
that all wavelengths are scattered about equally. This
type of scattering causes fog and clouds to appear
white to our eyes because blue, green, and red light
are all scattered in approximately equal quantities
(blue + green + red light = white light).
Nonselective Scattering
► Particles are much larger than the wavelength, d >> λ
► All wavelengths are scattered equally
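The three regimes can be summarized in a small Python classifier; the 0.1 and 10 cutoffs on d/λ are illustrative order-of-magnitude thresholds, not values from the slides:

# Classify the scattering regime from particle diameter d and wavelength lam
# (same units, e.g. micrometers).
def scattering_regime(d, lam):
    ratio = d / lam
    if ratio < 0.1:
        return "Rayleigh (d << lambda)"
    if ratio <= 10.0:
        return "Mie (d ~ lambda)"
    return "Nonselective (d >> lambda)"

print(scattering_regime(0.0003, 0.5))  # gas molecules vs. visible light -> Rayleigh
print(scattering_regime(0.5, 0.5))     # smoke, dust -> Mie
print(scattering_regime(20.0, 0.5))    # water droplets -> Nonselective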
Effects of scattering
► It causes haze in remotely sensed images
► It decreases the spatial detail on the images
► It also decreases the contrast of the images
Refraction
► The bending of light rays at the contact between two media that transmit light but have different densities; when light enters the denser medium, it is refracted toward the surface normal
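The slide states the direction of bending; Snell's law (a standard relation, not given on the slide) quantifies it. A hedged Python sketch:

import math

# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
def refraction_angle_deg(n1, n2, incidence_deg):
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(s))

# Entering the denser medium, the ray bends toward the surface normal:
print(refraction_angle_deg(1.00, 1.33, 30.0))  # ~22.1 degrees (air -> water)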
Absorption
► The atmosphere prevents, or strongly attenuates, the transmission of radiation
► Three gases dominate:
- Ozone (O3): absorbs ultraviolet radiation high in the atmosphere
- Carbon dioxide (CO2): absorbs mid and far infrared (13-17.5 µm) in the lower atmosphere
- Water vapor (H2O): absorbs mid-far infrared (5.5-7.0 µm, >27 µm) in the lower atmosphere
Atmospheric Windows
► Those wavelengths that are relatively easily transmitted through the atmosphere
http://www.crisp.nus.edu.sg/~research/tutorial/atmoseff.htm#windows
Atmospheric Windows
► The windows:
UV & visible: 0.30-0.75 µm
Near infrared: 0.77-0.91 µm
Mid infrared: 1.55-1.75 µm, 2.05-2.4 µm
Far infrared: 3.50-4.10 µm, 8.00-9.20 µm, 10.2-12.4 µm
Microwave: 7.50-11.5 mm, 20.0+ mm
► The atmospheric windows are important for RS sensor design
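A small Python check of whether a sensor band sits inside one of the windows listed above (µm windows only; the microwave windows in mm are omitted for brevity, and the all-or-nothing containment test is our simplification):

# Atmospheric windows in micrometers, as listed above.
WINDOWS_UM = [(0.30, 0.75), (0.77, 0.91), (1.55, 1.75), (2.05, 2.40),
              (3.50, 4.10), (8.00, 9.20), (10.2, 12.4)]

def band_in_window(band_lo_um, band_hi_um):
    # True if the whole band fits inside any single window.
    return any(lo <= band_lo_um and band_hi_um <= hi for lo, hi in WINDOWS_UM)

print(band_in_window(0.45, 0.52))  # visible blue band -> True
print(band_in_window(5.5, 7.0))    # water-vapor absorption region -> False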
Interaction with Features
Reflection, absorption, and transmission
Interactions with Surface
► All EM energy reaching the earth's surface must be reflected, absorbed, or transmitted
► Each is represented by a rate (%)
► Their rates depend on: type of feature, wavelength, and angle of illumination
Figure: Reflection, Absorption, Transmission
Reflection
► Light ray is redirected as it strikes a nontransparent surface
► Spectral reflectance: r(λ) = E_R(λ) / E_I(λ)
= (energy of wavelength λ reflected from the object) / (energy of wavelength λ incident upon the object)
Reflection
► Specular reflection: when the surface is smooth relative to the wavelength, incident radiation is reflected in a single direction
► incidence angle = reflection angle
► Diffuse (isotropic) reflection: when the surface is rough relative to the wavelength, energy is scattered equally in all directions
► Lambertian surface
Transmission
► Radiation passes through a substance without significant attenuation
► Transmittance (t):
t = transmitted radiation / incident radiation
Absorption
► Absorptance (a):
a = absorbed radiation / incident radiation
Interactions
All features at the earth's surface interact with EM energy in all three ways, but in different proportions:

Reflection + Transmission + Absorption = 100%
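A trivial numeric check of this energy balance in Python (the percentages are made-up illustrative values):

# Reflected, transmitted and absorbed fractions must sum to the incident total.
incident = 100.0                     # incident energy, arbitrary units
reflected, transmitted = 35.0, 15.0  # illustrative values
absorbed = incident - reflected - transmitted

shares = [x / incident for x in (reflected, transmitted, absorbed)]
assert abs(sum(shares) - 1.0) < 1e-9
print(shares)  # [0.35, 0.15, 0.5] -> 35% + 15% + 50% = 100%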
Emission
http://www.crisp.nus.edu.sg/~research/tutorial/infrared.htm
Spectral Characteristics of Features
http://www.crisp.nus.edu.sg/~research/tutorial/infrared.htm
Spectral Reflectance Curve
Vegetation
► Chlorophyll absorbs blue and red, reflects green
► Vegetation has high reflection and transmission in the NIR wavelength range
► In the MIR range, reflectance is controlled by the water absorption bands
From http://rst.gsfc.nasa.gov/Intro/nicktutor_I-3.html
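One standard way to exploit this red-absorption / NIR-reflection contrast is a ratio index such as NDVI; the index is not mentioned on the slides and is added here purely as an illustration, with made-up reflectance values:

# NDVI = (NIR - Red) / (NIR + Red). Healthy vegetation absorbs red and
# reflects NIR strongly, so NDVI approaches +1.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(ndvi(nir=0.50, red=0.05))  # dense vegetation -> ~0.82
print(ndvi(nir=0.20, red=0.15))  # bare soil -> ~0.14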
Vegetation
► The palisade cells absorb blue and red light and reflect green light, with a peak at 0.54 µm
► The spongy mesophyll cells reflect near-infrared light, which is related to vegetation biomass, because the intercellular air spaces of the spongy mesophyll layer are where photosynthesis and respiration occur
► Vegetation moisture content absorbs mid-infrared energy
► Jensen, J. R. "Biophysical Remote Sensing." Annals of the Association of American Geographers, 73(1), 111-132.
Biophysical Sensitivity of Spectra
Figure: Leaf cross-section showing the upper epidermis, palisade, spongy mesophyll, and lower epidermis layers.
http://www.cstars.ucdavis.edu/projects/modeling/
Soils
► Soil moisture decreases reflectance
► Coarse soil (dry) has relatively high reflectance
► Surface roughness, organic matter, and iron oxide also affect reflectance
Water
► Transmission at visible bands and strong absorption at NIR bands
► The water surface, suspended material, and the bottom of the water body can affect the spectral response
Absorption
From http://rst.gsfc.nasa.gov/Intro/nicktutor_I-3.html