Sensors and Transducers
• Measurement
• Measurement System
• Elements Of A Measurement System
• Selection Criteria Of Measurement System
• Instrument Types
• Characteristics Of Instruments
• Questions
Measurement
In measurement theory, measurement is the process of determining the magnitude of a physical quantity. Measurement systems are used to observe quantities of interest, including those at far-off places (remote locations).
The first element in a measuring system is the primary sensor: this gives an output that is a function of the measurand (the input applied to it).
Variable conversion elements are needed where the output variable of the primary sensor is in an inconvenient form and has to be converted to a more convenient form.
In some cases, the primary sensor and variable conversion element are combined, and the combination is known as a transducer.
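As a rough illustration, the following Python sketch models such a chain with hypothetical numbers: a thermocouple as the primary sensor producing a small e.m.f., and an amplifier as the variable conversion element that turns it into a convenient voltage. The ~41 µV/°C characteristic and the gain of 100 are assumptions for illustration only.

def thermocouple_emf(temp_c: float) -> float:
    # Primary sensor: temperature -> small e.m.f. in millivolts.
    # A linearised ~41 uV/degC characteristic is assumed here.
    return 0.041 * temp_c

def amplifier(emf_mv: float, gain: float = 100.0) -> float:
    # Variable conversion element: millivolt-level signal -> volts.
    return gain * emf_mv / 1000.0

def transducer(temp_c: float) -> float:
    # Primary sensor and conversion element combined.
    return amplifier(thermocouple_emf(temp_c))

print(f"{transducer(250.0):.3f} V")   # about 1.025 V for 250 degC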
The accuracy of these two instrument types (for example, a deflection-type spring pressure gauge and a null-type deadweight gauge) depends on different things. For the first it depends on the linearity and calibration of the spring, whilst for the second it relies on the calibration of the weights. As calibration of weights is much easier than careful choice and calibration of a linear-characteristic spring, the second type of instrument will normally be the more accurate. This is in accordance with the general rule that null-type instruments are more accurate than deflection types.
In terms of usage, the deflection-type instrument is clearly more convenient. It is
far simpler to read the position of a pointer against a scale than to add and subtract
weights until a null point is reached. A deflection-type instrument is therefore the
one that would normally be used in the workplace. However, for calibration duties,
the null-type instrument is preferable because of its superior accuracy. The extra
effort required to use such an instrument is perfectly acceptable in this case
because of the infrequent nature of calibration operations.
Instruments can also be divided into two classes: those that merely give an audio or visual indication of the magnitude of the physical quantity measured, and those that give an output in the form of a measurement signal whose magnitude is proportional to the measured quantity.
When heating a room to around 20°C, it does not really matter whether the true temperature is 19.5°C or 20.5°C, i.e. an inaccuracy of ±0.5°C is acceptable, because such small variations around 20°C are too small to affect whether we feel warm enough or not. Our bodies cannot discriminate between such close levels of temperature. However, in a chemical process a variation of 0.5°C might have a significant effect on the rate of reaction or even the products of the process. A measurement inaccuracy much less than ±0.5°C is therefore clearly required.
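As a minimal sketch of this reasoning (hypothetical numbers, Python): given a reading and a quoted inaccuracy, the true value can only be located within an error band, and that band must be narrow enough for the application at hand.

def reading_bounds(reading: float, inaccuracy: float):
    # Interval within which the true value must lie.
    return reading - inaccuracy, reading + inaccuracy

# Domestic heating: a +/-0.5 degC band around 20 degC is acceptable.
print(reading_bounds(20.0, 0.5))    # (19.5, 20.5)

# Chemical process: a much tighter instrument is needed.
print(reading_bounds(20.0, 0.05))   # (19.95, 20.05)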
Precision
Precision is a term that describes an instrument’s degree of freedom from
random errors. If a large number of readings are taken of the same quantity by a
high precision instrument, then the spread of readings will be very small.
Reproducibility describes the closeness of output readings for the same input when there are changes in the method of measurement, observer, measuring instrument, location, conditions of use and time of measurement.
The terms repeatability and reproducibility mean approximately the same but are applied in different contexts: repeatability describes the closeness of output readings when the same input is applied repetitively over a short period of time, with the same measurement conditions, the same instrument and observer, and the same location and conditions of use maintained throughout, whereas reproducibility applies when any of these factors change.
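The sketch below shows one common way of quantifying precision: the spread (sample standard deviation) of repeated readings of the same quantity. The readings are hypothetical.

import statistics

# Repeated readings of the same input under the same conditions.
readings = [20.1, 19.9, 20.0, 20.2, 19.8]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)   # small spread -> high precision

print(f"mean reading = {mean:.2f}, spread = {spread:.3f}")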
Threshold
If the input to an instrument is gradually increased from zero, the input will have to reach a certain minimum level before the change in the instrument output reading is of a large enough magnitude to be detectable. This minimum input level is known as the threshold of the instrument. For example, a car speedometer may have a threshold of around 15 km/h: if the vehicle starts from rest and accelerates, no output reading is observed on the speedometer until the speed reaches 15 km/h.
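A minimal sketch of threshold behaviour, using the speedometer example; the hard cut-off at exactly 15 km/h is a simplification.

THRESHOLD_KMH = 15.0   # minimum detectable input

def speedometer_output(true_speed_kmh: float) -> float:
    # Inputs below the threshold produce no output reading.
    return true_speed_kmh if true_speed_kmh >= THRESHOLD_KMH else 0.0

for speed in (0, 10, 15, 40):
    print(f"true speed {speed:3d} km/h -> indicated {speedometer_output(speed)} km/h")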
Resolution
When an instrument is showing a particular output reading, there is a lower limit on
the magnitude of the change in the input measured quantity that produces an
observable change in the instrument output.
One of the major factors influencing the resolution of an instrument is how finely its
output scale is divided into subdivisions.
For example, if a speedometer’s scale is subdivided into 5 km/h steps, then when the needle is between the scale markings we cannot estimate the speed more accurately than to the nearest 5 km/h.
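A minimal sketch of limited resolution, assuming readings can only be made to the nearest scale subdivision (5 km/h here, as in the speedometer example).

RESOLUTION_KMH = 5.0   # smallest scale subdivision

def indicated_speed(true_speed_kmh: float) -> float:
    # Round the input to the nearest readable scale marking.
    return round(true_speed_kmh / RESOLUTION_KMH) * RESOLUTION_KMH

print(indicated_speed(57.0))   # 55.0
print(indicated_speed(57.4))   # 55.0 -- a small change produces no observable change
print(indicated_speed(58.0))   # 60.0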
Changes in ambient conditions affect instruments in two main ways, known as zero drift and sensitivity drift.
Zero drift or bias describes the effect where the zero reading of an instrument is
modified by a change in ambient conditions. This causes a constant error that exists
over the full range of measurement of the instrument.
For example, if a bathroom scale develops a zero drift of 1 kg, then if someone of known weight 70 kg were to get on the scale, the reading would be 71 kg, and if someone of known weight 100 kg were to get on the scale, the reading would be 101 kg. Zero drift is normally removable by calibration.
Sensitivity drift (also known as scale factor drift) defines the amount by which an
instrument’s sensitivity of measurement varies as ambient conditions change.
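The sketch below superimposes the two drift effects on an ideal characteristic; the drift values are hypothetical. Zero drift adds a constant offset over the whole range, while sensitivity drift changes the slope, so its error grows with the measured value.

def reading(true_value: float, zero_drift: float = 0.0,
            sensitivity_drift: float = 0.0) -> float:
    # Output of an instrument suffering both kinds of drift.
    return (1.0 + sensitivity_drift) * true_value + zero_drift

# Zero drift alone: a constant +1 kg error over the whole range
# (the bathroom scale example: 70 kg reads 71 kg, 100 kg reads 101 kg).
print(reading(70.0, zero_drift=1.0), reading(100.0, zero_drift=1.0))

# Sensitivity drift alone: the error grows with the measured value.
print(reading(70.0, sensitivity_drift=0.02), reading(100.0, sensitivity_drift=0.02))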
Hysteresis effects
If the input to an instrument is steadily increased and then steadily decreased, the output readings at a given input value generally differ depending on the direction from which that input was approached; this non-coincidence between the loading and unloading curves is known as hysteresis. Two quantities are defined, maximum input hysteresis and maximum output hysteresis, as shown in the figure. These are normally expressed as a percentage of the full-scale input or output reading, respectively.
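Since the figure is not reproduced here, the sketch below generates non-coincident loading and unloading curves with a simple backlash (mechanical play) model, which is one common idealisation of hysteresis, and computes the maximum output hysteresis; the amount of play is an assumption.

def backlash_sweep(inputs, play=1.0):
    # The output only follows the input once the free play is taken up,
    # so loading and unloading trace different curves.
    out, outputs = 0.0, []
    for x in inputs:
        if x - out > play:
            out = x - play
        elif out - x > play:
            out = x + play
        outputs.append(out)
    return outputs

path = list(range(0, 11)) + list(range(9, -1, -1))   # load 0..10, then unload
outputs = backlash_sweep(path)

loading = dict(zip(path[:11], outputs[:11]))
unloading = dict(zip(path[11:], outputs[11:]))

# Maximum output hysteresis: largest output difference at the same input.
max_hyst = max(abs(loading[x] - unloading[x]) for x in unloading)
print(f"maximum output hysteresis = {max_hyst:.1f}")
print(f"= {100.0 * max_hyst / max(outputs):.1f}% of full-scale output")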
Expected Questions
5. Define sensitivity drift and zero drift. What factors can cause sensitivity drift
and zero drift in instrument characteristics?