Module BIPS
STUDY MODULE
UNIT CONTENT
UNIT - 2: Fingerprint Biometrics
2.1 Historical background of fingerprint identification
2.2 Types of fingerprint pattern
2.3 Classification of fingerprint pattern
2.4 Fingerprint recognition process
UNIT - 3: Ear Biometrics
3.1 Historical background of ear biometrics
3.2 Physiology of the ear
3.3 Classification of ear pattern
3.4 Ear-print recognition process
UNIT - 4: DNA Identification
4.1 Historical background of DNA for personal identification
4.2 Biology of DNA
4.3 Techniques of DNA
4.4 Case law based on DNA evidence
1. BIOMETRICS IN PERSONAL IDENTIFICATION
B. Criminal Cases
(i) Identification of the accused in criminal cases of assault, murder, dacoity and sexual offences, and of absconding soldiers
(ii) Interchange of newborn babies in hospital
(iii) Criminal abortion
(iv) To fix the age of criminal responsibility and majority
(v) Impersonation in criminal cases
2. In the Dead:
The need to identify the dead is obvious for social and medico-legal purposes. Identification is required in natural mass disasters such as earthquakes, tsunamis, landslides and floods, in man-made disasters such as bomb explosions, fires, air crashes, building collapses and railway accidents, in the case of bodies recovered from the sea, rivers, canals and wells, and in cases where the body is highly decomposed or has been dismembered to deliberately conceal the identity of the individual.
The term 'Corpus Delicti' means the body of the offence or the body of the crime. In a charge of homicide, it includes the positive identification of the dead body of the victim and proof that the death was caused by a criminal act.
The interest of the community in the scene of death, after the discovery of remains or after a mass disaster, is often overwhelming. Disturbance of the scene by curiosity seekers or by ill-trained police personnel may preclude not only accurate identification of bodies but also complete collection of physical evidence. This invites the 'Law of Multiplicity of Evidences' to play its role wherever called for. The Supreme Court has laid down that, in law, a conviction for an offence does not necessarily depend upon the 'Corpus Delicti' being proved. Cases are conceivable where the discovery of the dead body, from the very nature of the case, is impossible. It may therefore be said that the existence of the dead body of the victim is no doubt positive proof of death, but its absence is not fatal to the trial of the accused for the homicide. Indeed, any other view would place in the hands of the accused an incentive to destroy the body after committing murder and thus secure immunity for the crime.
The examination of a person for the purpose of identification should not be undertaken without obtaining his free consent, and at the same time it should be explained to him that the facts noted might be given in evidence against him. It should be remembered that consent given before the police is of no account, and that the law does not oblige anyone to submit to examination against his will and thus furnish evidence against himself.
The use of biometric data for classification and/or identification in forensic science dates back to the turn of the 20th century. Biometrics as we know it today can be viewed as an extension of Bertillon's anthropometric approach, benefiting from automation and the use of additional features. This chapter presents a historical and technical overview of the development and evolution of forensic biometric systems, initially used manually and later in a semi-automatic way. Before focusing on specific forensic fields, we define the area and its terminology and draw distinctions between forensic science and biometrics.
Forensic science refers to the application of scientific principles and technical methods to an investigation of criminal activities, in order to establish the existence of a crime and to determine the identity of its perpetrator(s) and their modus operandi. It is thus logical that this area was fertile ground for the use of physiological or behavioural data to sort and potentially individualize the protagonists involved in offences. Although manual classification of physical measures (anthropometry) and of physical traces left at and recovered from crime scenes (fingermarks, earmarks) was largely successful, an automatic approach was needed to facilitate and speed up the retrieval of promising candidates in large databases. Even though the term biometrics usually refers "to identifying an individual based on his or her distinguishing characteristics", biometric systems in forensic science today aim at filtering potential candidates and putting forward candidates for further one-to-one verification by a forensic specialist trained in that discipline, in the following typical cases (exemplified here using fingerprints):
Case 1: A biometric set of features in question, coming from an unknown individual (living or dead), is searched against a reference set of known (or declared as such) individuals. In the fingerprint domain, we can think of a ten-print to ten-print search based on features obtained from a tenprint card (potentially holding both rolled and flat (slap) inked impressions from fingers and palms), compared against a database of ten-print cards.
Both case 2 and case 3 involve biometric features (in physical or other forms) that can be left
on scenes relevant to an investigation. In forensic investigation, one of the main objectives is
to find marks associating an offender to an event under investigation. These marks can be either
left by the perpetrator during the event or found on the perpetrator after it. This mechanism of
“exchange” of marks is known under the misnomer of “Locard's exchange principle” in
reference to the French criminalist Edmond Locard. Forensic information can be found either as physical marks or as digital traces. Physical marks are made, for example, by the apposition of fingers, ears or feet on any kind of surface, while digital traces are analog or digital recordings, typically from phone tapping and security cameras. Face and speech biometrics,
and to some extent modalities captured at distance such as ear, iris and gait can be used as
digital traces in forensic science.
As a first distinction between biometrics and forensic science, it is important to stress that
forensic biometric systems are used in practice as sorting devices without any embedded
decision mechanism on the truthfulness of the identification (although we do see some
developments in that direction). Indeed, the search algorithms are deployed as sorting devices.
These ranking tools allow the user to be presented, at a known average rate of efficiency, with a short list (generally 15 to 20 candidates) potentially containing the right candidate for a defined query. Here the
term ‘candidate’ refers to the result of a search against biometric features originating from
either individuals or marks (known or unknown). It is then the duty of the forensic specialist to
examine each candidate from the list as if that candidate was submitted through the regular
channels of a police inquiry. This first contrast shows that forensic biometric systems are
considered by forensic scientists as external to the inferential process that will follow.
The second distinction lies in the terminology, performance measures and reported
conclusions used in the processes. Although forensic biometric systems can be used in both
verification (one to one) or identification modes (one to many), depending on the circumstances
of the case, the identification mode can be seen as a series of verification tasks. The reported
conclusion by the forensic specialist when comparing an unknown to a known entry can take
different forms depending on the area considered.
In the fingerprint field, conclusions can take three states: individualization, exclusion or
inconclusive. The first two are categorical conclusions accounting for all possible entities on
the Earth. In other words, an individualization of a finger mark is a statement that associates
that mark to its designated source to the exclusion of all other fingers or more generally all
other friction ridge skin formations. Individualization is often presented as the distinguishing
factor between forensic science and other scientific classification and identification tasks.
In the fields of face or ear recognition carried out manually by skilled examiners, speaker verification based on phonetic/linguistic analysis, dental analysis or handwriting examination, the three conclusions described above retain the same definitions, but probabilistic conclusions are also allowed on a grading scale, both in favour of and against identity of sources, with qualifiers such as possible, probable or very likely. The adequacy of such scales in forensic decision making has been discussed in the literature.
The principles and protocols regarding how these conclusions (outside the DNA area) can be reached by a trained and competent examiner are outside our scope. However, the general principles of the inference of identity of sources are treated in detail by Kwan or by Champod et al. (for fingerprints). In all these areas, based on different features, the expert subjectively weighs the similarities and dissimilarities to reach his/her conclusion. Nowadays the reliability of these so-called "subjective disciplines" is being increasingly challenged, especially because of
(i) the development of evidence based on DNA profiles governed by hard data, and
(ii) the evolving requirements for the admissibility of evidence following the Daubert decision by the Supreme Court of the USA.
The absence of underpinning statistical data in the classic identification fields is viewed as a main pitfall that requires a paradigm shift.
In the field of DNA, the strength of evidence is indeed generally expressed statistically using
case specific calculations linked to a likelihood ratio (defined later). In essence the process is
probabilistic although we do see some tendencies to remove uncertainty from the debate. It is
our opinion that inferences of sources across all forensic identification fields, when put forward
to a factfinder in court for example, must be approached within a probabilistic framework even
in areas that had been traditionally presented through categorical opinions such as fingerprints.
An approach based on the concept of likelihood ratio should be promoted.
Indeed, a likelihood ratio (LR) is a statistical measure that offers a balanced presentation of
the strength of the evidence. It is especially suitable for assessing the contribution of forensic
findings in a fair and balanced way. Note that we restrict our analysis to an evaluative context,
meaning that the forensic findings may be used as evidence against a defendant in court. There
is a wide scope of application of biometric systems in investigative mode (e.g., surveillance)
that we will not cover. Formally, the LR can be defined as follows:
LR = P(E | S, I) / P(E | S̄, I)
Where:
E: Result of the comparison (set of concordances and discordances, or a similarity measure such as a score) between the biometric data from the unknown source and the biometric data from the putative source.
S: The putative source is truly the source of the unknown biometric features observed (also known as the prosecution proposition).
S̄: Someone else, from a relevant population of potential donors, is truly the source of the unknown biometric features observed (also known as the defence proposition).
I: Relevant background information about the case, such as information about the selection of the putative source and the nature of the relevant population of potential donors.
This LR measure forces the scientist to focus on the relevant question (the forensic findings)
and to consider them in the light of a set of competing propositions. The weight of forensic
findings is essentially a relative and conditional measure that helps to progress a case in one
direction or the other depending on the magnitude of the likelihood ratio. When the numerator is close to 1, the LR is simply the inverse of the random match probability (RMP) in a specified population; in such cases, reporting the evidence through the RMP is adequate. However, most biometric features exhibit within-individual variability, which forces an assessment of the numerator on a case-by-case basis.
The performance measures for forensic science are obtained from the analysis of the distributions of the LRs in simulated cases with given S and S̄. These distributions are studied using a specific plot (called a Tippett plot) that shows one minus the cumulative distribution of, respectively, the LRs computed under S and the LRs computed under S̄. These plots also allow study and comparison of the proportions of misleading evidence: the percentage of LR < 1 when the prosecution proposition S is true and the percentage of LR > 1 when the defence proposition S̄ is true. These two rates of misleading results are defined as follows:
RMED: Rate of misleading evidence in favour of the defence: among all LRs computed under the prosecution proposition S, the proportion of LRs below 1.
RMEP: Rate of misleading evidence in favour of the prosecution: among all LRs computed under the defence proposition S̄, the proportion of LRs above 1.
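To make these definitions concrete, the following short Python sketch (an illustration only; the score distributions, parameter values and normal-density model are assumptions, not taken from any system described in this module) computes score-based LRs for simulated same-source and different-source comparisons and then derives the RMED and RMEP rates defined above.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # Simulated comparison scores: higher means "more similar" (illustrative values).
    same_source = rng.normal(loc=70, scale=8, size=5000)    # scores under S
    diff_source = rng.normal(loc=40, scale=10, size=5000)   # scores under S-bar

    # Model the two score densities (here simply with normal fits).
    f_same = norm(same_source.mean(), same_source.std())
    f_diff = norm(diff_source.mean(), diff_source.std())

    def likelihood_ratio(score):
        """LR = P(E | S, I) / P(E | S-bar, I), with the evidence E reduced to a score."""
        return f_same.pdf(score) / f_diff.pdf(score)

    # Rates of misleading evidence over the simulated cases.
    lr_under_S = likelihood_ratio(same_source)
    lr_under_Sbar = likelihood_ratio(diff_source)
    rmed = np.mean(lr_under_S < 1)      # misleading in favour of the defence
    rmep = np.mean(lr_under_Sbar > 1)   # misleading in favour of the prosecution

    print(f"RMED = {rmed:.3%}, RMEP = {rmep:.3%}")
    print(f"LR for a score of 65: {likelihood_ratio(65):.1f}")

Plotting one minus the empirical cumulative distributions of lr_under_S and lr_under_Sbar would give the Tippett plot described above.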
Whereas a LR is a case-specific measure of the contribution of the forensic findings to the
identity of sources, the Tippett plot and the associated rates (RMED, RMEP) provide global
measures of the efficiency of a forensic biometric system. LR based measures are now regularly
used in the forensic areas of speaker recognition, fingerprints, and DNA. That constitutes a
major difference compared to standard global measures of biometric performances based on
type I and type II error rates (e.g., Receiver Operating Characteristic (ROC) or Detection Error
Trade-off (DET) curves). The limitations associated with these traditional measures when used in legal proceedings have been discussed in the literature.
The concept of identity of sources is essential and needs to be distinguished from the determination of civil identity (e.g., assigning the name of a donor to a recovered mark) and from guidance as to the activities of the individual or their possibly unlawful nature. Forensic
comparisons aim initially at providing scientific evidence to help address issues of identity of
sources of two sets of biometric data; whether these data are coupled with personal information
(such as name, date of birth or social security number) is irrelevant for the comparison process.
From the result of this comparison and depending on the availability and quality of personal
information, then inference as to the civil identity can be made if needed. Likewise, there is a
progression of inferences between the issue of identity of sources towards their alleged
activities and offences. It is a hierarchical system of issues as described by Cook et al.
The forensic biometric comparison process aims at handling source level issues as its primary
task: the whole process is not about names or identity, but in relation to source attribution
between two submitted sets of features (respectively from a source 1 and a source 2).
A third distinction lies in the wide range of selectivity of the biometric data that can be
submitted due to varying quality of the material. Selectivity here can be seen as the
discrimination power of the features, meaning the ability to allow a differentiation when they
are coming from distinct sources. Some of the main modalities will be reviewed in the next
sections but there is an all-encompassing phenomenon that goes across modalities in varying
degrees. In the commission of a crime, contrary to usual biometric systems (e.g., for access control), it may not be possible to obtain high-quality input biometric features - either for the
template or transaction data. These biometric data are limited by numerous factors such as: the
availability of the person and his/her level of cooperation, the invasiveness of the acquisition,
the various objects and positions one can take or touch while a crime is being committed. The
subjects make no effort to present their biometric data to the system in an ideal and controlled
way. Hence, whether the biometric data is acquired directly from individuals (living or dead),
from images (of individuals, part thereof or X-rays) or marks left by them following criminal
activities, the quality of the material available for the biometric comparison process, and thus
its selectivity, may vary drastically from case to case and so will the within-person variability.
The overall performance of the system is largely influenced by the quality of the input data
conditioned by the acquisition and environmental conditions as summarized in Table 1.1.
These factors are common in all biometric deployments, but forensic scenarios tend to
maximize their variability.
The last distinction we would like to stress upon is the range of comparisons that can be
undertaken in the forensic environment depending on the circumstances of the cases. The three
cases outlined initially all deal with comparisons of biometric information (with one side or
the other being known) but at differing levels of selectivity. The driving force here is more the
selectivity level associated with each compared biometric data sets, which can be similar (case
1 and case 3) or largely different (case 2). The availability of known information, such as the
name, the date of birth, the social security number (i.e. the personal data associated with each
compared biometric data set), associated with the biometric features, is not directly relevant to the comparison process. This information, although decisive for progressing in the hierarchy of issues, has no impact on the decision as to the identity of sources, which is driven by the selectivity of
the compared biometric data. The distinction between mark and reference material in a forensic
case is that in general, marks are of lower quality than reference material (although the reverse
could also be true). This concept of selectivity that is driving the move from case 1 to case 3
is a continuum on both sides (source 1 and source 2). Essentially, we can expect performances
to degrade as we move down in selectivity levels.
Table: Factors affecting the selectivity of biometric information and thus the performance of biometric systems deployed in forensic applications

Acquisition conditions: Quality of the acquisition device (e.g., resolution). Amount of input information (e.g., a rolled inked fingerprint on a card versus a limited, poorly developed finger-mark on a curved surface). The availability of multiple templates. The types of acquisition of both template and transaction data. Acquisition at a distance, the target size, object movement, and horizontal or vertical misalignment between the device and the subject. Presence of corrupting elements (e.g., glasses, beard, hair, clothes, or the health condition - living or dead - of the subject). The time interval between the acquisitions of the two sets of biometric material to be compared.

Environmental conditions: Background noise and uncontrolled conditions (e.g., illumination, noisy environment).

Data processing: The choice of the feature extraction algorithms and their level of automation (e.g., poor-quality fingermarks may need to be manually processed by a skilled operator in order to guide the system towards the relevant features). Efficiency of the detection and tracking algorithms (e.g., face detection and tracking). The matching algorithms in place and their hierarchy.

Operator: The operator's interaction with the system at all stages (from acquisition to verification of candidate lists).
Fig. General scheme of a forensic biometric system.
Historical Background
The use of biometrics has been traced back as far as the Egyptians, who measured people to identify them. Alphonse Bertillon, chief of the criminal identification division of the police department in Paris, developed and practised the idea of using a number of body measurements to identify criminals in the late 19th century. These measurements were written on cards that could be sorted by height, length of arm or any other parameter. Subsequently, fingerprints, ears, the face and other traits came to be used by many law enforcement agencies to determine the identity of criminals. Such technology now provides for the capture and processing of biometric information. The following table gives a year-wise description of developments in biometric technology.
YEAR DESCRIPTION
1858 First systematic capture of hand images for identification purposes was
recorded.
1875 Schwable was the first to invent a method to measure the external ear for
personal identification.
1879 Bertillon developed anthropometrics to identify individuals.
1892 Galton developed a classification system for fingerprints.
1896 Henry developed a fingerprint classification system.
1903 Bertillon system collapsed.
1936 Concept of using the iris pattern for identification was proposed.
1960 Face recognition became semi-automated.
1960 First model of acoustic speech production was created.
1965 Automated signature recognition research began.
1969 FBI pushed to make fingerprint recognition an automated process.
1970 Face recognition took another step towards automation.
1970 Behavioural components of speech were first modelled.
1974 First commercial hand geometric system became available.
1975 FBI funded development of sensors and minutiae extracting technology.
1976 First prototype system for speaker recognition was developed.
1977 Patent was awarded for acquisition of dynamic signature information.
1985 Concept that no two irises are alike was proposed.
1985 Patent for hand identification was awarded.
1987 Patent stating that the iris can be used for identification was awarded.
1988 First semi-automated facial recognition system was developed.
1988 Eigenface technique was developed for face recognition.
1989 Iannarelli designed a useful primary and secondary classification system of
external ear.
1991 Face detection was pioneered, making real time face recognition possible.
1992 Biometrics consortium was established within U.S Government.
1993 Development of an iris prototype unit begins.
1993 Face recognition technology (FERET) program was initiated.
1994 First iris recognition algorithm was patented.
1994 Integrated automated fingerprint identification system (IAFIS) competition was
held.
1994 Palm system was benchmarked.
1995 Iris prototype becomes available as a commercial product.
1996 Hand geometry was implemented at the Olympic games.
1996 NIST began hosting annual speaker recognition evaluations.
1998 FBI launched CODIS (DNA forensic database).
1999 Study on the compatibility of biometrics and machine-readable travel documents was launched.
1999 FBI's IAFIS major components became operational.
2000 First face recognition vendor test (FRVT 2000) was held.
2002 ISO/IEC standards subcommittee on Biometrics was established.
2002 M1 technical committee on Biometrics was formed.
2003 ICAO adopted blue print to integrate Biometrics into machine readable travel
documents.
2003 European Biometrics forum was established.
2004 DOD implemented ABIS.
2004 First state-wide automated palm print data base was deployed in the US.
2004 Face recognition grand challenge began.
2005 US patent on iris recognition concept expired.
2006 Phalguni adopted a simple geometric approach for ear recognition.
2006 Jeges automated model-based human ear identification.
2007 Yuan proposed ear detection based on skin-colour and contour information.
2007 Xiaoxun proposed symmetrical null space LDA for face and ear recognition.
2008 Xie introduced ear recognition using LLE and IDLLE Algorithm.
2009 Islam proposed Score Level Fusion of Ear and Face Local 3D features for fast
and Expression-Invariant Human Recognition.
2009 Ear Localization using Hierarchical Clustering was developed in IIT Kanpur.
2010 A Survey on Ear Biometrics was conducted in the West Virginia University.
2010 FBI funded a study of twins' irises at the UND.
2010 Shaped Wavelets for Curvilinear Structures for Ear Biometrics in the UK.
2011 Joshi studied on Edge Detection and Template Matching Approaches for
Human Ear Detection in India.
2011 Automated human identification using ear imaging was developed in Hong
Kong.
2012 An Efficient Ear Localization Technique was developed in IIT Kanpur.
2012 Region-Based features extraction in ear biometrics was developed in Malaysia.
Biometric System
A biometric system is essentially a pattern recognition system that recognizes a person based on some specific physiological or behavioural characteristic unique to that person. Biometric authentication is a major part of the information technology field and refers to an automatic recognition process.
Enrolment is performed before a biometric can be used for identification: a trusted sample of the biometric trait is captured using a biometric sensor and pre-processed so that the approach used for recognition can be applied to the sample.
Verification Mode refers to the problem of confirming or denying a person's claimed identity ("Am I who I claim I am?"). The system validates a person's identity by comparing the captured biometric data with his/her own biometric template stored in the system database. A basic identity claim, usually a PIN, a user name or a smart card, is accepted, and the biometric sample of the subject is matched against the corresponding template using a 1:1 matching algorithm to confirm the person's identity.
Identification Mode addresses the question "Who am I?": the system searches the captured biometric data against all the templates enrolled in the database (a 1:N search) in order to establish the person's identity. "Recognition" is the generic term and does not necessarily imply either verification or identification; all biometric systems perform recognition to "again know" a person who has been previously enrolled. A short sketch contrasting the two modes follows the figure below.
Fig. Enrolment and recognition (verification and identification)
stages of a biometric system.
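The distinction between the two modes can be illustrated with a short Python sketch (illustrative only; the similarity function, threshold and enrolled templates below are placeholder assumptions, not a real matcher).

    from typing import Dict, List

    def similarity(features_a: List[float], features_b: List[float]) -> float:
        """Toy similarity score: higher means the two feature sets are closer."""
        return -sum((a - b) ** 2 for a, b in zip(features_a, features_b))

    THRESHOLD = -0.5  # assumed decision threshold

    def verify(claimed_id: str, probe: List[float],
               database: Dict[str, List[float]]) -> bool:
        """1:1 verification: compare the probe only with the claimed identity's template."""
        return similarity(probe, database[claimed_id]) >= THRESHOLD

    def identify(probe: List[float], database: Dict[str, List[float]],
                 top_k: int = 5) -> List[str]:
        """1:N identification: rank all enrolled templates and return a candidate list."""
        ranked = sorted(database, key=lambda pid: similarity(probe, database[pid]),
                        reverse=True)
        return ranked[:top_k]

    # Example usage with toy enrolled templates.
    db = {"alice": [0.1, 0.9], "bob": [0.8, 0.2], "carol": [0.5, 0.5]}
    print(verify("alice", [0.12, 0.88], db))    # True: the claim is accepted
    print(identify([0.79, 0.22], db, top_k=2))  # ['bob', 'carol']

In a forensic setting, as noted earlier, the candidate list returned by the identification mode is examined by a trained specialist rather than accepted as an automatic decision.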
Biometric Types
Biometrics measures biological characteristics for identification or verification purposes of an individual. Since IDs and passports can be forged, more sophisticated methods needed to be put into place to help protect companies and individuals. There are two types of biometric methods. One is called physiological biometrics, used for identification or verification purposes. Identification refers to determining who a person is; this method is commonly used in criminal investigations. Behavioural biometrics is the other type and is used for verification purposes. Verification is determining whether a person is who they say they are. This method looks at patterns of how certain activities are performed by an individual.
3.1.1 Fingerprints
A fingerprint is a pattern of ridges and furrows located on the tip of each finger. Fingerprints
were used for personal identification for many centuries and the matching accuracy was very
high. Patterns have been extracted by creating an inked impression of the fingertip on paper.
Today, compact sensors provide digital images of these patterns. Fingerprint recognition for
identification acquires the initial image through live scan of the finger by direct contact with a
reader device that can also check for validating attributes such as temperature and pulse. Since
the finger actually touches the scanning device, the surface can become oily and cloudy after repeated use, reducing the sensitivity and reliability of optical scanners. This traditional method nevertheless provides good accuracy with currently available fingerprint recognition systems for authentication.
By contrast, vascular pattern recognition:
Is difficult to forge: vascular patterns are difficult to recreate because they are inside the hand and, for some approaches, blood needs to flow to register an image.
Is contact-less: users do not touch the sensing surface, which addresses hygiene concerns and improves user acceptance.
3.1.2 Ear Biometrics
The use of the ear for personal identification has been of interest for at least 100 years. The ear does not change considerably during human life; the face, on the other hand, changes more significantly with age than any other part of the human body. Ear features are relatively fixed and unchangeable. It has been suggested that the shape of the ear and the structure of the cartilaginous tissue of the pinna are distinctive. Matching the distances of salient points on the pinna from a landmark location on the ear is the suggested method of recognition.
Fig. Ear Recognition
3.1.3 Iris
The iris begins to form in the third month of gestation and the structures creating its pattern are largely complete by the eighth month. Its complex pattern can contain many distinctive features such as arching ligaments, furrows, ridges, crypts, rings, corona, freckles and a zigzag collarette.
Iris scanning is less intrusive than retinal because the iris is easily visible from several meters
away. Responses of the iris to changes in light can provide an important secondary verification
that the iris presented belongs to a live subject. Irises of identical twins are different, which is
another advantage.
3.1.4 Face
Facial recognition is the most natural means of biometric identification. The approaches to face
recognition are based on shape of facial attributes, such as eyes, eyebrows, nose, lips, chin and
the relationships of these attributes. As this technique involves many facial elements, these systems can have difficulty in matching face images.
3.2.1 Keystroke
The keyboard is one of the principal means by which we communicate with a computer. People use the keyboard in different ways: some type fast, some slowly, and typing speed also depends on a person's mood and the time of day. Biometric keystroke recognition is the technology of recognizing people from the way they type. It is important to understand that this technology does not deal with "what" is written but with "how" it is written.
Fig. Keystroke Recognition
3.2.2 Signature
The way a person signs his or her name is known to be characteristic of that individual.
Signature is a simple, concrete expression of the unique variations in human hand geometry.
Collecting samples for this biometric requires subject cooperation and a writing instrument. Signatures are a behavioural biometric; they change over time and are
influenced by physical and emotional conditions of a subject. In addition to the general shape
of the signed name, a signature recognition system can also measure pressure and velocity of
the point of the stylus across the sensor pad.
3.2.3 Voice
The features of an individual's voice are based on physical characteristics such as vocal tracts,
mouth, nasal cavities and lips that are used in creating a sound. These characteristics of human
speech are invariant for an individual, but the behavioural part changes over time due to age,
medical conditions and emotional state. Voice recognition techniques are generally categorized
according to two approaches: 1) Automatic Speaker Verification (ASV) and 2) Automatic
Speaker Identification (ASI). Speaker verification uses voice as the authenticating attribute in
a two-factor scenario.
2. Fingerprint Biometrics
Ancient History
Earthenware estimated to be 6000 years old was discovered at an archaeological site in
northwest China and found to bear clearly discernible friction ridge impressions. These prints
are considered the oldest friction ridge skin impressions found to date; however, it is unknown
whether they were deposited by accident or with specific intent, such as to create decorative
patterns or symbols. In this same Neolithic period, friction ridges were being left in other
ancient materials by builders. Just as someone today might leave impressions in cement, early
builders left impressions in the clay used to make bricks.
221 B.C. to A.D. 1637
The Chinese were the first culture known to have used friction ridge impressions as a means
of identification. The earliest example comes from a Chinese document entitled “The Volume
of Crime Scene Investigation—Burglary”, from the Qin Dynasty (221 to 206 B.C.). The
document contains a description of how handprints were used as a type of evidence.
During the Qin through Eastern Han dynasties (221 B.C. to 220 A.D.), the most prevalent
example of individualization using friction ridges was the clay seal. Documents consisting of
bamboo slips or pages were rolled with string bindings, and the strings were sealed with clay.
On one side of the seal would be impressed the name of the author, usually in the form of a
stamp, and on the other side would be impressed the fingerprint of the author. The seal was
used to show authorship and to prevent tampering prior to the document reaching the intended
reader. It is generally recognized that it was both the fingerprint and the name that gave the
document authenticity.
The use of friction ridge skin impressions in China continued into the Tang Dynasty (A.D.
617–907), as seen on land contracts, wills, and army rosters. It can be postulated that with the
Chinese using friction ridge skin for individualization and trading with other nations in Asia,
these other nations might have adopted the practice. For example, in Japan, a “Domestic Law”
enacted in A.D. 702 required the following: "In case a husband cannot write, let him hire another man to write the document and after the husband's name, sign with his own index finger".
This shows at least the possibility that the Japanese had some understanding of the value of
friction ridge skin for individualization.
Additionally, in India, there are references to the nobility using friction ridge skin as signature:
In A.D. 1637, the joint forces of Shah Jahan and Adil Khan, under the command of Khan
Zaman Bahadur, invaded the camp of Shahuji Bhosle, the ruler of Pona (in the present-day
Maharashtra). The joint army defeated Shahuji, who was compelled to accept the terms of
peace:
Since the garrison (of Shahuji) was now reduced to great extremities, Shahuji wrote frequently
to Khan Bahadur in the humblest strain, promising to pay allegiance to the crown. He at the
same time solicited a written treaty ... stamped with the impression of his hand.
The above text is an example of the nobility’s use of palmprints in India to demonstrate
authenticity of authorship when writing an important document. It is believed that the use of
prints on important documents was adopted from the Chinese, where it was used generally, but
in India it was mainly reserved for royalty. The use of friction ridge skin as a signature in China,
Japan, India, and possibly other nations prior to European discovery is thus well documented.
19th Century
In his 1823 thesis titled “Commentary on the Physiological Examination of the Organs of
Vision and the Cutaneous System”, Dr. Johannes E. Purkinje (1787–1869), professor at the
University of Breslau in Germany, classified fingerprint patterns into nine categories and gave
each a name. Although Dr. Purkinje went no further than naming the patterns, his contribution
is significant because his nine pattern types were the precursor to the Henry classification
system.
German anthropologist Hermann Welcker (1822–1898) of the University of Halle led the way
in the study of friction ridge skin permanence. Welcker began by printing his own right hand
in 1856 and then again in 1897, thus gaining credit as the first person to start a permanence
study.
Generally, the credit for being the first person to study the persistence of friction ridge skin
goes to Sir William James Herschel. In 1858, he experimented with the idea of using a
handprint as a signature by having a man named Rajyadhar Konai put a stamp of his right hand
on the back of a contract for road binding materials. The contract was received and accepted
as valid. This spontaneous printing of Konai’s hand thus led to the first official use of friction
ridge skin by a European.
Upon his appointment as Magistrate and Collector at Hooghly, near Calcutta, in 1877, Herschel
was able to institute the recording of friction ridge skin as a method of individualization on a
widespread basis. Herschel was in charge of the criminal courts, the prisons, the registration of
deeds, and the payment of government pensions, all of which he controlled with fingerprint
identification. On August 15, 1877, Herschel wrote what is referred to as the “Hooghly Letter”
to Bengal’s Inspector of Jails and the Registrar General, describing his ideas and suggesting
that the fingerprint system be expanded to other geographical areas. While proposing even
further uses of this means of individualization, the Hooghly Letter also explained both the
permanence and uniqueness of friction ridge skin. Henry Faulds was the first person to publish
in a journal the value of friction ridge skin for individualization, especially its use as evidence.
The scientific study of friction ridge skin was also taken up by a prominent scientist of the time,
Sir Francis Galton. Galton continued to take anthropometric measurements, and he added the
printing of the thumbs and then the printing of all 10 fingers. As the author of the first book on
fingerprints, Galton established that friction ridge skin was unique and persistent. Because
Galton was the first to define and name specific print minutiae, the minutiae became known as
Galton details.
In 1894, Sir Edward Richard Henry, Inspector General of Police for the Lower Provinces,
Bengal, collabo-rated with Galton on a method of classification for finger-prints. With the help
of Indian police officers Khan Bahadur Azizul Haque and Rai Bahaden Hem Chandra Bose,
the Henry classification system was developed. Once the classification system was developed
and proved to be effective, Henry wrote to the government of India asking for a comparative
review of anthropometry and fingerprints.
The first ever Finger Print Bureau in the world was established at the Writers' Building in Calcutta (now Kolkata) in the year 1897. Convinced of the infallibility and reliability of finger impressions as a means of identification, the authorities made this evidence admissible in courts of law under Section 45 of the Indian Evidence Act (1872), in which finger impressions came to be specifically mentioned.
A criminal case in Bengal in 1898 is considered to be the first case in which fingerprint
evidence was used to secure a conviction.
20th Century
The first trial in England that relied on fingerprint evidence involved Inspector Charles
Stockley Collins of Scotland Yard. Collins testified to an individualization made in a burglary
case. That 1902 trial and subsequent conviction marked the beginning of fingerprint evidence
in the courts of England. In October 1902, Alphonse Bertillon made an individualization in Paris, France, with fingerprints. As a result of the case, Bertillon is given credit for solving the first murder in Europe with the use of fingerprint evidence alone.
In 1903, after several months of fingerprinting criminals upon their release, Captain James H.
Parke of New York state developed the American Classification System. The use of the
American Classification System and subsequent fingerprinting of all criminals in the state of
New York was the first systematic use of fingerprinting for criminal record purposes in the
United States.
In 1914, Dr. Edmond Locard published “The Legal Evidence by the Fingerprints”.
Harry Jackson was the first criminal in the world to be convicted, in 1902, on the basis of fingerprint evidence alone.
In the 1980s, the Japanese National Police Agency came up with its first automated electronic
matching system called "Automated Fingerprint Identification Systems (AFIS)". AFIS collects
fingerprints through sensors, and then the computer identifies the ridge patterns and minutia
points (using Henry's system) from digital fingerprints before finding match results. It allowed
law enforcement agencies around the world to remotely and instantaneously validate millions
of fingerprints. To further strengthen the cooperation between law enforcement agencies, the
FBI launched Integrated AFIS (IAFIS) in 1999. IAFIS processed about 14.5 million fingerprint
documents during the year after its inception. IAFIS also registers and preserves civilian fingerprints to track information about a person's licences, jobs, or social care schemes. One out of every six individuals, on average, has their data saved in the FBI's database.
The world's most prominent civil applications, such as India's "Aadhaar" program, the US biometric program and the UK border monitoring initiative, utilize millions of rolled or slap fingerprints.
New fingerprint recognition methods are under development to boost the efficiency of such big
data applications.
2.2 Types of fingerprint pattern
Fingerprint identification is one of the most important criminal investigation tools due to two
features: their persistence and their uniqueness. A person’s fingerprints do not change over
time. The friction ridges which create fingerprints are formed while inside their mother’s womb
and grow as the baby grows. The only way a fingerprint can change is through permanent
scarring, which doesn’t happen very often. It isn’t just that fingerprints don’t change. In
addition, fingerprints are unique to an individual. Even identical twins have different
fingerprints. Your fingerprints are yours and yours alone, and they’ll be that way for the rest
of your life.
As friction ridges spread out across the surface of the developing fingers, they form one of
three patterns: an arch, a loop, or a whorl. Each pattern type can be broken down into several
sub-patterns, which will be discussed in this chapter. The pattern formed is dependent on the
dimensions of the volar pad, its size, shape, and position on the finger. Pattern type is a function
of the volar pad’s 3D regression combined with the proliferation of friction ridges. As early as
1924, it was hypothesized that volar pad height and symmetry influence pattern formation.
“High,” symmetrical volar pads form whorls. Asymmetrical volar pads form loops. And “low”
volar pads form arches. While the minute details, or minutiae, within your fingerprint are
unique to you, there is evidence to suggest your fingerprint pattern is inherited. As with eye
color or hair color, your fingerprint patterns may appear similar to those of your mother or
father. Besides genetic factors, environmental factors also play a role in inheritance. It is more
likely you inherited your parents’ volar pad formations and rate of friction ridge development
than the actual patterns themselves. The patterns we see on our fingerprints display what we
will call ridge flow, which is an illustrative method of describing how the friction ridges form
patterns.
Most fingerprint pattern types have one or more of the following features formed as a result of
ridge flow: the core and delta. The core of a fingerprint, like the core of an apple, is the center
of the pattern. It is the focal point around which the ridges flow. The second feature of most
fingerprints is the delta. A delta is an area of friction ridge skin where ridge paths flowing in
three different directions create a triangular pattern. These patterns appear similar to lake or
river deltas: areas where the flow diverges.
Fig: The three basic fingerprint pattern types: (a) arches, (b) loops, and (c) whorls.
About 65 percent of the total population has loops, 30 percent have whorls, and 5 percent have arches. Arches have ridges that enter from one side of the fingerprint and leave from the other side with a rise in the center. Whorls look like a bull's-eye, with two deltas (triangles). Loops enter from either the right or the left and exit from the same side they enter.
The core is the center of a loop or whorl. A triangular region located near a loop is called a
delta. Some of the ridge patterns near the delta will rise above and some will fall below this
triangular region. Sometimes the center of the delta may appear as a small island. A ridge count
is another characteristic used to distinguish one fingerprint from another. To take a ridge count,
an imaginary line is drawn from the center of the core to the edge of the delta.
Fig: The core of a loop pattern. Fig: The delta of a loop pattern.
ARCHES:
Arches are the least common fingerprint pattern. They are found in approximately 5% of
fingerprints in the general population. A fingerprint arch is similar to an architectural arch, or
a wave. The friction ridges enter one side of the fingerprint, make a rise in the center, and exit
out the other side of the print. There are no deltas in an arch pattern. The core is indistinct in
most arches.
Plain Arch
The Plain Arch is the simplest of all fingerprint patterns and is formed by ridges entering from
one side of the print and exiting on the opposite side. These ridges tend to rise in the center of
the pattern, forming a wave-like pattern.
Tented Arch
The Tented Arch is similar to the Plain Arch except that instead of rising smoothly at the center,
there is a sharp upthrust or spike, or the ridges meet at an angle of less than 90 degrees.
Fig: Tented Arch (T)
LOOPS:
The loop pattern is the most common fingerprint pattern found in the population: approximately 60%-70% of fingerprints are loops. Loops are patterns in which the ridges enter on one side of the finger, make a U-turn around a core, and exit out the same side of the finger. If you think of the loop as a physical structure, you can imagine that water poured into the core will flow out only at one side of the print. A loop must also have at least one intervening, looping ridge between the delta and the core. This looping ridge is known as a recurve, which is another word for a ridge that makes a U-turn.
Radial Loop
Radial loops are loops that are slanted toward the radius, the inner bone of the forearm. Ridges
flow in the direction of the thumb.
Ulnar Loop
Ulnar loops are loops that are slanted toward the ulna, the outer bone of the forearm. The ulna
is the bone associated with the elbow. These fingerprints flow toward the little finger of the
hand.
Fig: A plain whorl. A line from delta-to-delta cuts through several recurving ridges
Composite
Composite patterns are subdivided into four distinct groups:
1. Central Pocket Loop
2. Lateral Pocket Loop
3. Twinned Loop
4. Accidental
3. Twinned Loop
There are two distinct loops, one resting upon or encircling the other, and the ridges containing the point of core have their exits towards different deltas.
4. Accidental
Accidental patterns are very rare and unique, occurring with a frequency of only one to three percent. A fingerprint is classed as accidental when it does not conform to any other definition, pattern or category type.
Fingerprint classification is the process of organizing large volumes of fingerprint cards into
smaller groups based on fingerprint patterns, ridge counts, and whorl tracings. When an
individual is arrested, it is important to search the files for a duplicate of that fingerprint record
to verify a recidivist’s (repeat offender’s) identity. In the mid-1800s, prior to the advent of
fingerprint records, individuals were photographed for rogues’ galleries, which were
collections of mug shots. Bertillonage was another classification system based on
anthropometric measurements, but it fell out of favour by the turn of the twentieth century.
Many individuals give false names when arrested, or change their appearance, so the fingerprint
record becomes the only reliable verification of their identity.
Henry Classification
Sir Edward Henry, Azizul Haque, and Chandra Bose developed the Henry Classification
System in 1897. The Henry system became the most widely used classification system in
English-speaking countries. Juan Vucetich also developed a classification system used in
Spanish-speaking countries. Prior to the advent of both the Vucetich and Henry systems,
Bertillon, Purkinje, Galton, and Faulds also worked on fingerprint classifications systems.
Classification systems have been modified and applied in countries such as Hungary, Portugal,
Prague, Germany, Japan, Spain, Holland, Italy, Russia, Mexico, Egypt, Norway, Cuba, Chile,
and France. Most of these systems involve analyzing the pattern types of the fingers and
assigning alphanumeric designations to each finger. In both the Henry and Vucetich systems,
the resulting classification resembles a fraction, with a numerator above a classification line
and a denominator below the classification line. There may be several sets of letters (both upper
and lower case) and numbers both above and below the classification line. There are six
components, or parts, to the Henry Classification System: the primary, secondary, sub
secondary, major, final, and key. This text will focus on examples of primary classification.
Primary classification assigns numerical value to only the whorl patterns present in the
fingerprint record. It is written as a fraction, but unlike a fraction, it is never reduced. One
number will appear in the numerator, and one number will appear in the denominator. The
fraction line is known as the classification line. Each finger is numbered from 1 to 10, starting
with the right thumb as finger number one, proceeding through the right index, right middle,
right ring, and right little fingers. The left thumb is finger number six, followed by the left
index, left middle, left ring, and left little fingers. The fingerprint card, also known as a tenprint
card, is numbered 1–10. The fingers are each assigned a point value if a whorl is found on that
finger. The point values decrease by half as you proceed through the remaining eight fingers.
For example, if there is a whorl located on the number one finger (the right thumb), it is
assigned a value of 16. If there is a whorl located on the number eight finger (the left middle
finger), it is assigned a value of two. The numerator is the sum of the point values for the even
numbered fingers plus one. The denominator is the sum of the point values for the odd
numbered fingers plus one.
Table: Finger Numbers and Point Values of Whorls in the Henry Classification System
Finger          Finger Number   Point Value of a Whorl
Right Thumb     1               16
Right Index     2               16
Right Middle    3               8
Right Ring      4               8
Right Little    5               4
Left Thumb      6               4
Left Index      7               2
Left Middle     8               2
Left Ring       9               1
Left Little     10              1
Fig: The point values assigned to each finger using the Henry Classification System
The number one is added to both the top and bottom values in order to avoid a fraction that
reads 0/0. Therefore, if there are no whorls present in any of the 10 fingers, the primary
classification is 1/1 rather than 0/0. If every finger on the tenprint card is a whorl pattern, the
primary classification is 32/32. There are, in fact, 1024 possible variants of the primary
classification component of the formula.
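The arithmetic of the primary classification can be captured in a few lines of Python (a minimal sketch for illustration; the function name and input format are assumptions, not part of any standard):

    # Henry primary classification: whorl point values per finger number
    # (1 = right thumb ... 10 = left little finger).
    WHORL_VALUES = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4,
                    6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

    def henry_primary(whorl_fingers):
        """Return the primary classification 'numerator/denominator' as a string.

        whorl_fingers: iterable of finger numbers (1-10) bearing a whorl pattern.
        """
        numerator = 1 + sum(WHORL_VALUES[f] for f in whorl_fingers if f % 2 == 0)
        denominator = 1 + sum(WHORL_VALUES[f] for f in whorl_fingers if f % 2 == 1)
        return f"{numerator}/{denominator}"

    print(henry_primary([]))            # no whorls -> 1/1
    print(henry_primary(range(1, 11)))  # whorls on all ten fingers -> 32/32
    print(henry_primary([1, 8]))        # whorls on right thumb and left middle -> 3/17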
NCIC Classification
The National Crime Information Center (NCIC), a division of the FBI’s Criminal Justice
Information Services (CJIS), is a national repository of computerized criminal justice
information. It was created in 1965 during the J. Edgar Hoover era and has since been expanded
and upgraded to its current incarnation, NCIC 2000, which launched in 1999. NCIC is
accessible by criminal justice agencies in all 50 states, Washington, DC, Puerto Rico, Guam,
the US Virgin Islands, and Canada. It includes 21 databases that include such information as
criminal records, missing persons, the sex offender registry, stolen property records, fugitive
records, orders of protection, and suspected terrorist activity. As of 2011, the FBI reported 11.7
million active records. The goal of NCIC is to provide an investigative tool not only to identify
property but also to protect law enforcement personnel and the public from individuals who
may be dangerous. NCIC also includes fingerprint classification information. It is important to
understand NCIC classification in order to decipher these codes if you work in any capacity
within the criminal justice system. The NCIC fingerprint classification system applies a 2-letter
code to each pattern type. The 2-letter codes for each of the 10 fingers are combined to form a
20-character classification. Each fingerprint’s code is listed in sequence, from the number 1
finger (right thumb) to the number 10 finger (left little finger). Ridge counts of loops and whorl
tracings are also included in the coding system to further classify pattern types. If an individual
has plain whorls with a meet tracing on fingers 2–10 and a double loop whorl with an outer
tracing on finger number one, the NCIC classification would read as follows:
dOPMPMPMPM
PMPMPMPMPM
Examples of NCIC pattern codes:
Pattern type                    NCIC code
Plain arch                      AA
Tented arch                     TT
Plain whorl, inner tracing      PI
Plain whorl, outer tracing      PO
The NCIC classification of an individual with plain whorls (outer tracing) on his or her thumbs
and plain arches on his or her remaining fingers would be
POAAAAAAAA
POAAAAAAAA
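Assembling the 20-character classification from per-finger codes is mechanical, as the following Python sketch shows (illustrative only; the mapping covers just the codes listed above, and real NCIC coding also records ridge counts and whorl tracings not modelled here):

    NCIC_CODES = {
        "plain arch": "AA",
        "tented arch": "TT",
        "plain whorl, inner tracing": "PI",
        "plain whorl, outer tracing": "PO",
    }

    def ncic_classification(patterns):
        """patterns: list of 10 pattern names, ordered finger 1 (right thumb)
        through finger 10 (left little). Returns the two 10-character halves."""
        if len(patterns) != 10:
            raise ValueError("exactly ten finger patterns are required")
        codes = [NCIC_CODES[p] for p in patterns]
        return "".join(codes[:5]), "".join(codes[5:])

    # Plain whorls (outer tracing) on both thumbs, plain arches on the other fingers:
    pats = (["plain whorl, outer tracing"] + ["plain arch"] * 4) * 2
    top, bottom = ncic_classification(pats)
    print(top)     # POAAAAAAAA  (fingers 1-5)
    print(bottom)  # POAAAAAAAA  (fingers 6-10)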
NCIC is unique in that it does not include the actual fingerprint images. It only includes the
classification described previously. A working knowledge of NCIC fingerprint classification
codes can give you a wealth of information regarding the fingerprint patterns of the individual
queried in an NCIC search, including a tentative identification of a person of interest, such as
a suspect or a missing person.
Fig: The three fingerprint scanners used in FVC2006 and an image collected through
each of them
live-scan scanner. The most important part of a fingerprint scanner is the sensor (or sensing
element), which is the component where the fingerprint image is formed. Almost all the
existing sensors belong to one of the three families: optical, solid-state, and ultrasound.
Optical sensors: Frustrated Total Internal Reflection (FTIR) is the oldest and most used live
scan acquisition technique. The finger touches the top side of a glass prism, but while the ridges
enter in contact with the prism surface, the valleys remain at a certain distance; the left side of
the prism is illuminated through a diffused light. The light entering the prism is reflected at the
valleys, and absorbed at the ridges. The lack of reflection allows the ridges to be discriminated
from the valleys. The light rays exit from the right side of the prism and are focused through a
lens onto a CCD or CMOS image sensor.
Solid-state sensors: Solid-state sensors (also known as silicon sensors) became commercially
available in the middle 1990s. All silicon-based sensors consist of an array of pixels, each pixel
being a tiny sensor itself. The user directly touches the surface of the silicon: neither optical
components nor external CCD/CMOS image sensors are needed. Four main effects have been
proposed to convert the physical information into electrical signals: capacitive, thermal, electric
field, and piezoelectric.
New sensing techniques such as multispectral imaging and 3D touch-less acquisition are being
developed to overcome some of the drawbacks of the current fingerprint scanners including: i)
the difficulty in working with wet or dry fingers, ii) the skin distortion caused by the pressure
of the finger against the scanner surface, and iii) the inability to detect fake fingers.
The quality of a fingerprint scanner, the size of its sensing area and the resolution can heavily
influence the performance of a fingerprint recognition algorithm. To maximize compatibility
between digital fingerprint images and ensure good quality of the acquired fingerprint
impressions, the US Criminal Justice Information Services released a set of specifications that
regulate the quality and format of both fingerprint images and FBI compliant off-line/live-scan
scanners. Unfortunately, the above specifications are targeted to the forensic applications
(AFIS sector) and as of today no definitive specifications exist for the evaluation/certification
of commercial fingerprint scanners.
At the local level, other important features, called minutiae can be found in the fingerprint
patterns. Minutia refers to the various ways in which the ridges can be discontinuous. For
example, a ridge can abruptly come to an end (termination), or can divide into two ridges
(bifurcation). Although several types of minutiae can be considered, usually only a coarse
classification (into these two types) is adopted to deal with the practical difficulty in
automatically discerning the different types with high accuracy.
2.3.2 Segmentation
The segmentation task consists in separating the fingerprint area from the background. Because
fingerprint images are striated patterns, using a global or local thresholding technique does not
allow the fingerprint area to be effectively isolated. Robust segmentation techniques are discussed in the specialized literature.
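As a rough illustration of the idea (and only a baseline, not one of the robust techniques referred to above), the sketch below marks a block as fingerprint area when its gray-level variance is high: blocks containing the striated ridge/valley pattern vary strongly, while smooth background blocks do not. The function name, block size and threshold are illustrative assumptions.

```python
import numpy as np

def segment_fingerprint(img, block=16, var_thresh=100.0):
    """Crude foreground/background segmentation of a gray-scale fingerprint.

    img: 2D NumPy array of gray levels. Blocks whose gray-level variance
    exceeds var_thresh are marked as fingerprint area (True); smooth,
    low-variance background blocks stay False.
    """
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = img[y:y + block, x:x + block].astype(float)
            if patch.var() > var_thresh:      # textured block -> ridge/valley area
                mask[y:y + block, x:x + block] = True
    return mask
```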
Fig: a) A fingerprint gray-scale image; b) the image obtained after enhancement and
binarization; c) the image obtained after thinning; d) termination and bifurcation
minutiae detected through the pixel-wise computation of the crossing number
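The pixel-wise crossing number mentioned in the caption can be sketched as follows: on a thinned (one-pixel-wide) binary ridge map, the crossing number of a ridge pixel is half the sum of absolute differences between consecutive values in its 8-neighbourhood; a value of 1 indicates a ridge ending and 3 a bifurcation. This is a minimal sketch only; the variable names and the absence of any minutiae filtering are assumptions.

```python
import numpy as np

def crossing_number_minutiae(skel):
    """Detect minutiae on a thinned binary ridge map (ridge pixels = 1)."""
    terminations, bifurcations = [], []
    rows, cols = skel.shape
    # 8-neighbour offsets in circular order around the centre pixel
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            if skel[y, x] != 1:
                continue
            p = [int(skel[y + dy, x + dx]) for dy, dx in offs]
            cn = sum(abs(p[i] - p[(i + 1) % 8]) for i in range(8)) // 2
            if cn == 1:
                terminations.append((x, y))       # ridge ending
            elif cn == 3:
                bifurcations.append((x, y))       # ridge bifurcation
    return terminations, bifurcations
```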
Some authors have proposed minutiae extraction approaches that work directly on the
grayscale images without binarization and thinning. This choice is motivated by the following
considerations: i) a significant amount of information may be lost during the binarization
process; ii) thinning may introduce a large number of spurious minutiae; iii) most of the
binarization techniques do not provide satisfactory results when applied to low-quality images.
Maio and Maltoni proposed a direct Gray-scale minutiae extraction technique, whose basic
idea is to track the ridge lines in the Gray-scale image, by `sailing' according to the local
orientation of the ridge pattern. A post-processing stage (called minutiae filtering) is often
useful in removing the spurious minutiae detected in highly corrupted regions or introduced by
previous processing steps (e.g., thinning) [16].
2.4 Matching
Matching high quality fingerprints with small intra-subject variations is not difficult and every
reasonable algorithm can do it with high accuracy. The real challenge is matching samples of
poor quality affected by: i) large displacement and/or rotation; ii) non-linear distortion; iii) different pressure and skin condition; iv) feature extraction errors. The two pairs of images in the figure below illustrate these difficulties. It is also evident that fingerprint images from different fingers may sometimes appear quite similar (small inter-subject variations).
Fig: a) Each row shows a pair of impressions of the same finger, taken from the
FVC2002 DB1, which were falsely non-matched by most of the algorithms submitted to
FVC2002 [32]; b) each row shows a pair of impressions of different fingers, taken from
the FVC2002 databases which were falsely matched by some of the algorithms
submitted to FVC2002
The large number of existing approaches to fingerprint matching can be coarsely classified into
three families: i) correlation-based matching, ii) minutiae-based matching, and iii) ridge
feature-based matching. In the rest of this section, the representation of the fingerprint acquired
during enrolment is denoted as the template (T) and the representation of the fingerprint to be
matched is denoted as the input (I). In case no feature extraction is performed, the fingerprint
representation coincides with the grayscale fingerprint image itself.
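To make the minutiae-based family more concrete, the sketch below scores two minutiae sets that are assumed to be already coarsely aligned: it simply counts template minutiae that find a query minutia within a distance and angle tolerance. A real matcher must first recover displacement and rotation and cope with non-linear distortion, so the tolerances and the greedy pairing here are purely illustrative assumptions.

```python
import numpy as np

def minutiae_match_score(template, query, dist_tol=12.0, angle_tol=np.deg2rad(20)):
    """Very simplified minutiae comparison (sketch only).

    template, query: lists of (x, y, theta) minutiae assumed to be coarsely
    pre-aligned. Returns the fraction of template minutiae that find a mate.
    """
    matched, used = 0, set()
    for (xt, yt, tt) in template:
        for j, (xq, yq, tq) in enumerate(query):
            if j in used:
                continue
            d = np.hypot(xt - xq, yt - yq)
            dtheta = abs((tt - tq + np.pi) % (2 * np.pi) - np.pi)  # wrapped angle difference
            if d <= dist_tol and dtheta <= angle_tol:
                matched += 1
                used.add(j)
                break
    return matched / max(len(template), 1)
```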
7.2 Livescan
In order for the AFIS to perform its duties, the fingerprints must first be captured and converted
to a language the computer can read. If an inked tenprint card is available, that card can be
scanned into the AFIS computer on a flatbed scanner. A livescan device can also be used to
capture the necessary fingerprint images digitally. The livescan is similar to a document
scanner in that it creates a digitized replica of the fingerprint. The fingerprint will appear on
the screen in black and white and will appear similar to a tenprint card, with black ridges on a
white background. The fingerprints are taken using the same process as for inked tenprint cards.
Each finger is rolled, starting with the number one finger, from nail to nail. Then the flats and
thumbs are recorded.
Fig: A fingerprint scanned at a livescan terminal appears on the screen of the computer
monitor
Many law enforcement agencies have livescan terminals. These may simply consist of a
scanner for recording fingerprint images, a computer monitor to view the images, and a
keyboard for recording demographic and criminal information. There are many vendors who
build and maintain different types of terminals. While they look different, most of these
terminals function similarly.
When the citizens of San Francisco were asked whether they wanted a fingerprint computer,
they approved the ballot proposition with an 80% majority. An AFIS system was installed in
1983 and a new unit, called Crime Scene Investigations, was formed. Crime scene investigators
were trained in the use of the system and searched their own cases when they recovered
fingerprints from crime scenes. Over the next 2 years, the “San Francisco experiment” resulted
in a dramatic increase in fingerprint identifications. Ten times as many fingerprints found at
crime scenes were identified to suspects over 2 years. The burglary rate decreased by 26% over
the next 4 years. The experiment was a great success that resulted in the proliferation of AFIS
systems in law enforcement agencies nationwide and around the world. In 1992, IAFIS
imported 32 million fingerprint cards into its database. In 1997, the United Kingdom installed
its own national AFIS database. Thirty years after the advent of AFIS, most law enforcement
agencies use the system routinely, and few fingerprint examiners are trained to classify prints
using the Henry system.
7.4 Tenprint Searches
One of the main functions of AFIS is to store, classify, and search tenprint records from arrests.
When an individual is arrested, his fingerprints are taken either in ink on a tenprint card or by
a livescan. The suspect’s palm prints and mug shot may also be taken at this time. Demographic
and other identifying information are recorded. Descriptions of other individualizing marks
such as scars, marks, and tattoos may also be recorded and/or photographed. The fingerprints
will then be classified, or coded, by the computer and searched in the database. If the computer
locates a candidate file with matching fingerprints, a tenprint examiner will confirm the match.
A tenprint examiner is an individual whose job is to confirm tenprint matches by the AFIS
computer and manage the database.
One of the most important aspects of this function is the coding of the fingerprint by the
computer. The fingerprint is coded when the computer picks out the minutiae in the fingerprint.
This is known as feature extraction. The AFIS computer is programmed with an algorithm that
recognizes minutiae. It can also recognize the direction of ridge flow, the distance between
ridges, and how many ridges are between minutiae. The computer searches through the
database and returns candidates that have similar features in similar positions. There can be
100 or more minutiae in a rolled fingerprint. In a latent print, however, there are far fewer
minutiae since it is deposited unintentionally on a variety of substrates. A latent print may come from a fingertip, a delta area, a palm print fragment, or an entire hand.
Fig: This latent lift was scanned into AFIS and identified to a suspect. The identification
is shown with (bottom) and without (top) minutiae selected
The computer will return a list of candidates. A candidate is an individual whose exemplar
fingerprint is recognized by the computer as having a similar arrangement of minutiae as the
latent fingerprint. The latent print examiner then compares the latent print to the known prints
of each candidate. If a match is found, it is recorded, and another latent print examiner verifies
the match. It is important to know that the latent print examiner, not the computer, decides
whether an AFIS record matches an unknown fingerprint from a crime scene.
If the latent fingerprint does not match any records in the AFIS database, the latent print
examiner can then choose to search it against the UL database. This database stores all
unidentified latent fingerprints entered into the database. While this may not result in
identification to a suspect, it may lead to information linking several crime scenes. If the latent
print does not match another unidentified latent print, it can be saved, along with the
accompanying case data, into the UL database for future searches. Every new tenprint card
entered into AFIS is searched against the UL database. If the computer finds a match, the case
will be forwarded to a latent print examiner who will verify the identification. Approximately
10%–15% of all cases and 2%–3% of latent print searches result in a fingerprint match.
UNIT
3
3.1 Historical Background of Ear Biometrics
The ear is an important component of the face. As early as the 14th century B.C., artists noted that its morphology and relative position on the head remain more or less constant throughout life. Although various studies concerning the anatomy and growth of the external ear have been carried out by plastic surgeons and forensic scientists in different parts of the world, these efforts have not been matched by comparable anthropological attention to the external ear. A brief review of the work already done by various scientists all over the world on the external ear is given below; some of the recent studies on various aspects of sex, bilateral and ethnic determination are discussed as follows:
Schwalbe (1875) described a total of 19 external ear landmarks, covering general variations such as ear inclination and protrusion as well as ear measurements such as the height and width of the ear. In studying various populations he recorded these features along with general information about each subject.
Iannarelli (1989) examined over 10,000 ears and found no two that were indistinguishable. He created the 12-measurement “Iannarelli System”, which used the right ear of each person and relied on specially aligned and normalized photographs. To normalize the pictures, they were enlarged until they fit a predefined easel; the measurements were then taken directly from the photographs. The distance between each of the numbered areas was measured and assigned an integer distance value. The identification record consists of the 12 measurements together with information about sex and race.
Carreira-Perpinan (1995) proposed outer ear (pinna) images for human recognition, based on
their discriminant capacity (only outperformed by fingerprints) and their small area and
variability (compared to face images). Compression neural networks were used to reduce the
dimensionality of the images and produce a feature vector of manageable size. These networks
are 2-layer linear perceptrons, trained in a self-supervised way (he used back propagation and
quick prop). A theoretical justification of the training process was given in terms of principal
components analysis. The approach was compared with standard numerical techniques for singular value decomposition. A simple rejection rule for recognition, based on the reconstruction error, was proposed, and additional comments were given about the robustness of the network to image transformations.
Burge and Burger (1998) introduced a class of biometrics based upon ear features, which were
used in the development of passive identification systems. The viability of the proposed biometric
was shown both theoretically in terms of the uniqueness and measurability over time of the ear, and
in practice through the implementation of a computer vision based system. Each subject's ear was
modeled as an adjacency graph built from the Voronoi diagram of its curve segments. He introduced
a novel graph matching based algorithm for authentication which takes into account the erroneous
curve segments which can occur due to changes (e.g., lighting, shadowing, and occlusion) in the
ear image.
Moreno et. al. (1999) described three neural net approaches for the recognition of 2D intensity
images of the ear. Their testing used a gallery of 28 persons plus another 20 persons not in the
gallery. They were found a recognition rate of 93% for the best of the three approaches method.
Hoogstrate et. al. (2000) discussed the ability to identify persons by ear from surveillance video. They presented the results of a test constructed to investigate whether the participants could individualize suspects by ear in a closed set situation. In general the possibility to identify a person by smaller body parts, in particular his/her ear, from surveillance videotape might become a useful tool as the availability of images from surveillance cameras was rapidly increasing. It was shown that the quality of the video images determines to a large extent the ability to identify a person.
An ideal biometric characteristic should be universal, distinctive, permanent and collectable. However, in practice a characteristic that satisfies all these requirements may not be suitable for a biometric system. In biometric systems there are further requirements, e.g. performance, acceptability and circumvention. Performance means the system's accuracy and speed: if the system is too slow or makes too many mistakes, it will not be used. Acceptability is important: if people do not accept the system as a part of their daily routines, it will not be used. Circumvention refers to how easy it is to fool the system; that rate should be very low.
Brucker et. al. (2003) explored anatomic and aesthetic differences in the ear between men and
women, as well as changes in ear morphology with age. A total of 123 volunteers were randomly
selected for this study. The cohort consisted of 89 women ages 19 to 65 years (median age, 42
years) and 34 men ages 18 to 61 years (median age, 35 years). The average total ear height across
the entire cohort for both left and right ears was 6.30 cm, average lobular height was 1.88 cm, and
average lobular width was 1.96 cm. Based on head size, significant sex-related differences were
noted in the distance from the lateral palpebral commissure to both the helical root and insertion of
the lobule. Measured distances in both vectors were approximately 4.6 percent longer in men than
in women. Similarly, the height of the pinna was significantly larger in men than in women by
approximately 6.5 percent. The average height and width of the lobule, however, were nearly
identical in men and women. Analysis of age-related data showed a significant difference in the
total ear height between the subpopulations; however, this difference was not significant after the
lobular height was subtracted from total ear height, suggesting that the lobule was the only ear
structure that changed significantly with age. In addition, lobular width decreased significantly with
age. This study established normative data for ear morphology and clearly demonstrated the sex- and age-related differences in ear dimensions described above.
A related study compared ear and face recognition. Although earlier experiments with ear and face recognition using the standard Principal Component Analysis approach had shown lower recognition performance for ear images, the authors later reported results of similar experiments on larger data sets that were more rigorously controlled for the relative quality of face and ear images. They found that recognition performance was not significantly different between the face and the ear; for example, 69.3% versus 72.7%, respectively, in one experiment. They also found that multi-modal recognition using both the ear and face resulted in a statistically significant improvement over either individual biometric; for example, 90.9% in the analogous experiment.
Kearney (2003) examined the external ear for identification. Both left and right ears of 153 subjects (55 male, 98 female) were photographed and the ear variations were documented, using an earlier description of the different parts of the ear, known as ear landmarks, to categorize ear types. Comparisons were made between the left and right ears, and between males and females, according to age. It was found that sexual dimorphism exists mostly in ear size rather than form, and that the ear really does change in size with age. The categories of ear types used were not precise enough to successfully distinguish all 301 ears included in this study from one another; however, no two ears were found to be exactly alike.
Chen and Bhanu (2004) explained that ear detection was an important part of an ear recognition
system. They introduced a simple and effective method to detect ears, which had two stages: offline
model template building and on-line detection. The model template was represented by an averaged
histogram of shape index. The on-line detection was a four-step process: step edge detection and
thresholding, image dilation, connected component labeling and template matching. Experimental results with real ear images were presented to demonstrate the effectiveness of their approach.
Hurley et. al. (2004) defined a feature space to reduce the dimensionality of the original pattern space, whilst maintaining discriminatory power for classification. To meet this objective in the context of ear biometrics, a new force field transformation was proposed that treats the image as an array of mutually attracting particles acting as the source of a Gaussian force field. Underlying the force field there
was a scalar potential energy field, which in the case of an ear took the form of a smooth surface
that resembles a small mountain with a number of peaks joined by ridges. The peaks correspond
to potential energy wells and to extend the analogy the ridges correspond to potential energy
channels. Since the transform also turns out to be invertible, and since the surface was otherwise
smooth, information theory suggested that much of the information was transferred to these
features, thus confirming their efficacy. They had previously described how field line feature extraction, using an algorithm similar to gradient descent, exploits the directional properties of the force field to automatically locate these channels and wells, which then form the basis of characteristic ear features. They showed how an analysis of the mechanism of this algorithmic approach led to a
closed analytical description based on the divergence of force direction, which revealed that
channels and wells were really manifestations of the same phenomenon. Further it showed that this
new operator, with its own distinct advantages, had a striking similarity to the Marr-Hildreth
operator, but with the important difference that it was non-linear. As well as addressing faster
implementation and brightness sensitivity, the technique was also validated by performing
recognition on a database of ears selected from the XM2VTS face database, and by comparing the
results with the more established technique of Principal Components Analysis. That confirmed not
only that ears do indeed appear to have potential as a biometric, but also that the new approach was
well suited to their description, being robust especially in the presence of noise, and having the
advantage that the ear does not need to be explicitly extracted from the background.
Alvarez et. al. (2005), in order to recognize a subject's ear, aimed to extract a characteristic vector from a human ear image that may subsequently be used to identify or confirm the identity of the owner. Towards that end, a new technique, combining geodesic active contours and a new ovoid model, had been developed, which can be used to locate and approximate the contour of the ear in an image.
Chen et. al. (2005) explained that ear recognition approaches generally do not give theoretical performance predictions. (a) They proposed an approach to recognize human ears in 3D: comparing local surface descriptors between a test and a model image, an initial correspondence of local surface patches was established and then filtered using simple geometric constraints, and the performance of the proposed ear recognition approach was then evaluated. (b) A binomial model was also presented to predict the ear recognition performance.
A further study argued that biometric methods are more efficient, more natural and easier for users than traditional methods of human identification. In fact, only biometric methods truly identify humans, not the keys and cards they possess or the passwords they should remember. The future of biometrics will surely lead to systems based on image analysis, as the data acquisition is very simple and requires only cameras, scanners or sensors. More importantly, such methods can be passive, which means that the user does not have to take an active part in the whole process or, in fact, would not even know that the process of identification is taking place. There are many possible data sources for human identification systems, but passive systems can hardly rely on measurements of human behaviour. The most interesting human anatomical parts for such passive, physiological biometric systems based on images acquired from cameras were the face and the ear. Both contain a large volume of unique features that allow many users to be distinctively identified, and they will surely be implemented in efficient biometric systems for many applications. The authors introduced ear biometrics, presented its advantages over face biometrics in passive human identification systems, and then described a geometrical method of feature extraction from human ear images to support such recognition.
Dewi and Yahagi (2006) proposed ear photo recognition using scale invariant key
points. The key points were extracted by performing Scale Invariant Feature Transform
(SIFT). In their experiments, SIFT generates approximately 16 key points for each ear
image. After they extracted the key points, they classified the owner of an ear by
calculating the number of key point matches and the average of the closest squared distances. They compared their results with ear photo recognition using PCA
and ear photo recognition using force field feature extraction. Their experimental
results showed that ear recognition using SIFT gives the best recognition result.
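For readers who want to experiment with this idea, the sketch below shows how a SIFT key-point comparison between two ear photographs could look using OpenCV. It is not the authors' original implementation; the file paths, the Lowe ratio of 0.75 and the use of a brute-force matcher are assumptions.

```python
import cv2

def sift_ear_similarity(path_a, path_b, ratio=0.75):
    """Count 'good' SIFT matches between two ear images after Lowe's ratio test."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = 0
    for pair in matcher.knnMatch(des_a, des_b, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good += 1                      # unambiguous match kept
    return good

# Usage sketch: a higher count suggests the two photos show the same ear.
# score = sift_ear_similarity("ear_probe.png", "ear_gallery.png")
```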
Jeges and Mate (2006) introduced a model based scheme for ear feature extraction, the implementation of which had proved that the method was strong enough to be applied in practice.
Ali et. al. (2007) described ear as a new comer in biometric recognition techniques.
Various methods have been employed for ear recognition to improve the performance
and making the results comparable with other existing methods. In continuation to
these efforts, a new ear recognition method was proposed. Ear images were cropped
manually from the side head images. After that wavelet transform was used for feature
extraction and matching was carried out using Euclidean distance. The results achieved were comparable with other existing methods.
The ear also offers advantages over face and fingerprint, which are the two most common biometrics in both academic research and industrial applications: an ear can be imaged in 3D, and surface shape information related to its anatomical structure can be obtained. This made it possible to build complete recognition systems based on 3D ear shape.
Chen and Bhanu (2007) proposed a complete human recognition system using 3D
ear biometrics. The system consists of 3D ear detection, 3D ear identification, and 3D
ear verification. For ear detection, they proposed a new approach which used a single
reference 3D ear shape model and locates the ear helix and the antihelix parts in
registered 2D color and 3D range images. For ear identification and verification using
range images, two new representations were proposed. Those included the ear
helix/antihelix representation obtained from the detection algorithm and a local surface patch representation computed at feature points. The 2D histogram showed the frequency of occurrence of shape index values versus
the angles between the normal of reference feature point and that of its neighbors. Both
shape representations were used to estimate the initial rigid transformation between a
gallery-probe pair. This transformation was applied to selected locations of ears in the
gallery set and a modified Iterative Closest Point (ICP) algorithm was used to
iteratively refine the transformation to bring the gallery ear and probe ear into the best
alignment in the sense of the least root mean square error. The experimental results on
the UCR data set of 155 subjects with 902 images under pose variations and the
University of Notre Dame data set of 302 subjects with time-lapse gallery-probe pairs were presented.
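The ICP refinement step mentioned above can be illustrated with a minimal rigid ICP loop over 3D point sets. This is only a sketch under the assumption that a reasonable initial transformation is already available; it omits the outlier handling and the modifications used in the actual system.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(probe, gallery, iters=30):
    """Minimal rigid Iterative Closest Point sketch for (N,3)/(M,3) point sets.

    Returns the aligned probe points and the final RMS nearest-neighbour error.
    """
    tree = cKDTree(gallery)
    src = probe.copy()
    for _ in range(iters):
        _, idx = tree.query(src)                 # closest gallery point per probe point
        dst = gallery[idx]
        mu_s, mu_d = src.mean(0), dst.mean(0)
        H = (src - mu_s).T @ (dst - mu_d)        # Kabsch: cross-covariance of centred sets
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        src = src @ R.T + t                      # apply the refined rigid transform
    rms = np.sqrt(((src - gallery[tree.query(src)[1]]) ** 2).sum(1).mean())
    return src, rms
```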
Bustard and Nixon (2008) described a new technique which improves the robustness of ear recognition to background clutter and occlusion. By treating the ear as a planar surface and creating a registration from matched feature points, ears can be registered accurately. The feature matches reduce the gallery size and enable a precise ranking using a simple 2D distance algorithm. When applied to the XM2VTS database it gave results comparable to PCA with manual registration, and the approach remained robust to background clutter, viewing angles up to ±13 degrees and over 20% occlusion.
Another approach detected the ear in profile images using topographic labels. The proposed algorithm contains four stages. First, topographic labels are extracted from the ear image. Then, using the map of regions for three topographic labels, namely ridge, convex hill and convex saddle hill, a composed set of labels is built. Thresholding on this labeled image provides a connected component with the maximum number of pixels, which represents the outer boundary of the ear. The method was evaluated on the "USTB" database, which contains 308 profile view images of the ear and its surrounding backgrounds.
Abaza and Ross (2010) analyzed the symmetry of human ears in order to understand
portions of the ear that may be occluded in a surveillance video. Ear symmetry was
assessed geometrically using symmetry operators and Iannarelli's measurements,
where the contribution of individual ear regions to the overall symmetry of the ear was
studied. Next, to assess the ear symmetry (or asymmetry) from a biometric recognition
system perspective, several experiments were conducted on the WVU Ear Database.
The experiments suggested the existence of some degree of symmetry in the human ears that can perhaps be systematically exploited in the design of commercial ear recognition systems. At the same time, the degree of asymmetry observed may be used
in designing effective fusion schemes that combine the face information with the two
ears.
Prakash and Gupta (2012) proposed an efficient technique for automatic localization
of ear from side face images. The technique was rotation, scale and shape invariant and
made use of the connected components in a graph obtained from the edge map of the
side face image. It was evaluated on an IIT Kanpur database consisting of 2672 side faces with variable sizes, rotations and shapes, and on a University of Notre Dame database containing 2244 side faces with variable background and poor illumination.
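As a loose illustration of working with connected components of an edge map (not Prakash and Gupta's actual algorithm), the sketch below finds the largest connected component of a dilated Canny edge map in a profile image; in many side-face images the densely curved ear edges form such a component. The thresholds and the single-component heuristic are assumptions.

```python
import cv2
import numpy as np

def largest_edge_component(side_face_gray):
    """Return the bounding box (x, y, w, h) of the largest edge component.

    side_face_gray: 8-bit gray-scale profile image. Purely illustrative;
    a real localizer would add shape, scale and rotation checks.
    """
    edges = cv2.Canny(side_face_gray, 50, 150)
    edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))   # join nearby edge fragments
    n, labels, stats, _ = cv2.connectedComponentsWithStats(edges)
    areas = stats[1:, cv2.CC_STAT_AREA]                    # stats[0] is the background
    k = 1 + int(np.argmax(areas))
    return (stats[k, cv2.CC_STAT_LEFT], stats[k, cv2.CC_STAT_TOP],
            stats[k, cv2.CC_STAT_WIDTH], stats[k, cv2.CC_STAT_HEIGHT])
```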
The human ear is the organ of hearing and equilibrium. It detects and analyzes sound by the
mechanism of transduction, which is the process of converting sound waves into
electrochemical impulses. Audition cannot take place adequately if the anatomy is abnormal. This section discusses the mechanisms involved in the conduction of sound waves into the ear, and their integration and transmission from the middle ear and inner ear to the brain.
The human ear is a rudimentary shell-like structure that lies on the lateral aspect of the head.
The ear is a cartilaginous structure. For physiological study purposes, it subdivides into three
fundamental substructures: the external ear, the middle ear, and the inner ear.
The outer ear, also called auricle, is composed of cartilage and it is the part with the
most contact with the external world. It has various anatomical demarcations like the helix,
the antihelix, the tragus, and the antitragus and these demarcations lead to a depression called
acoustic meatus. This meatus has a tube form and extends inward to end in the tympanic
membrane. The outer third of this canal is cartilaginous and the inner two-thirds are bony; the cartilaginous portion is lined with glands that produce cerumen to keep the canal clear of insects and other objects. At the end of the outer ear lies the middle ear, which is limited
externally by the tympanic membrane and internally by the oval window.
The middle ear is an air-filled space. It divides into an upper and a lower chamber,
the epitympanic chamber (attic) and the tympanic chamber (atrium), respectively. It is like a
room because it has a rectangular-like shape. It has anatomical relations with the jugular vein,
the carotid artery, the inner ear, the eustachian tube, and the mastoid. The content of this room
consists of the ossicles, namely the malleus, the incus and the stapes. These bony structures are suspended by ligaments which make them suitable for the transmission of vibrations into the inner ear. The vibrations that reach this part of the middle ear then get transmitted, by the action
of the stapes, into the inner ear.
The inner ear is a space composed of the bony labyrinth and the membranous
labyrinth, one inside the other. The bony labyrinth has a cavity filled with semicircular canals
that are in charge of sensing equilibrium; this cavity is called the vestibule and is the place
where the vestibular part of the VIII cranial nerve forms. The cochlea is the organ of hearing.
It takes its name from the Greek word for the shell of a snail and is the part from
where the cochlear part of the VIII cranial nerve forms, thus constituting the vestibulocochlear
nerve.
Hearing is the process by which sound vibrations from the external environment are transformed into action potentials. Vibrating objects, like the strings of a guitar, produce sounds, and these vibrations send pressure pulses through the air molecules, better known as sound waves. The ear is thus equipped to distinguish different characteristics of sound, such as pitch and loudness, which refer to the frequency of sound waves and the perceived intensity of sound, respectively. Frequency is measured in hertz (Hz, cycles per second). The human ear is most sensitive to frequencies from 1000 to 4000 hertz, and a young ear can hear frequencies in the range between 20 and 20,000 hertz. The intensity of sound is measured in decibels (dB); the range
of human hearing on a decibel scale is from 0 to 130 dB (where the sound becomes painful).
All these physical properties have to incur transformations to get into the central nervous
system. The first transformation consists of the conversion of air vibrations into tympanic
membrane vibrations. These vibrations then get transmitted into the middle ear and the
ossicles. Then these vibrations transform into liquid vibrations in the inner ear and the cochlea,
and these stimulate a region called the basilar membrane and the organ of Corti. Finally, these
vibrations get transformed into nerve impulses, which travel to the nervous system.
The external ear is positioned approximately halfway between the top of the head and the chin on the vertical axis, and between the eye and the nose of the face on the horizontal axis. The external ear varies from person to person, from the smallest details to the greatest degree, in the size, shape, design and anatomical distances between its various features, such as the helix rim, antihelix, tragus, antitragus, triangular fossa, crus of helix, concha, incisura intertragica and lobule, as well as the overall size (length and width) of the ear. These features are present in all people and provide the variation by which one ear can be set apart from the others.
SHAPES OF EXTERNAL EAR: The long-held history of the use of ear shapes suggests its
use for automatic human identification. There are four types of shapes of external ear.
a) Oval
b) Triangular
c) Rectangular
d) Round
Fig. Ear shapes: (a) Oval (b) Triangular, (c) Rectangular (d) Round
The anatomy of the external ear depicting the individual components can be seen in following
figure:
Helix Rim (1a-1d): The helix rim is the outer frame of the auricle; it is a rolled-up edge.
Lobule (2): The fleshy lower portion of the ear.
Antihelix (3): The elevated ridge of cartilage between the concha and the scapha is called the antihelix. It is a folded "Y"-shaped part of the ear.
Concha (4): The hollow bowl-like portion of the outer ear next to the canal. An enlarged concha forces the outer ear away from the scalp.
Tragus (5): The small projection just in front of the ear canal is called the tragus.
Antitragus (6): The lower cartilaginous edge of the conchal bowl just above the fleshy lobule of the ear.
Crus of helix (7): A landmark of the outer ear.
Fig. Anatomy of the external ear: (1) Helix Rim, (2) Lobule, (3) Antihelix, (4) Concha, (5)
Tragus, (6) Antitragus, (7) Crus of Helix, (8) Triangular Fossa and (9) Incisure Intertragica
At present, three methods are in use to collect ear samples for identification:
1. PHOTO COMPARISON
Taking photo of the ear is the most commonly used method in research. The most interesting
parts of the ear are the external ear and ear lobe, but the whole ear structure and shape can be
used. The photo can be taken and then compared with previously taken photos to identify an individual. Alfred Iannarelli carried out two large-scale ear identification studies, reported in 1989. In the first study, there were over 10,000 ears drawn from randomly selected samples in California. The second study examined identical and non-identical twins. These cases support the hypothesis of the ear's uniqueness: even the identical twins had similar, but not identical,
ear physiological features.
Fig. (a) Anatomy of the ear: 1 Helix Rim, 2 Lobule, 3 Antihelix, 4 Concha, 5 Tragus, 6 Antitragus, 7 Crus of Helix, 8 Triangular Fossa, 9 Incisure Intertragica. (b) The locations of the anthropometric measurements used in the “Iannarelli System”.
2. EARMARKS (EARPRINTS)
Ear identification can be done from photographs or from video. Another possibility is that the ear is pressed against some material, e.g. glass, and the resulting 'earmark' is used for biometric study. Ear prints have been used in forensic investigation since the mid-1960s.
Currently, it is estimated that in the Netherlands alone, ear mark evidence could be used in
approximately 50,000 cases of burglary per year.
An earprint is a two-dimensional reproduction of parts of the ear. Anatomical features frequently found in a print are the helix, antihelix, tragus and antitragus, and the transfer of unique features onto a surface, e.g. an ear fold, wrinkle, spot or mole, can be used for identification. Ear prints can be lifted from crime scenes in much the same way as fingerprints, and comparative analysis may then be performed using the crime scene mark and a control print
(from a suspect). The print can be lifted, photographed, analyzed and compared against the
earprints of several suspects. Based on the unique formation of the print and the ear features
transferred, a suspect may be positively identified.
In case the ear is partially occluded by hair, the hair can be masked out of the image by
using thermogram pictures. In the thermogram pictures different colors and textures can be
used to find different parts of the ear. The ear is quite easy to detect and localize in thermogram imagery by searching for high-temperature areas.
Without a doubt, the use of Ear biometrics in private industry as well as government is real
and is here to stay. The security industry, law enforcement agencies, and other government
agencies where security is vital are constantly developing new ways of using biometrics to
help identify and monitor criminals and terrorists, and to secure access to sensitive data. In
addition to its current uses, it will not be long until biometrics finds its way into many
commercial applications such as e-commerce, automobiles, and cell phones. Many
companies have already started examining how biometrics and their applications will aid
their business and are planning to implement it sometime in the future.
In the immediate future thumb, hand and ear prints will remain the primary way in which
people are identified through biometrics. This is because many government agencies such
as the FBI, DEA, and INS already have large scale finger print databases. Even though it
is primarily government agencies that currently use finger prints to identify people, the
private industry will be next to use finger print technology to identify consumers because
of its low cost to maintain and its advanced development.
This type of system would greatly enhance the effectiveness of our nation's police departments by freeing them from having to patrol as often and from having to search for criminals. This type of application of biometrics is still some way off, however, because the technology for it is still developing and many people still have major privacy concerns, even though the cost of implementing this technology is low compared with other techniques.
Identity is important when it is weak. This apparent paradox is the core of the current study.
Traditionally, verification of identity has been based upon authentication of attributed and
biographical characteristics. After small scale societies and large scale industrial societies,
globalization represents the third period of personal identification. The human body lies at
the heart of all strategies for identity management. Therefore, to resolve this issue the application of BIOMETRICS is one of the most promising solutions. In particular,
biometrics requires criteria for identifying individuals in different contexts, under different
descriptions and at different times. In the course of modern history, personal identities have
been based on highly diverse notions such as religion, rank, class, estate, gender, ethnicity,
race, nationality, politics, virtue, honor, erudition, rationality, and civility. Many security,
investigation and health care organizations are in the process of deploying biometric security architectures. Secure identification is critical in the security and health care systems, both to control logical access to centralized archives of digitized individuals' data, and to limit
physical access to buildings and hospital wards, and to authenticate medical and social
support personnel. There is also an increasing need to identify criminals with a high degree
of certainty. All these issues require a careful ethical and political scrutiny. The general
problem of identity and the specific problem of personal identity do not admit easy
solutions. Yet we need some criteria to establish identities, either in the sense of qualitative
identities or in the sense of numerical identities. These criteria are chosen categories.
Consequently, in the present research work, using Digital Image Processing, the researcher aims to apply Ear Biometrics techniques to differentiate between the Yadav and Brahmin communities of the Bundelkhand region on the basis of sex, bilateral and ethnic differences.
UNIT
4
DNA Forensics
In the past few years, the general public has become more familiar with the power of DNA
typing as the media has covered efforts to identify remains from victims of the World Trade
Center Twin Towers collapse following the terrorist attacks of 11 September 2001, the O.J.
Simpson murder trial in 1994 and 1995, the parentage testing of Anna-Nicole Smith’s daughter
in 2007, and the ongoing Innocence Project that has led to the exoneration of over 200
wrongfully convicted individuals. News stories featuring the value of forensic DNA analysis
in solving crime seem commonplace today. In another popular application, DNA testing with
Y chromosome markers is now used routinely to aid genealogical investigations. In addition,
the medical community is poised to benefit from the tremendous amount of genomic DNA
sequence information being generated. DNA testing has an important role in our society that is
likely to grow in significance and scope in the future.
Though high-profile cases have certainly attracted widespread media attention in recent years,
they are only a small fraction of the thousands of forensic DNA and paternity cases that are
conducted each year by public and private laboratories around the world. The technology for
performing these tests has evolved rapidly over the past two decades to the point where it is
now possible to obtain results in a few hours on samples with only the smallest amount of
biological material. DNA typing, since it was introduced in the mid-1980s, has revolutionized
forensic science and the ability of law enforcement to match perpetrators with crime scenes.
Thousands of cases have been closed with guilty suspects punished and innocent ones freed
because of the power of a silent biological witness at the crime scene. Deoxyribonucleic acid
(DNA) is probably the most reliable biometrics. It is in fact a one-dimensional code unique for
each person. Since the mid-1990s, computer databases containing DNA profiles from crime
scene samples, convicted offenders, and in some cases, persons simply arrested for a crime,
have provided law enforcement with the ability to link offenders to their crimes. Application
of this technology has enabled tens of thousands of crimes, particularly horrible serial crimes by repeat offenders, to be solved around the world.
‘DNA fingerprinting’, or DNA typing (profiling) as it is now known, was first described in
1985 by an English geneticist named Alec Jeffreys. Dr. Jeffreys found that certain regions of
DNA contained DNA sequences that were repeated over and over again next to each other. He
also discovered that the number of repeated sections present in a sample could differ from
individual to individual. By developing a technique to examine the length variation of these
DNA repeat sequences, Dr. Jeffreys created the ability to perform human identity tests. These
DNA repeat regions became known as VNTRs, which stands for variable number of tandem
repeats. The technique used by Dr. Jeffreys to examine the VNTRs was called restriction
fragment length polymorphism (RFLP) because it involved the use of a restriction enzyme to
cut the regions of DNA surrounding the VNTRs. This RFLP method was first used to help in
an English immigration case and shortly thereafter to solve a double homicide case. Since that
time, human identity testing using DNA typing methods has been widespread. The past two-
and-a half decades have seen tremendous growth in the use of DNA evidence in crime scene
investigations as well as paternity and genetic genealogy testing.
1986: DNA testing goes public with Cellmark and Lifecodes in the United States
1988: FBI begins DNA casework with single-locus RFLP probes
1989: TWGDAM established; NY v. Castro case raises issues over quality assurance of laboratories
1990: Population statistics used with RFLP methods are questioned; PCR methods start with DQA1
1991: Fluorescent STR markers first described; Chelex extraction
1994: Congress authorizes money for upgrading state forensic labs; ‘DNA wars’ declared over; FBI starts casework with PCR-PM
1995: O.J. Simpson saga makes public more aware of DNA; DNA Advisory Board set up; UK DNA Database established; FBI starts using D1S80/amelogenin
1996: NRC II Report; FBI starts mtDNA testing; first multiplex STR kits become available
1997: Thirteen core STR loci defined; Y-chromosome STRs described
1998: FBI launches national Combined DNA Index System; Thomas Jefferson and Bill Clinton implicated with DNA
1999: Multiplex STR kits are validated in numerous labs; FBI stops testing DQA1/PM/D1S80
2000: FBI and other labs stop running RFLP cases and convert to multiplex STRs; PowerPlex 16 kit enables first single amplification of CODIS STRs
2001: Identifiler STR kit released with 5-dye chemistry; first Y-STR kit becomes available
2002: FBI mtDNA population database released; Y-STR 20plex published
The basic unit of life is the cell, which is a miniature factory producing the raw materials,
energy, and waste removal capabilities necessary to sustain life. Thousands of different
proteins are required to keep these cellular factories operational. An average human being is
composed of approximately 100 trillion cells, all of which originated from a single cell (the
zygote) formed through the union of a father’s sperm and a mother’s egg. Each cell contains
the same genetic programming. Within the nucleus of our cells is a chemical substance known
as DNA that contains the informational code for replicating the cell and constructing the needed
proteins. Because the DNA resides in the nucleus of the cell, it is often referred to as nuclear
DNA. Some minor extranuclear DNA, known as mitochondrial DNA, exists in human
mitochondria, which are the cellular powerhouses. Deoxyribonucleic acid, or DNA, is
sometimes referred to as our genetic blueprint because it stores the information necessary for
passing down genetic attributes to future generations. Residing in every nucleated cell of our
bodies (note that red blood cells lack nuclei), DNA provides a ‘computer program’ that
determines our physical features and many other attributes. The complete set of instructions
for making an organism, that is, the entire DNA in a cell, is referred to collectively as its
genome.
DNA molecules store information in much the same way that text on a page conveys
information through the order of letters, words, and paragraphs. Information in DNA is stored
based on the order of nucleotides, genes, and chromosomes. DNA has two primary purposes:
(1) to make copies of itself so cells can divide and carry the same information; and (2) to carry
instructions on how to make proteins so cells can build and maintain the machinery of life.
Information encoded within the DNA structure itself is passed on from generation to generation
with one-half of a person’s DNA information coming from his or her mother and one-half
coming from his or her father.
Nucleic acids including DNA are composed of nucleotide units that are made up of three parts:
a nucleobase, a sugar, and a phosphate. The nucleobase or ‘base’ imparts the variation in each
nucleotide unit, while the phosphate and sugar portions form the backbone structure of the
DNA molecule. The DNA alphabet is composed of only four characters representing the four
nucleobases: A (adenine), T (thymine), C (cytosine), and G (guanine). The various
combinations of these four letters, known as nucleotides or bases, yield the diverse biological
differences among human beings and all living creatures. Humans have approximately 3 billion
nucleotide positions in their genomic DNA. Thus, with four possibilities (A, T, C, or G) at each
position, literally zillions of combinations are possible. The informational content of DNA is
encoded in the order (sequence) of the bases just as computers store binary information in a
string of ones and zeros.
Directionality is provided when listing a DNA sequence by designating the ‘five-prime’ (5’)
end and the ‘three-prime’ (3’) end. This numbering scheme comes from the chemical structure
of DNA and refers to the position of carbon atoms in the sugar ring of the DNA backbone
structure. A sequence is normally written (and read) from 5’ to 3’ unless otherwise stated. DNA
polymerases, the enzymes that copy DNA, only ‘write’ DNA sequence information from 5’ to
3’, much like the words and sentences in this book are read from left to right.
In its natural state in the cell, DNA is actually composed of two strands that are linked together
through a process known as hybridization. Individual nucleotides pair up with their
‘complementary base’ through hydrogen bonds that form between the bases. The base-pairing
rules are such that adenine can only hybridize to thymine and cytosine can only hybridize to
guanine. There are two hydrogen bonds between the adenine – thymine base pair and three
hydrogen bonds between the guanine – cytosine base pair. Thus, GC base pairs are stuck
together a little stronger than AT base pairs. The two DNA strands form a twisted ladder shape
or double helix due to this ‘base-pairing’ phenomenon.
The two strands of DNA are ‘anti-parallel’; that is, one strand is in the 5’ to 3’ orientation and
the other strand lines up in the 3’ to 5’ direction relative to the first strand. By knowing the
sequence of one DNA strand, its complementary sequence can easily be determined based on
the base-pairing rules of A with T and G with C. These combinations are sometimes referred
to as Watson – Crick base pairs for James Watson and Francis Crick who discovered this
structural relationship in 1953.
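These base-pairing rules make it trivial to derive one strand from the other. A minimal sketch in Python (the function name is only illustrative):

```python
def reverse_complement(seq):
    """Return the complementary strand read 5' to 3',
    using the Watson-Crick pairing rules A-T and G-C."""
    pair = {'A': 'T', 'T': 'A', 'G': 'C', 'C': 'G'}
    return ''.join(pair[base] for base in reversed(seq.upper()))

# Example: reverse_complement("GATTACA") returns "TGTAATC"
```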
Hybridization of the two strands is a fundamental property of DNA. However, the hydrogen
bonds holding the two strands of DNA together through base pairing may be broken by elevated
temperature or by chemical treatment, a process known as denaturation. A common method
for denaturing double stranded DNA is to heat it to near boiling temperatures. The DNA double
helix can also be denatured by placing it in a salt solution of low ionic strength or by exposing
it to chemical denaturants such as urea or formamide, which destabilize DNA by forming
hydrogen bonds with the nucleotides and preventing their association with a complementary
DNA strand. Denaturation is a reversible process. If a double-stranded piece of DNA is heated
up, it will separate into its two single strands. As the DNA sample cools, the single DNA strands
will find their complementary sequence and rehybridize or anneal to each other. The process
of the two complementary DNA strands coming back together is referred to as renaturation or
reannealing.
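The extra stability of G-C pairs shows up in simple melting-temperature rules of thumb. The sketch below uses the Wallace rule, which is only a rough approximation valid for short oligonucleotides (roughly 14 to 20 bases); the function name is an assumption.

```python
def wallace_tm(seq):
    """Rough melting-temperature estimate (Wallace rule) for a short oligo:
    Tm ~ 2*(A+T) + 4*(G+C) degrees C. It reflects the point made above:
    G-C pairs (3 hydrogen bonds) add more thermal stability than
    A-T pairs (2 hydrogen bonds)."""
    s = seq.upper()
    at = s.count('A') + s.count('T')
    gc = s.count('G') + s.count('C')
    return 2 * at + 4 * gc

# Example: wallace_tm("AGCTAGCTAGCTAGCT") returns 48
```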
The DNA in the nucleus of a human cell is divided among 23 pairs of chromosomes. Males are designated XY because they contain a single copy of the X
chromosome and a single copy of the Y chromosome, while females contain two copies of the
X chromosome and are designated XX. Most human identity testing is performed using
markers on the autosomal chromosomes, and gender determination is done with markers on
the sex chromosomes. The Y chromosome and mitochondrial DNA, a small, multi-copy
genome located in cell’s mitochondria, can also be used in human identification applications.
Chromosomes in all body (somatic) cells are in a diploid state; they contain two sets of each
chromosome. On the other hand, gametes (sperm or egg) are in a haploid state; they have only
a single set of chromosomes. When an egg cell and a sperm cell combine during conception,
the resulting zygote becomes diploid. Thus, one chromosome in each chromosomal pair is
derived from each parent at the time of conception.
Mitosis is the process of nuclear division in somatic cells that produces daughter cells, which
are genetically identical to each other and to the parent cell. Meiosis is the process of cell
division in sex cells or gametes. In meiosis, two consecutive cell divisions result in four rather
than two daughter cells, each with a haploid set of chromosomes.
The DNA material in chromosomes is composed of ‘coding’ and ‘noncoding’ regions. The
coding regions are known as genes and contain the information necessary for a cell to make
proteins. A gene usually ranges from a few thousand to tens of thousands of base pairs in size.
One of the big surprises to come out of the Human Genome Project is that humans have fewer
than 30,000 protein-coding genes rather than the 50,000 to 100,000 previously thought. Genes
consist of exons (protein-coding portions) and introns (the intervening sequences). Genes only
make up ~ 5% of human genomic DNA. Non-protein- coding regions of DNA make up the rest
of our chromosomal material. Because these regions are not related directly to making proteins,
they have been referred to as ‘junk’ DNA although recent research suggests that they may have
other essential functions. Markers used for human identity testing are found in the noncoding
regions either between genes or within genes (i.e., introns) and thus do not code for expressed genetic traits. Polymorphic (variable) markers that differ among individuals can be found
throughout the noncoding regions of the human genome. The chromosomal position or location
of a gene or a DNA marker in a noncoding region is commonly referred to as a locus (plural:
loci). Thousands of loci have been characterized and mapped to particular regions of human
chromosomes through the worldwide efforts of the Human Genome Project. Pairs of
chromosomes are described as homologous because they are the same size and contain the
same genetic structure. A copy of each gene resides at the same position (locus) on each
chromosome of the homologous pair.
One chromosome in each pair is inherited from an individual’s mother and the other from his
or her father. The DNA sequence for each chromosome in the homologous pair may or may
not be identical since mutations may have occurred over time. The alternative possibilities for
a gene or genetic locus are termed alleles. If the two alleles at a genetic locus on homologous
chromosomes are different, they are termed heterozygous; if the alleles are identical at a
particular locus, they are termed homozygous. Detectable differences in alleles at
corresponding loci are essential to human identity testing.
A genotype is a characterization of the alleles present at a genetic locus. If there are two alleles
at a locus, A and a, then there are three possible genotypes: AA, Aa, and aa. The AA and aa
genotypes are homozygous, whereas the Aa genotype is heterozygous. A DNA profile is the
combination of genotypes obtained for multiple loci. DNA typing or DNA profiling is the
process of determining the genotype present at specific locations along the DNA molecule.
Multiple loci are typically examined in human identity testing to reduce the possibility of a
random match between unrelated individuals.
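The effect of combining loci can be illustrated with the product rule used in forensic statistics: under Hardy-Weinberg and linkage-equilibrium assumptions, per-locus genotype frequencies (p squared for homozygotes, 2pq for heterozygotes) are multiplied across loci. The allele frequencies in this sketch are made up purely for illustration.

```python
def genotype_frequency(p, q=None):
    """Expected genotype frequency under Hardy-Weinberg assumptions:
    p**2 for a homozygote, 2*p*q for a heterozygote."""
    return p * p if q is None else 2 * p * q

def profile_frequency(locus_genotype_freqs):
    """Product rule: multiply per-locus genotype frequencies to estimate
    the frequency of the whole multi-locus DNA profile."""
    result = 1.0
    for f in locus_genotype_freqs:
        result *= f
    return result

# Hypothetical example with made-up allele frequencies at three loci:
loci = [genotype_frequency(0.1, 0.2),    # heterozygote, alleles at 10% and 20%
        genotype_frequency(0.05),        # homozygote, allele at 5%
        genotype_frequency(0.3, 0.15)]   # heterozygote, alleles at 30% and 15%
print(profile_frequency(loci))           # about 9e-6, i.e. roughly 1 in 110,000
```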
To help understand these concepts better, consider a simple analogy. Suppose you are in a room
with a group of people and are conducting a study of twins. You instruct each member of the
group to line up matched with his or her twin (homologue). You notice that there are 22 sets of
identical twins (autosomes) and one fraternal set consisting of one boy and one girl (sex
chromosomes). You have the twin pairs rearrange their line by average height from tallest pair
of twins to the shortest with the fraternal twins at the end and number the pairs 1 through 23
beginning with the tallest. Now choose a location on one twin, say the right forearm (locus).
Compare that to the right forearm of the other twin. What is different (allele)? Perhaps one has
a mole, freckles, more hair. There are several possibilities that could make them different
(heterozygous) or perhaps they both look exactly the same (homozygous).
Fig: Two primary forms of variation exist in DNA: (a) sequence polymorphisms; (b)
length polymorphisms. The short tandem repeat DNA markers discussed in this book are
length polymorphisms
Large amounts of genetic variability exist in the human population. This is evidenced by the
fact that, with the exception of identical twins, we all appear different from each other. Hair
color, eye color, height, and shape all represent alleles in our genetic makeup. To gain a better
appreciation for how the numbers of alleles present at a particular locus impact the variability,
consider the ABO blood group. Three alleles are possible: A, B, and O. These three alleles can
be combined to form three possible homozygous genotypes (AA, BB, and OO) and three
heterozygous genotypes (AO, BO, and AB). Thus, with three alleles there are six possible
genotypes. However, because AA and AO are phenotypically equal as are BB and BO, there
are only four phenotypically expressed blood types: A, B, AB, and O.
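The counting argument above can be reproduced in a few lines: the sketch enumerates the six genotypes obtainable from three alleles and collapses them into the four expressed blood types (A and B dominant over O, co-dominant with each other). The helper names are illustrative only.

```python
from itertools import combinations_with_replacement

alleles = ['A', 'B', 'O']
# All unordered genotypes from three alleles: six in total
genotypes = [''.join(g) for g in combinations_with_replacement(alleles, 2)]

def abo_phenotype(genotype):
    """Map an ABO genotype to its expressed blood type."""
    g = set(genotype)
    if g == {'A', 'B'}:
        return 'AB'
    if 'A' in g:
        return 'A'
    if 'B' in g:
        return 'B'
    return 'O'

print(genotypes)                                        # ['AA', 'AB', 'AO', 'BB', 'BO', 'OO']
print(sorted({abo_phenotype(g) for g in genotypes}))    # ['A', 'AB', 'B', 'O']
```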
Fig: Schematic representation of two different STR loci on different pairs of
homologous chromosomes
Fig: (a) A three-generation family pedigree with results from a single genetic locus (STR
marker FGA). Squares represent males and circles females. (b) A Punnett square
showing the possible allele combinations for offspring of individuals #1 and #2 in the
pedigree. Individual #3 is 22,23.2 and inherited the 22 allele from his father and the 23.2
allele from his mother. (c) A Punnett square for one of the families in the second
generation showing possible allele combinations for offspring of individuals #4 and #7
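A Punnett square like the ones in the figure is simply the cross product of the two parental allele pairs at a locus. In the sketch below the parental allele values are hypothetical and chosen only to resemble STR allele names.

```python
from itertools import product

def punnett(father, mother):
    """Enumerate the possible offspring allele combinations (Punnett square)
    for one locus, given each parent's two alleles."""
    return [tuple(sorted(pair)) for pair in product(father, mother)]

# Hypothetical parents at an STR locus (allele names only illustrative):
print(punnett(('22', '24'), ('21', '23.2')))
# -> [('21', '22'), ('22', '23.2'), ('21', '24'), ('23.2', '24')]
```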
4.3 Techniques of DNA
DNA is quite literally ‘the stuff of life’ as it contains all the information that makes us who we
are. Indeed, some biologists suggest that the sole reason for any organism’s existence is to
ensure that its DNA is replicated and survives into the next generation. It is found in the nucleus
of cells and also the mitochondria. Therefore, with the exception of certain specialized cells
that lack these organelles, such as mature red blood cells, every cell in the body contains DNA.
Furthermore, unless heteroplasmy occurs (see later), this DNA is identical in every cell, does
not change during a person’s lifetime and is unique (identical twins excepted) to that individual.
DNA is composed of four nucleotides (adenine, thymine, cytosine and guanine), and phosphate
and sugar molecules. Nuclear DNA takes the form of a ladder twisted into the shape of a double
helix in which the rails are composed of alternating sugar and phosphate molecules whilst the
nucleotides act as rungs joining the two rails together. Adenine is always joined to thymine and
cytosine is always joined to guanine. Within the nucleus, DNA is found in structures called
chromosomes. Human cells contain 23 pairs of chromosomes and these vary in shape and size.
Twenty-two of these pairs are referred to as the ‘autosomal chromosomes’ and these contain
the information that directs the development of the body (body shape, hair colour etc.). The
remaining pair of chromosomes are the X and Y ‘sex chromosomes’ that control the
development of the internal and external reproductive organs. Each chromosome contains a
strand of tightly coiled DNA. The DNA strand is divided into small units called genes and each
gene occupies a particular site on the strand called its ‘locus’ (plural ‘loci’). The total genetic
information within a cell is referred to as its ‘genome’. There are about 35 000 – 45 000 genes
and, on average, they each comprise about 3000 nucleotides although there is a great deal of
variation. These genes code for proteins that determine our hair and eye colour, the enzymes
that digest our food and every hereditable characteristic. Surprisingly, only a small proportion
of the genome actually codes for anything; between these coding regions lie long stretches
of repetitive non-coding DNA that exhibit a great deal of variability.
Each gene can exist in alternative forms, called 'alleles', and an individual carries one allele on
each chromosome of the pair. If DNA profiling detects only one allele, this is usually interpreted as a
consequence of a person inheriting the same allele from both parents. If three or more alleles
are detected then this is an indication that the sample contains DNA from more than one
individual. Mitochondrial DNA is arranged slightly differently to nuclear DNA and will be
dealt with separately.
4.3.1 DNA sampling
Because virtually all of our cells contain DNA and we lose cells all the time, for example
whenever we blow our nose, brush our teeth or comb our hair, it is possible to isolate DNA
from a wide variety of sources. Indeed, it is so easy to leave a trail of DNA that crime scene
investigators must wear masks, and disposable over - suits and over - shoes to avoid
contaminating the location. DNA contamination can also occur via mortuary instruments so it
is preferable that samples are taken from a body before it is moved. Similarly, all DNA samples
need to be kept apart from the moment they are collected to avoid the possibility of cross
contamination. For example, if samples from a victim and a suspect are transported in the same
container (even if they are in separate bags) or processed at the same time, there is a risk of
cross contamination. As the sensitivity of DNA analysis increases, particularly with regard to
techniques such as low copy number STR analysis the risks of contamination during the
collection, storage and processing increase and the results need to be interpreted with care (see
later). For example, DNA can be recovered from bed sheets even if a person slept on them for
only one night. This means that one could prove that a man and a woman shared a bed but not
that they shared it at the same time. The collection, transport and/or storage of liquid and tissue
samples is sometimes problematic but this can be overcome by using commercially available
FTA ® cards that are produced by Whatman International Ltd. These cards contain chemicals
that lyse any cells in the sample and immobilize and stabilize the DNA and RNA that is
released. The cards also contain chemicals that preserve the DNA and RNA and thereby allow
the samples to be stored at room temperature for long periods. The cards can be pressed against
liquid samples (e.g., saliva or semen) or the liquid can be dropped onto them. Tissue samples
or blood clots can be squashed onto the cards. When required for analysis the DNA or RNA
can be readily extracted. It should be noted that, according to the UK Human Tissue Act 2004,
it is illegal to take a sample of a person's DNA without their consent except under certain
conditions (e.g., to prevent or detect a crime or to facilitate a medical diagnosis). It is therefore
illegal for a man to surreptitiously test DNA samples from his offspring to determine whether
he really is the father.
Potential sources of human DNA for forensic analysis
Fig: Comparison of DNA typing technologies. Forensic DNA markers are plotted in
relationship to four quadrants defined by the power of discrimination for the genetic
system used and the speed at which the analysis for that marker may be performed
The best solution in the search for a high power of discrimination and a rapid analysis speed
has been achieved with short tandem repeat (STR) DNA markers. Because STRs by definition
are short, three or more can be analyzed at a time. Multiple STRs can be examined in the same
DNA test, or ‘multiplexed’. Multiplex STRs are valuable because they can produce highly
discriminating results and can successfully measure sample mixtures and biological materials
containing degraded DNA molecules. This method can significantly reduce the amount of
DNA required for analysis, thereby conserving more of the irreplaceable DNA collected from
forensic evidence for use by scientists from opposing counsel or for additional specialized
DNA testing. In addition, the detection of multiplex STRs is automated, which is an important
benefit as demand for DNA testing increases. Mitochondrial DNA (mtDNA), which is shown
in the quadrant with the lowest power of discrimination and longest sample processing time,
can be very helpful in forensic cases involving severely degraded DNA samples or when
associating maternally related individuals. There are instances when nuclear DNA is either so
degraded or present in such low amounts in forensic evidence samples (e.g., hair shafts or
putrefied bones or teeth) that it is either untestable or undetectable and mtDNA is the only
viable alternative forensic DNA technology that can produce interpretable data for forensic
comparisons. In other cases, such as the identification of skeletal remains, mtDNA is often the
preferred method for comparing DNA from the bone evidence to the known reference mtDNA
from potential family members to determine whether the mtDNA matches. In many situations,
multiple technologies may be used to help resolve an important case or identify victims of mass
fatalities, such as those from the World Trade Center collapse. When early methods for DNA
analysis are superseded by new technologies, there is usually some overlap as forensic
laboratories implement the new technology. Validation of the new methods is crucial to
maintaining high-quality results. The purpose of the information in this chapter is to briefly
review the historical methods mentioned above and to discuss the advantages and limitations
of each technology. By seeing how the field has progressed during the past few decades, we
can perhaps gain a greater understanding of where the field of forensic DNA typing is today
and where it may be headed in the future.
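To see why multiplexing several STR loci is so discriminating, the toy sketch below multiplies hypothetical genotype frequencies across independent loci to give a combined random-match probability; the locus names follow common STR nomenclature, but the allele pairs and frequencies are invented for illustration and are not real population data.
```python
# Hypothetical multiplex STR profile: locus -> (allele pair, genotype frequency).
# Locus names follow common STR nomenclature; the frequencies are invented.
profile = {
    "FGA":    ((22, 23.2), 0.03),
    "D18S51": ((14, 16),   0.05),
    "TH01":   ((6, 9.3),   0.08),
}

random_match_probability = 1.0
for locus, (alleles, genotype_frequency) in profile.items():
    # Assuming the loci are inherited independently, frequencies multiply
    random_match_probability *= genotype_frequency

print(f"combined random-match probability: {random_match_probability:.2e}")
# Adding further multiplexed loci shrinks this product rapidly, which is why
# multiplex STR profiles are so highly discriminating.
```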
Table: lists a few of the isozymes used during the 1970s and 1980s to perform forensic
protein profiling
By the early 1980s many labs began using isoelectric focusing (IEF) polyacrylamide gel
electrophoresis, which is capable of higher resolving power than protein electrophoresis
because it produces sharper bands. Combined with silver staining (the same detection technique
used a decade later with D1S80 and early STR systems; see below), IEF can detect fairly low
levels of proteins in forensic samples. Nevertheless, proteins are not as variable as DNA nor
are they as stable in forensic evidence.
In the mid-1990s, there were a number of different DNA testing methods under evaluation and
in active use by forensic DNA laboratories. In April 1995, the United Kingdom launched their
national DNA database with six STR markers using fluorescence detection — and thus led the
way to where the world is today in terms of standardized sets of STR markers. Within the
United States, RFLP with single-locus probe VNTRs was still performed by the FBI
Laboratory (and would be until 2000) as well as a number of other forensic labs. Many labs
had also adopted and validated the reverse dot blot DQA1/PolyMarker system. Still others were
using D1S80 and triplex STR assays with silver-stain detection methods. After a few years of
using silver-stain detection, most labs around the world converted to STR markers with
fluorescence detection.
Mitochondria are intracellular organelles that generate about 90% of the energy that cells need
to survive. The numbers of mitochondria found in a human cell depend upon its energy needs
and vary from zero in the mature red blood cell to over 1000 in a muscle cell. They are thought
to descend from bacteria that evolved a symbiotic relationship with pre - or early eukaryotic
cells many hundreds of millions of years ago. With time, the symbiotic relationship became
permanent but the legacy is reflected by present day mitochondria retaining their own bacterial
type ribosomes and their own DNA (referred to as mtDNA) that is distinct from that found in
the cell nucleus. Each mitochondrion contains between 2 and 10 copies of the mtDNA genome.
The inheritance of mtDNA also differs from that of nuclear DNA in that it is inherited
exclusively from the maternal side. This is because the sperm head is the only part of a
spermatozoon that enters the egg at the time of fertilization. Usually, the spermatozoon's tail
and the mid-piece (which is the only part containing mitochondria) shear off as the head enters
the egg's perivitelline space. Occasionally, a few mid-piece mitochondria are incorporated at fusion but
these are subsequently destroyed by the egg. Consequently, the only mtDNA present in the
developing embryo is that derived from the egg and therefore as usual (it is alleged) the
workforce is exclusively female.
Human mtDNA is a circular DNA molecule that contains 16 569 base pairs that code for 37
genes that in turn code for the synthesis of two ribosomal RNAs, 22 transfer RNAs and 13
proteins. Unlike nuclear DNA, the mitochondrial genome is extremely compact and about 93%
of the DNA represents coding sequences. The remaining, non-coding region is called the
control region or displacement loop (D-loop). The D-loop region consists of about 1100 base
pairs and it exhibits a higher mutation rate than the coding region and about 5–10 times the
rate of mutation within nuclear DNA. The mutations occur as substitutions in which one
nucleotide is replaced by another one: the length of the loop region is not changed. The
mutations result from mtDNA being exposed to high levels of mutagenic free oxygen radicals
that are generated during the mitochondrion’s energy generating oxidative phosphorylation
process. The substitutions persist because mtDNA lacks the DNA repair mechanisms that are
found in nuclear DNA. These mutations result in sequence differences between even closely
related individuals and make analysis of the D-loop region an effective means of
identification.
Because the mtDNA is inherited only from the mother, it also allows tracing of a direct genetic
line. Furthermore, unlike the inheritance of nuclear DNA, there are no complications owing to
recombination.
The D-loop is divided into two regions, each consisting of about 610 base pairs, known as the
hypervariable region 1 (HV1) and hypervariable region 2 (HV2). It is these two regions that
are normally examined in mtDNA analysis by PCR amplification using specific primers
designed to base pair to their ends. This is then followed by DNA sequence analysis. Because
of the high rate of substitutions, it is possible to analyse just these short regions and still
differentiate between closely related sequences. It has been estimated that mtDNA may vary
about 1 – 2.3% between unrelated individuals and although mtDNA sequencing does not have
the discriminating power of STR DNA profiling, it can prove effective where STR DNA
analysis fails.
The mtDNA sequence of all the mitochondria in any one individual is usually identical – this
condition is referred to as ‘homoplasmy’. However, in some people, differences in base
sequences are found at one or more locations. These differences arise from them containing
two or more genetically distinct types of mitochondria. This condition is known as
‘heteroplasmy’ and it can have a significant impact in forensic investigations. Heteroplasmy
used to be considered relatively rare but it is now believed to occur in 10 – 20% of the
population. To make matters worse, it is now apparent that heteroplasmy is not necessarily
expressed to the same extent in all the tissues of the body. For example, two hairs from a single
person might have different proportions of the base pairs contributing to the heteroplasmy and
this might result in an exclusion rather than a match. This is because heteroplasmy may arise
from the high mutation rate, from inheritance at the germ line, or during somatic cell mitosis
and mtDNA replication. In forensic science, mtDNA analysis is most
frequently used where the samples do not contain much nuclear DNA – for example, a
fingerprint or a hair shaft – or where the DNA has become degraded through the decomposition
process or burning. Because there are numerous mitochondria in a single cell and each
mitochondrion contains multiple copies of the mitochondrial genome, it is possible to extract
far more mtDNA than nuclear DNA. Epithelial cells, which are the commonest cell type used
in forensic casework, contain an average of 5000 molecules of mtDNA. Mitochondrial DNA
analysis does, however, suffer from a number of problems. For example, all maternally related
individuals are likely to have the same mtDNA sequences, so the discriminating powers are
limited compared with autosomal STR analysis. Heteroplasmy can be considered either a
problem or a useful trait depending on the circumstances. It can create problems because the
mixed sequence is also what would be expected if there were more than one individual
contributing to the DNA profile. A difference of only one base pair between the mtDNA profile
of the sample and the suspect is considered insufficient to prove either a match or exclusion
whilst a difference in two or more base pairs is grounds for exclusion. By contrast,
heteroplasmy can provide an identifying characteristic where the suspect expresses the same
heteroplasmy characteristics as the sample.
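The interpretation rule just described (one base difference is inconclusive, two or more support exclusion) can be written down directly; the short sketch below compares two aligned sequences and applies it. The toy sequences are invented placeholders, not real hypervariable-region data.
```python
def interpret_mtdna(sample: str, reference: str) -> str:
    """Apply the simple rule: 0 differences = cannot exclude,
    1 difference = inconclusive, 2 or more = exclusion."""
    if len(sample) != len(reference):
        raise ValueError("sequences must be aligned to the same length")
    mismatches = sum(1 for a, b in zip(sample, reference) if a != b)
    if mismatches == 0:
        return "cannot exclude (sequences match)"
    if mismatches == 1:
        return "inconclusive (single base difference)"
    return f"exclusion ({mismatches} base differences)"

# Toy stand-ins for an aligned hypervariable-region stretch
print(interpret_mtdna("ACCTGATCCA", "ACCTGATCCA"))  # cannot exclude
print(interpret_mtdna("ACCTGATCCA", "ACCTGTTCCA"))  # inconclusive
print(interpret_mtdna("ACCTGATCCA", "ACGTGTTCCA"))  # exclusion
```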
Other common problems associated with mtDNA analysis are that detecting differences in
sequences is more time consuming and costly than determining differences in lengths – as is
accomplished using STR analysis. In addition, the rarity of mtDNA sequences has to be
determined by empirical studies and the results are not as statistically reliable as those for other
types of analysis. Finally, owing to the high copy number per cell there is always a risk of
contamination and cross - contamination associated with mtDNA sequencing.
4.3.8 DNA databases
The National DNA Database (NDNAD) was established in April 1995 following a
recommendation from the Royal Commission on Criminal Justice in 1993. Scotland and
Northern Ireland have their own DNA databases but export the profiles of all persons they
arrest to the NDNAD. The NDNAD is governed by a combination of the Home Office, the
Association of Chief Police Officers and the Association of Police Authorities, with invited
representatives from the Human Genetics Commission. It was the first national DNA database
to be established in the world and in 2007 it contained over 4 million profiles. This represents
about 6% of the UK population although about 10 – 13% of the profiles are thought to be
replicates (e.g., through people using aliases). The FBI’s DNA database, CODIS, contains
numerically more DNA profiles but represents only about 0.5% of the US population. The
NDNAD is run by the Forensic Science Service (FSS) under contract from the Home Office
and the FSS is the main organization that loads profiles onto the database, undertakes profile
searches and matches and reports back to the police authorities.
Fig: Illustration of question being asked with (a) parentage testing and (b) reverse
parentage testing
5.1 Role of Iris Biometric in personal identification
Over the past 15 years, iris recognition has developed rapidly from its first live demonstration
and first actual method patent, to a mainstream field of biometric development with a large
number of active researchers in both academia and industry. To date, some 50 million persons
worldwide have been enrolled in iris recognition systems that use the author's algorithms. But
other systems are also being developed, demonstrated, and tested in Government-sponsored
competitions with good results; and there is no doubt that in the future there will exist a lively
equilibrium of diverse methods and viable products available for deployment, maybe even
interoperable.
The iris is the colored, donut-shaped portion of the eye that lies behind the cornea and
surrounds the pupil. A person's iris pattern is unique and remains unchanged throughout life.
Also, because it is covered by the cornea, the iris is well protected from damage, making it a
suitable body part for biometric authentication.
Because iris recognition is designed for use in identification mode ("one-to-many" exhaustive
search, at least with the author's algorithms) so that a user is not even asked to claim or assert
an identity, as opposed to simple verification (a "one-to-one" test of some claimed identity), the
number of iris comparisons done so far is staggering. In one particular deployment linking all
27 air, land, and sea-ports of entry into the United Arab Emirates, that compares the irises of
arriving travellers to all stored in a central database, some 5 trillion iris comparisons have been
performed since 2001. About 10 million arriving travellers have used that system, with 12
billion iris comparisons now being performed daily at a speed of about 1 million comparisons
per second per search engine. The UK has recently launched Project IRIS (Iris Recognition
Immigration System) which allows travellers to enter the UK from abroad without passport
presentation or any other assertion of identity, but just by looking at an iris camera at an
automatic gate. If they have been enrolled in the system and are recognized, then their border-
control formalities are finished and the gate opens. About 200,000 travellers to the UK in recent
months have benefitted from this convenience.
Notwithstanding such existing public deployments at many airports in several countries, basic
research into alternative methods continues. The anticipated large-scale applications of
biometric technologies such as iris recognition are driving innovations at all levels, ranging
from sensors, to user interfaces, to algorithms and decision theory. At the same time as these
good innovations, possibly even outpacing them, the demands on the technology are becoming
greater. In addition to the requirement to function on a national scale to detect any multiple
identities upon issuance of biometric ID cards, expectations are also being raised for
development of more tolerant and fluid user interfaces that aim to replace the "stop and stare"
camera interface with iris recognition "on the move, off-axis, and at a distance". Scientific and
engineering literature about iris recognition grows monthly, with contributions from several
dozen university and industrial laboratories around the world. Many databases of iris images
are available for download, further stimulating research.
An excellent and comprehensive review of the expanding literature about iris recognition, with
141 references, has recently appeared. I will not attempt to duplicate such a literature survey
here. Instead, I will briefly review the historical development of iris recognition, from its
inception as a speculative idea to early efforts at commercialization, and its current drivers; and
then I will present a number of new methods that I have developed and found beneficial, which
I will illustrate here with publicly available iris images.
1. Evidence emerging in tests that iris recognition seems the biometric with best performance,
in terms of large database accuracy and search speed.
2. Legislation in several countries for national programs involving biometric ID cards, or
biometrics replacing passports in automated border-crossing.
3. NIST Iris Challenge Evaluation ("large-scale") based on images from 240 Subjects; its
training database was downloaded by 42 research groups.
4. Biometric Data Interchange Format Standards, and databases of iris images for algorithm
development and testing.
5. Numerous international conferences and books that include the topic.
6. Popular futurism and movies, from James Bond to Minority Report.
7. Cultural iconography associated with the eye (the "Window to the Soul"; affective
significance of eye contact, and communication through gaze).
8. The intellectual pleasure of solving multi-disciplinary problems combining mathematics,
information theory, computer vision, statistics, biology, ergonomics, decision theory, and
naturally occurring human randomness.
Features of iris recognition
1. Highly accurate and fast, iris recognition offers top-class precision among the
different types of biometric authentication technologies.
2. Remains unchanged throughout life. (This does not constitute a guarantee.)
3. Since the iris is different between the left and right eye, recognition can be performed
separately by each eye.
4. Possible to distinguish twins.
5. As long as the eyes are exposed, iris recognition can be used even when the subject is
wearing a hat, mask, eyeglasses or gloves.
6. Because an infrared camera is used, recognition is possible even at night or in the
dark.
7. Without the need to touch the device, contactless authentication is possible, making it
hygienic to use.
1. First, the location of the pupil is detected, followed by detection of the iris and the
eyelids.
2. Unnecessary parts (noise), such as eyelids and eyelashes, are excluded to clip out only
the iris part, which is then divided into blocks and converted into feature values to quantify
the image.
3. Matching is then performed against feature data previously extracted using the same method
(a minimal matching sketch is given below).
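As an illustration of the matching step, the sketch below assumes the quantified feature values have been packed into a binary code with an occlusion mask, and compares two such codes with a normalized Hamming distance (the measure used in Daugman-style iris systems). The array sizes, the random test data and the 0.32 decision threshold are illustrative assumptions, not parameters of any particular product.
```python
import numpy as np

def normalized_hamming(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing bits, counted only over bits valid in both codes."""
    valid = mask_a & mask_b                      # bits occluded in neither image
    disagreements = (code_a ^ code_b) & valid
    return disagreements.sum() / max(int(valid.sum()), 1)

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048, dtype=np.uint8)    # stored binary template
probe = enrolled.copy()
probe[rng.choice(2048, 200, replace=False)] ^= 1       # ~10% of bits differ
mask = np.ones(2048, dtype=np.uint8)                   # no occlusion in this toy case

hd = normalized_hamming(enrolled, probe, mask, mask)
print(f"normalized Hamming distance = {hd:.3f}")
print("match" if hd < 0.32 else "non-match")           # 0.32: an often-cited threshold
```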
Iris recognition begins with finding an iris in an image, demarcating its inner and outer
boundaries at pupil and sclera, detecting the upper and lower eyelid boundaries if they occlude,
and detecting and excluding any superimposed eyelashes, or reflections from the cornea or
eyeglasses. These processes may collectively be called segmentation. Precision in assigning
the true inner and outer iris boundaries, even if they are partly invisible, is important because
the mapping of the iris in a dimensionless (size-invariant and pupil dilation invariant)
coordinate system is critically dependent on this. Inaccuracy in the detection, modelling, and
representation of these boundaries can cause different mappings of the iris pattern in its
extracted description, and such differences could cause failures to match. It is natural to start
by thinking of the iris as an annulus. Soon one discovers that the inner and outer boundaries
are usually not concentric. A simple solution is then to create a non-concentric pseudo-polar
coordinate system for mapping the iris, relaxing the assumption that the iris and pupil share a
common center, and requiring only that the pupil is fully contained within the iris. This
"doubly-dimensionless pseudo-polar coordinate system" was the basis of my original paper on
iris recognition and Patent, and this iris coordinate system was incorporated into ISO Standard
19794-6 for iris data. But soon one discovers also that often the pupil boundary is non-circular,
and usually the iris outer boundary is non-circular. Performance in iris recognition is
significantly improved by relaxing both of those assumptions, replacing them with more
disciplined methods for faithfully detecting and modelling those boundaries whatever their
shapes, and defining a more flexible and generalized coordinate system on their basis. Because
the iris outer boundary is often partly occluded by eyelids, and the iris inner boundary may be
partly occluded by reflections from illumination, and sometimes both boundaries also by
reflections from eyeglasses, it is necessary to fit flexible contours that can tolerate interruptions
and continue their trajectory under them on a principled basis, driven somehow by the data that
exists elsewhere. A further constraint is that both the inner and outer boundary models must
form closed curves. A final goal is that we would like to impose a constraint on smoothness,
based on the credibility of any evidence for non-smooth curvature.
An excellent way to achieve all of these goals is to describe the iris inner and outer boundaries
in terms of “Active Contours" based on discrete Fourier series expansions of the contour data.
By employing Fourier components whose frequencies are integer multiples of 1/(2π), closure,
orthogonality, and completeness are ensured. Selecting the number of frequency components
allows control over the degree of smoothness that is imposed, and over the fidelity of the
approximation. In essence, truncating the discrete Fourier series after a certain number of terms
amounts to low-pass filtering the boundary curvature data in the active contour model. In the
lower left-hand corner of the Figure are shown two "snakes", each consisting of a fuzzy ribbon-
like data distribution and a dotted curve which is a discrete Fourier series approximation to the
data, including continuation across gap interruptions. The lower snake in each snake box is the
curvature map for the pupil boundary, and the upper snake is the curvature map for the iris
outer boundary, with the endpoints joining up at the 6-o'clock position. The interruptions
correspond to detected occlusions by eyelids (indicated by separate splines in both images), or
by specular reflections. The data plotted as the grey level for each snake is the image gradient
in the radial direction. Thus, the relative thickness of each snake represents roughly the
sharpness of the corresponding radial edge. If an iris boundary were well-described as a circular
edge, then the corresponding snake in its box should be flat and straight. In general, this is not
the case.
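A small numerical sketch of this idea: sampled boundary radii (synthetic here) are low-pass filtered by keeping only the first few terms of a discrete Fourier series, which automatically yields a closed, smooth approximation. The harmonic count and the synthetic data are illustrative choices, not the author's implementation.
```python
import numpy as np

def fourier_boundary_fit(radii, n_harmonics=8):
    """Approximate a closed boundary r(theta) by a truncated discrete Fourier
    series; discarding high-frequency terms low-pass filters the data while
    guaranteeing a closed, smooth contour."""
    coeffs = np.fft.rfft(radii)
    coeffs[n_harmonics + 1:] = 0          # keep DC plus the first n_harmonics terms
    return np.fft.irfft(coeffs, n=len(radii))

# Synthetic, slightly non-circular boundary with measurement noise
theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
radii = (100 + 6 * np.cos(2 * theta) + 3 * np.sin(3 * theta)
         + np.random.default_rng(1).normal(0, 1.5, theta.size))

smooth = fourier_boundary_fit(radii, n_harmonics=8)
print(f"max deviation of smoothed fit from raw samples: {np.abs(smooth - radii).max():.2f}")
```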
Fig: Active contours enhance iris segmentation, because they allow for non-circular
boundaries and enable flexible coordinate systems. The box in the lower-left shows
curvature maps for the inner and outer iris boundaries, which would be flat and straight
if they were circles. Here the outer boundary (upper plot) is particularly non-circular.
Dotted curves in the box and on the iris are Fourier series approximations
4.4 Fourier-based Trigonometry and Correction for Off-Axis Gaze
A limitation of current iris recognition cameras is that they require an on-axis image of an eye,
usually achieved through what may be called a "stop and stare" interface in which a user must
align her optical axis with the camera's optical axis. This is not as flexible or fluid as it might
be. Moreover, sometimes the standard cameras acquire images for which the on-axis
assumption is not true. For example, the NIST iris images that were made available and used
for training in ICE-1 contained several with very deviated gaze, probably because the user's
gaze was distracted by an adjacent monitor. The on-axis requirement can be relaxed by
correcting the projective deformation of the iris when it is imaged off-axis, provided that one
can reliably estimate the actual parameters of gaze. The gaze parameters that we seek include
two spherical angles for eye pose, but the projective geometry depends also on the distance
between eye and camera, which may be unknown, and it depends on the surface curvature of the
iris, which is generally not zero. If simplifying assumptions and approximations are made about
the latter factors, then a simple affine projective transformation may suffice to make the iris
recognizable against itself as imaged in other poses, orthographic or not.
Fig: Active contours enhance iris segmentation, because they allow for non-circular
boundaries and enable flexible coordinate systems. The box in the lower-left shows
curvature maps for the inner and outer iris boundaries, which would be flat and
straight if they were circles. Here the pupil boundary (lower plot) is particularly non-
circular. Dotted curves in the box and on the iris are Fourier series approximations
Fig: Gaze estimation enables transformation of an eye image with deviated gaze, into
one apparently looking directly at the camera. Without this transformation, such
images would fail to be matched
The essence of the problem is then
estimating the two angles of gaze relative to the camera. Eye morphology is so variable in terms
of visible sclera and eyelid occlusion that it is unlikely that such factors could support robust
estimation, at least when only one eye is imaged; although it must be noted that humans are
very impressively skilled somehow at monitoring each other's gaze direction. In the absence of
solving that mystery, an obvious alternative approach would be to assume that an orthographic
image of the iris should reveal a circular pupil; therefore, detecting ellipticity of the pupil
indicates off-axis image acquisition, and so estimating the elongation and orientation of that
ellipse would yield the two parameters of gaze deviation, modulo π in direction. We present
here a somewhat more robust variant of this idea, which does not assume that the true pupil
shape is circular when viewed orthographically. This method of estimating gaze (and thus
correcting for off-axis imaging) uses a new approach that may be called "Fourier-based
trigonometry".
Fig: Statistical inference of eyelashes from the iris pixel histogram, and determination
of a threshold for excluding the lashes (labelled in white) from influencing the IrisCode
Application of Iris Biometrics
1. Immigration Control
In response to the increasing threats of terrorism around the world, iris biometrics
contributes to a safe and secure society by enhancing the stringency of immigration control.
Iris recognition offers improved security and smooth personal identification amidst the
increasing movement of people between countries.
The iris is photographed, and the image is matched with the government’s immigration
control database during exit or entry procedures at the passport control booth, enabling
rapid and stringent personal authentication.
2. National IDs
Iris recognition is used as one of the methods for acquiring biometric data needed for
issuing unique IDs.
Accurate and fast authentication is possible even without an ID card.
Combined use with mutually complementary biometric data, such as fingerprint and
face, enables rigorous personal authentication and a robust defence against impersonation.
3. Crime Investigation
2. Finance and Banking: With the rapid growth of ATM services and credit cards,
fraudulent withdrawal of money by using fake or stolen bankcards has become a serious
problem. Hand vascular pattern technology can be integrated into banking solutions by two
different methods. In the first method, vascular patterns of customers are stored in the bank's
database server. The authentication is carried out by comparing a customer's hand vascular
pattern with their enrolled pattern in the database server. In the second method, hand vascular
patterns of customers are stored in biometric ID cards which are kept by customers. During
authentication, the customer's hand vascular pattern is compared with the pattern stored in the
card for verification. Based on various requirements such as timely response or level of
security, banks will decide the appropriate method for their solutions.
3. Travel and Transportation: Since the 9/11 terrorist attack, national security problems
are of great concern in almost every country. Many security fences have been established in
order to avoid the infiltration of terrorists. Access to many sensitive areas such as airports, train
stations, and other public places are being closely monitored. Hand vascular pattern technology
has been chosen to provide a secure physical access control in many of these areas. Due to its
superior authentication performance, ease of use, and user satisfaction, the hand vascular
system was adopted by Incheon International Airport, the largest airport in Korea, and by
several major international airports in Japan for physical access control.
Fig: General applications of hand vascular pattern technology; (a) Door access control,
(b) Banking solutions, (c) Transportation (airport security), (d) Hospitals, (e)
Construction sites, and (f) Schools
4. Hospitals: Many areas of a hospital require tight security, including medicine cabinets
and storage rooms, operating rooms, and data centres where patient records are managed and
stored. Some sensitive data, such as those related to research studies on dangerous viruses,
could have dire consequences if they fall into terrorist hands. Consequently, biometric security
methods should be used to protect such sensitive data. Many hospitals have installed hand
vascular systems as means for physical access control.
5. Construction Sites: Unlike other biometric traits which can be adversely affected by
external factors such as dirt or oil, the hand vascular pattern is robust to these sources of noise
because it lies under the skin of human body. Therefore, the hand vascular pattern technology
is appropriate for use in environments such as factories or construction sites.
6. Schools: The commonly used RF ID cards do not offer high levels of security because
people tend to lose them or fail to return their cards. As a result, many universities have adopted
hand vascular pattern recognition systems to enhance security for valuable equipment in
research laboratories and private belongings in dormitories. It is not only more cost-effective
in the long term but also provides an enhanced level of security through individual
identification and managerial convenience.
In recent years, many hand vascular pattern recognition systems have been deployed in civilian
applications in hospitals, schools, banks, or airports. However, the widest use of hand vascular
pattern recognition is for security management in highly secure places like airports. The typical
deployment of hand vascular pattern recognition systems can be found at Incheon International
Airport, Korea. Incheon International Airport opened for business in early 2001 and became
the largest airport for international civilian air transportation and cargo traffic in Korea. After
September 11, 2001, when the terrorist hijackings occurred, the airport's security
system was upgraded to advanced and state-of-the-art security facilities in response to terrorist
threats and various epidemics in southern Asia. The primary goal in selecting hand vascular
pattern recognition systems was to establish a high security access management system and
ensure a robust and stringent employee identification process throughout their IT system. The
configuration of hand vascular pattern recognition systems at Incheon Airport is divided into 3
major areas: enrollment center, server room and entry gates between air and land sides. The
control tower is also access controlled by vascular biometrics.
5.3 Technology
Hand vascular patterns are the representation of the blood vessel networks inside the back of
the hand. The hand vascular pattern recognition system operates by comparing the hand
vascular pattern of a user being authenticated against a pre-registered pattern already stored in
the database. A near-infrared camera is used to capture an image of the blood vessels from the
back of the hand. The near-infrared rays of the camera illuminate
the back of the hand. The deoxidized hemoglobin in blood vessels absorbs the infrared rays
and causes the vascular patterns to appear as black patterns in resulting images. The vascular
patterns are then extracted by various digital signal processing algorithms. The extracted
vascular pattern is then compared against pre-registered patterns in smart storage devices or
database servers to authenticate the individual. Major steps in a typical hand vascular pattern
recognition system are image acquisition, feature extraction, and pattern matching.
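The sketch below walks through those three steps with generic OpenCV operations; the contrast enhancement, adaptive thresholding and overlap score are simple stand-ins for the proprietary extraction and matching algorithms of commercial systems, and the image file names and the 0.6 acceptance threshold are hypothetical.
```python
import cv2
import numpy as np

def extract_vein_pattern(ir_image):
    """Feature extraction: enhance contrast, then adaptively threshold so that the
    dark (infrared-absorbing) vessels come out as a binary pattern."""
    blurred = cv2.GaussianBlur(ir_image, (5, 5), 0)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(blurred)
    return cv2.adaptiveThreshold(enhanced, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY_INV, blockSize=21, C=5)

def match_score(pattern_a, pattern_b):
    """Pattern matching: fraction of vessel pixels that overlap (toy similarity measure)."""
    overlap = np.logical_and(pattern_a > 0, pattern_b > 0).sum()
    union = np.logical_or(pattern_a > 0, pattern_b > 0).sum()
    return overlap / max(int(union), 1)

# Image acquisition is assumed: these are hypothetical near-infrared images of
# the back of the hand, captured at enrolment and at authentication time.
enrolled = extract_vein_pattern(cv2.imread("enrolled_hand_ir.png", cv2.IMREAD_GRAYSCALE))
probe = extract_vein_pattern(cv2.imread("probe_hand_ir.png", cv2.IMREAD_GRAYSCALE))
print("accepted" if match_score(enrolled, probe) > 0.6 else "rejected")
```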
Fig: Operation of a typical vascular biometric identification system
Fig: The hand image obtained by visible light (left) and infrared light (right)
To capture the image of blood vessels under near-infrared light, the scanner uses an LED array
to emit the light and illuminate the hand. A CCD camera sensitive to near-infrared light is used
to photograph the image. A near-infrared filter attached in front of the CCD camera is used to
block all undesired visible light emitted by external sources. The image of blood vessels can
be acquired by either reflection or transmission.
1. Transmission method: The hand is illuminated by an LED array and the CCD camera
captures the light that passes through the hand. To use this method, the LED array is above the
hand and the CCD camera is placed on the opposite side of the LED array with respect to the
hand.
2. Reflection method: Here the hand is illuminated by an LED array and the CCD camera
captures the light that is reflected back from the hand. So, the illumination LED array and the
CCD camera are positioned in the same location. The reflection method is preferred since the
transmission method is often sensitive to changes in the hand's light transmittance, which is
easily affected by temperature or weather. If the hand's light transmittance is relatively high,
the blood vessels are not very clear in captured images. In contrast, the light transmittance does
not significantly affect the level or contrast of the reflected light. Another reason why the
reflection method is preferred is due to its easy configuration. Since the illumination LED array
and the CCD camera can be located in the same place, the system is easy to embed into small
devices.
Fig: Flow chart of the direction-based vascular pattern extraction algorithm
There has been an ever-growing need to automatically authenticate individuals for various
applications, such as information confidentiality, homeland security, and computer security.
Traditional knowledge-based or token-based personal identification or verification is
unreliable, inconvenient, and inefficient. Knowledge-based approaches use "something that you
know" to make a personal identification, such as a password or personal identity number.
Token-based approaches use "something that you have" to make a personal identification, such
as a passport or ID card. Since those approaches are not based on any inherent attributes of an
individual to make the identification, they cannot differentiate between an authorized person
and an impostor who fraudulently acquires the “token" or “knowledge" of the authorized
person. This is why biometric systems have become prevalent in recent years. Biometrics
involves identifying an individual based on his/her physiological or behavioral characteristics.
Many parts of our body and various behaviours are embedded with information for personal
identification. In fact, using biometrics for person authentication is not new; it has been
practised for thousands of years. Numerous research efforts have been aimed at this subject,
resulting in the development of various techniques related to signal acquisition, feature
extraction, matching and classification. Most importantly, various biometric systems including
fingerprint, iris, hand geometry, voice and face recognition systems have been deployed for
various applications. According to the International Biometric Group (IBG, New York), the
market for biometric technologies will nearly double in size this year alone. Among all
biometrics, hand-based biometrics, including hand geometry and fingerprint, are the most
popular, gaining a 60% market share in 2003.
The palmprint system is a hand-based biometric technology. Palmprint is concerned with the
inner surface of a hand. A palm is covered with the same kind of skin as the fingertips and it is
larger than a fingertip in size. Many features of a palmprint can be used to uniquely identify a
person, including
(a) Geometry Features: According to the palm's shape, we can easily get the corresponding
geometry features, such as width, length and area.
(b) Principal Line Features: Both location and form of principal lines in a palmprint are very
important physiological characteristics for identifying individuals because they vary little over
time.
(c) Wrinkle Features: In a palmprint, there are many wrinkles which are different from the
principal lines in that they are thinner and more irregular.
(d) Delta Point Features: The delta point is defined as the center of a delta-like region in the
palmprint. Usually, there are delta points located in the finger-root region.
(e) Minutiae Features: A palmprint is basically composed of the ridges, allowing the minutiae
features to be used as another significant measurement.
It is quite natural to think of using palmprint to recognize a person, similar to fingerprint, hand
geometry and hand vein.
Fig: Localizing the salient region of the palm. H1 and H2 are the boundary of the gaps
between the two fingers, and T1 and T2 are the tangent points of H1 and H2, respectively.
The central part is extracted at a desired distance from line joining T1 and T2
symmetrically positioned about a perpendicular line passing through the mid-point of T1
and T2
6.2.2 Feature extraction
A single circular zero DC (direct current) Gabor filter is applied to the preprocessed palmprint
images and the phase information is coded as a feature vector called PalmCode.
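A minimal sketch of this encoding step, assuming OpenCV is available and that the preprocessed central palm region has already been extracted: the image is filtered with a single Gabor kernel (its DC component removed) and the signs of the real and imaginary responses are kept as two bit-planes. The kernel parameters and file name are illustrative, not those of the published PalmCode system.
```python
import cv2
import numpy as np

def palm_code(palm_roi, ksize=35, sigma=5.0, theta=np.pi / 4, wavelength=8.0):
    """Filter the palm region with one Gabor kernel (DC component removed) and
    keep the signs of the real and imaginary responses as two bit-planes."""
    real_kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta, wavelength, 1.0, psi=0)
    imag_kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta, wavelength, 1.0, psi=np.pi / 2)
    real_kernel -= real_kernel.mean()          # approximate the zero-DC Gabor filter
    imag_kernel -= imag_kernel.mean()
    img = palm_roi.astype(np.float32)
    real = cv2.filter2D(img, cv2.CV_32F, real_kernel)
    imag = cv2.filter2D(img, cv2.CV_32F, imag_kernel)
    return (real >= 0).astype(np.uint8), (imag >= 0).astype(np.uint8)

# Hypothetical preprocessed central palm region, e.g. 128 x 128 pixels
roi = cv2.imread("palm_roi.png", cv2.IMREAD_GRAYSCALE)
bits_real, bits_imag = palm_code(roi)
print(bits_real.shape, bits_imag.shape)
```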
6.2.3 Feature Matching
Feature matching determines the degree of similarity between the identification template and
the master template. In this work, the normalized Hamming distance is used to compare two
PalmCodes.
6.3 Robustness
As a practical biometric system, in addition to accuracy and speed, robustness of the system is
important. Here, we present three experiments to illustrate the robustness of our system. The
first tests the effects of jewellery, such as rings, on the accuracy of some preprocessing
algorithms. The second tests noise on the palmprints, which directly affects the performance
of the system. The third experiment tests the ability of the system to identify palmprints of
identical twins.
A test of identical twins is regarded as an important test for biometric authentication that not
all biometrics, including face and DNA, can pass. However, the palmprints of identical twins
have enough distinctive information to distinguish them.
Fig: Identical twins palmprints. (a), (b) are their left hands, and (c), (d) are their right
hands, respectively
7. Role of Facial Biometric in personal identification
Robust face recognition systems are in great demand to help fight crime and terrorism. Other
applications include providing user authentication for access control to physical and virtual
spaces to ensure higher security. However, the problem of identifying a person by taking an
input face image and matching with the known face images in a database is still a very
challenging problem. This is due to the variability of human faces under different operational
scenario conditions such as illumination, rotations, expressions, camera viewpoints, aging,
makeup, and eyeglasses. Often, these various conditions greatly affect the performance of face
recognition systems especially when the systems need to match against large scale databases.
This low performance prevents face recognition systems from being widely deployed in real
applications (although many systems have been deployed, their use and accuracy are limited to
particular operational scenarios) where error rates such as the false acceptance rate (FAR) and
the false rejection rate (FRR) must be considered in advance. FAR is the probability that the systems
incorrectly accept an unauthorized person, while FRR is the probability that the systems
wrongly reject an authorized person.
Recently, 3D face recognition has gained attention in the face recognition community due to
its inherent capability to overcome some of the traditional problems of 2D imagery such as
pose and lighting variation. Commercial 3D acquisition devices can obtain a depth map (3D-
shape) of the face. These usually require the user to be in very close proximity to the camera;
additionally, some devices will require the user to be still for several seconds for a good 3D
model acquisition. In contrast, 2D face acquisition can work from a distance and does not
require significant user co-operation. This is the trade-off involved in working with 3D shape data.
Studies of the fusion of visual and thermal face recognition report that multi-modal
face recognition systems lead to improved performance over single-modality systems.
Fig: Simple Process of Face recognition system
The technology behind a face recognition system can vary, but the steps remain more or less
the same in all conditions:
Step 1. A picture of your face is captured from a photo or video. Your face might appear alone
or in a crowd. Your image may show you looking straight ahead or nearly in profile.
Step 2. Facial recognition software reads the geometry of your face. Key factors include the
distance between your eyes and the distance from forehead to chin. The software identifies facial
landmarks — one system identifies 68 of them — that are key to distinguishing your face. The
result: your facial signature.
Step 3. Your facial signature — a mathematical formula — is compared to a database of known
faces. And consider this: at least 117 million Americans have images of their faces in one or
more police databases.
Step 4. A determination is made. Your faceprint may match that of an image in a facial
recognition system database.
7.1 Face Recognition Techniques
Face recognition algorithms can be classified into two broad categories according to feature
extraction schemes for face representation: feature-based methods and appearance-based
methods. Properties and geometric relations such as the areas, distances, and angles between
the facial feature points are used as descriptors for face recognition. On the other hand,
appearance-based methods consider the global properties of the face image intensity pattern.
Typically, appearance-based face recognition algorithms proceed by computing basis vectors
to represent the face data efficiently. In the next step, the faces are projected onto these vectors
and the projection coefficients can be used for representing the face images.
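The sketch below illustrates this appearance-based idea with the classical PCA (eigenfaces) approach, using randomly generated stand-in images rather than a real face database: basis vectors are computed from mean-centred training faces, and each face is then represented by its projection coefficients, which are compared at recognition time.
```python
import numpy as np

rng = np.random.default_rng(42)
# Stand-in data: 100 "face images" of 32 x 32 pixels, flattened into row vectors
faces = rng.normal(size=(100, 32 * 32))

# Compute basis vectors (eigenfaces) from the mean-centred training faces
mean_face = faces.mean(axis=0)
centred = faces - mean_face
_, _, vt = np.linalg.svd(centred, full_matrices=False)
eigenfaces = vt[:20]                       # keep the 20 most significant basis vectors

# Represent every gallery face by its projection coefficients
gallery_coeffs = centred @ eigenfaces.T    # shape (100, 20)

# Recognition: project a probe face and find the nearest gallery entry
probe = faces[0] + rng.normal(scale=0.1, size=32 * 32)
probe_coeffs = (probe - mean_face) @ eigenfaces.T
distances = np.linalg.norm(gallery_coeffs - probe_coeffs, axis=1)
print("closest gallery identity:", int(distances.argmin()))   # expected: 0
```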
7.2 Databases
There are several publicly available face databases for the research community to use for
algorithm development, which provide a standard benchmark when reporting results. Different
databases are collected to address a different type of challenge or variations such as
illumination, pose, occlusion, etc.
7.2.3 AR database
The AR face database was created by the Computer Vision Center (CVC) at the Universitat
Autònoma de Barcelona. It contains over 4,000 color images corresponding to 126 people's
faces (70 men and 56 women). The images acquired are frontal view pose with different facial
expressions, illumination conditions, and occlusions (such as people wearing sun glasses and
a scarf) making this database one of the more popular ones for testing face recognition
algorithms in the presence of occlusion. No restrictions on wear (clothes, glasses, etc.), make-
up, hair style, etc. were imposed on the participants. Each person participated in two sessions,
two weeks apart.
7.3 Applications
Nowadays, industry integrates cutting‐edge, face recognition research into the development of
the latest technologies for commercial applications.
7.3.1 Security
Face recognition is one of the most powerful processes in biometric systems and is extensively
used for security purpose in tracking and surveillance, attendance monitoring, passenger
management at airports, passport de‐duplication, border control and high security access
control as developed by companies like Aurora.
AFR (Automated Face Recognition) is applied in forensics for face identification, face retrieval
in still image databases or CCTV sequences, or for facial sketch recognition. It could also help
law enforcement through behaviour and facial expression observation, lie detection, lip
tracking and reading.
Moreover, AFR is now used in the context of ‘Biometrics as a Service’, within cloud‐based,
online technologies requiring face authentication for trustworthy transactions. For
example, MasterCard developed an app which uses selfies to secure payments via mobile
phones. In this MasterCard app, AFR is enhanced by facial expression recognition, as the
application requires the consumer to blink to prove that s/he is human.
7.3.2 Multimedia
In our today's life, AFR engines are embedded in a number of multi‐modal applications such
as aids for buying glasses or for digital make‐up and other face sculpting or skin smoothing
technologies, e.g., designed by Anthropics.
In social media, many collaborative applications within Facebook, Google or Yahoo! are
calling upon AFR. Applications such as Snapchat require AFR on mobile. With 200 million
users, about half of whom engage on a daily basis, Snapchat is a popular image messaging and
multimedia mobile application, where 'snaps', i.e., a photo or a short video, can be edited to
include filters and effects, text caption and drawings. Snapchat has features such as the ‘Lens’,
which allows users to add real‐time effects into their snaps by using AFR technologies, and
‘Memories’ which searches content by date or using local recognition systems.
Other multimedia applications are using AFR, e.g., in face naming to generate automated
headlines in Video Google, in face expression tracking for animations and human‐computer
interfaces (HCI), or in face animation for socially aware robotics. Companies such as Double
Negative Visual Effects or Disney Research propose also AFR solutions for face synthesis and
face morphing for films and games visual effects.
8. Role of Voice Biometrics in personal identification
Recent data on mobile phone users all over the world, the number of telephone landlines in
operation, and recent VoIP (Voice over IP networks) deployments, confirm that voice is the
most accessible biometric trait as no extra acquisition device or transmission system is needed.
This fact gives voice an overwhelming advantage over other biometric traits, especially when
remote users or systems are taken into account. However, the voice trait is not only related with
personal characteristics, but also with many environmental and sociolinguistic variables, as
voice generation is the result of an extremely complex process.
Thus, the transmitted voice will embed a degraded version of speaker specificities and will be
influenced by many contextual variables that are difficult to deal with. Fortunately, state-of-
the-art technologies and applications are presently able to compensate for all those sources of
variability, allowing for efficient and reliable value-added applications that enable remote
authentication or voice detection based just on telephone-transmitted voice signals.
8.1 Applications
Due to the pervasiveness of voice signals, the range of possible applications of voice biometrics
is wider than for other usual biometric traits. We can distinguish three major types of
applications which take advantage of the biometric information present in the speech signal:
1. Voice authentication (access control, typically remote by phone) and background
recognition (natural voice checking).
2. Speaker detection (e.g., blacklisting detection in call centres or wiretapping and
surveillance), also known as speaker spotting.
3. Forensic speaker recognition (use of the voice as evidence in courts of law or as intelligence
in police investigations).
8.2 Technology
The main source of information encoded in the voice signal is undoubtedly the linguistic
content. For that reason, it is not surprising that depending on how the linguistic content is used
or controlled, we can distinguish two very different types of speaker recognition technologies
with different potential applications.
Firstly, text-dependent technologies, where the user is required to utter a specific key-phrase
(e.g., "Open, Sesame") or sequence, have been the major subject of biometric access control and
voice authentication applications. The security level of password-based systems can then be
enhanced by requiring knowledge of the password, and also requiring the true owner of the
password to utter it. In order to guard against the possible theft of recordings of true passwords,
text-dependent systems can be enhanced to ask for random prompts, unexpected by the caller,
which cannot easily be fabricated by an impostor.
The second type of speaker recognition technologies are those known as text-independent.
They are the driving factor of the remaining two types of applications, namely speaker
detection and forensic speaker recognition. Since the linguistic content is the main source of
information encoded in the speech, text-independence has been a major challenge and the main
subject of research of the speaker recognition community in the last two decades. The NIST
SRE (Speaker Recognition Evaluations), conducted yearly since 1996, have fostered excellence
in research in this area, with extraordinary progress obtained year by year based on blind
evaluation with common databases and protocols, and especially on the sharing of information
among participants in the follow-up workshop after each evaluation.
8.2.1.5 Parameterization
These short-time Hamming-windowed signals have all of the desired temporal/spectral
information, albeit at a high bit rate (e.g., telephone speech digitized with sampling frequency
8 kHz in a 32 ms window means 256 samples x 16 bits/sample = 4096 bits = 512 bytes per
frame). Linear Predictive Coding (LPC) of speech has proved to be a valid way to compress
the spectral envelope in an all-pole model (valid for all non-nasal sounds, and still a good
approximation for nasal sounds) with just 10 to 16 coefficients, which means that the spectral
information in a frame can be represented in about 50 bytes, which is 10% of the original bit
rate. Instead of LPC coefficients, which are highly correlated with one another (covariance
matrix far from diagonal), pseudo-orthogonal cepstral coefficients are usually used, either directly derived as
in LPCC (LPC-derived Cepstral vectors) from LPC coefficients, or directly obtained from a
perceptually-based mel-filter spectral analysis as in MFCC (Mel-Frequency based Cepstral
Coefficients). Some other related forms are described in the literature, as PLP (Perceptually
based Linear Prediction), LSF (Line Spectral Frequencies) and many others, not detailed here
for simplicity. By far, one of the main factors of speech variability comes from the use of
different transmission channels (e.g., testing telephone speech with microphone-recorded
speaker models). The cepstral representation also has the advantage that time-invariant channels
add a constant cepstral offset that can be easily subtracted (CMS, Cepstral Mean Subtraction), and
non-speech cepstral components can also be eliminated as done in RASTA filtering of cepstral
instantaneous vectors. In order to take coarticulation into account, delta (velocity) and delta-
delta (acceleration) coefficients are obtained from the static window-based information,
computing an estimate of how each frame coefficient varies across adjacent windows (typically
between ±3, no more than ±5).
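A compact sketch of this parameterization chain, assuming the librosa library and a hypothetical recording 'utterance.wav': MFCCs are computed over short windows, cepstral mean subtraction removes a constant channel offset, and delta and delta-delta coefficients approximate coarticulation dynamics. The specific window size and coefficient counts are typical choices, not requirements.
```python
import numpy as np
import librosa

# 'utterance.wav' is a hypothetical telephone-band recording
signal, sr = librosa.load("utterance.wav", sr=8000)

# Short-time analysis: 32 ms windows (256 samples at 8 kHz), 13 cepstral coefficients
mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13, n_fft=256, hop_length=128)

# Cepstral Mean Subtraction: a time-invariant channel adds a constant cepstral
# offset, so subtracting the per-coefficient mean largely removes it
mfcc_cms = mfcc - mfcc.mean(axis=1, keepdims=True)

# Delta (velocity) and delta-delta (acceleration) coefficients across adjacent frames
delta = librosa.feature.delta(mfcc_cms, order=1)
delta2 = librosa.feature.delta(mfcc_cms, order=2)

features = np.vstack([mfcc_cms, delta, delta2])   # 39 coefficients per frame
print(features.shape)
```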
NEW AND DEVELOPING FORMS OF
BIOMETRIC IDENTIFICATION
Introduction
Scientific advancement and ongoing security requirements have led to the develop-
ment of new forms of biometric identification. Biometric identification technolo-
gies are improving and becoming less expensive, allowing for wider adoption and
increased accuracy. This chapter describes the introduction of new and emerging
biometric modalities, including advancements in physiological (first generation) and
behavioural (second generation) biometric modalities. First, new developments in
physiological forms of identification are considered, including ear, vascular, ocular
(retina and iris) and voice recognition. Subsequently, the developing field of
behavioural biometrics is reviewed, including a discussion of gait recognition,
keystroke dynamics and cognitive biometrics. The principles, application and issues
associated with each new biometric modality are outlined, demonstrating a range
of possible applications in crime and security, including advantages and dis-
advantages that should be considered. Concerns associated with the security of new
biometric systems are examined, along with other related issues.
Ear recognition
Principles
Ear recognition involves the automated extraction and comparison of the anato-
mical features of the human ear for the purposes of identification and verification
(Pun & Moon, 2004). The use of the human ear to identify individuals was first
suggested by French criminologist Alphonse Bertillon (1853–1914), who used
measurements of the ear in his Bertillonage system to identify recidivists, and the
first system of ear recognition was developed in 1949, integrating 12 measurements
of the outer ear (Abaza et al., 2013).
Ear recognition involves the extraction and comparison of the unique features of
the outer ears. Human ears can be used as unique identifiers because human ear
growth is proportional to age, does not change radically across the lifespan and is not
influenced by changes in expression (Anwar, Ghany & Elmahdy, 2015). Human ears are
unique among individuals, including identical twins, making them a suitable
biometric (Pflug & Busch, 2012). Researchers have obtained 98 per cent accuracy in
identification using ear recognition in controlled environments (Anwar et al., 2015).
One of the main advantages of ear recognition over facial recognition is that ear
recognition requires a smaller image size at a similar resolution, meaning that it requires less memory for image storage and processing (Pun &
Moon, 2004). Further, in comparison with faces, ears have greater uniformity in
colour distribution, and less variability as a result of changes in expression (Pun &
Moon, 2004). It is believed that ear recognition is the most promising biometric
modality to be combined with facial recognition systems, as it can provide addi-
tional information on both sides of the face (Abaza et al., 2013). When combined
with facial recognition systems, ear recognition can provide further contextual
information to offset some of the adverse impacts and barriers to facial recognition
accuracy such as illumination, pose and change of expression (Wang et al., 2012).
In comparison with other modalities of biometric identification, ear recognition
does not require specialist imaging equipment, is contactless, less invasive and stable
over time (Pun & Moon, 2004).
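As a rough illustration of how geometrical ear features of the kind mentioned above can be compared, the sketch below (an illustrative assumption, not the method of Anwar et al.) turns landmark points on the ear contour into a scale-normalised distance vector and scores two ears by Euclidean distance.

import numpy as np
from itertools import combinations

def ear_feature_vector(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (n_points x 2) coordinates on the outer ear contour and helix."""
    distances = np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                          for i, j in combinations(range(len(landmarks)), 2)])
    return distances / distances.max()   # normalise by the largest distance (scale invariance)

def ear_similarity(vec_a: np.ndarray, vec_b: np.ndarray) -> float:
    """Smaller values indicate more similar ears."""
    return float(np.linalg.norm(vec_a - vec_b))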
Vein recognition
Principles
Vein or vascular pattern recognition (VPR) involves the imaging, extraction and
comparison of subcutaneous vascular networks located under the skin, usually in
hands and fingers, for the purposes of verifying identity. Vein recognition differs
from other forms of first generation biometrics as it uses a non-visible physiological
characteristic for the purposes of authentication. An infrared light source and
infrared camera are used to identify the vein pattern concealed under the skin. The
haemoglobin present in blood absorbs the infrared light, making the blood vessels visible as dark patterns (Smorawa & Kubanek, 2015). After the structure of the veins is
obtained, vein recognition involves the same comparison methods as biometric
fingerprinting (Smorawa & Kubanek, 2015). Like fingerprints, the pattern of blood
vessels is unique and stable across an individual’s life. The use of finger and palm veins
consistently demonstrates very high rates of identification accuracy in comparison with
fingerprint identification (Benziane & Benyettou, 2016).
Given the high level of security vein recognition offers, and its convenience, it is expected that
its applications will increase over time. Disadvantages of vein recognition include the
necessity of infrared cameras and the potential for the ambient environmental tem-
perature and medical conditions to affect its accuracy (Benziane & Benyettou, 2016).
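A rough sketch of the imaging-to-comparison flow described above follows; it assumes a near-infrared image in which veins appear darker than the surrounding tissue, binarises it with a local-mean threshold and compares two patterns with a simple overlap score. Practical systems use far more robust enhancement and the minutiae-style matching mentioned in the text.

import numpy as np
from scipy.ndimage import uniform_filter

def vein_pattern(infrared_image: np.ndarray, window: int = 15) -> np.ndarray:
    """Mark pixels darker than their local neighbourhood as vein (True)."""
    local_mean = uniform_filter(infrared_image.astype(float), size=window)
    return infrared_image < local_mean

def overlap_score(pattern_a: np.ndarray, pattern_b: np.ndarray) -> float:
    """Fraction of vein pixels common to both patterns (0 = disjoint, 1 = identical)."""
    union = np.logical_or(pattern_a, pattern_b).sum()
    return float(np.logical_and(pattern_a, pattern_b).sum() / max(union, 1))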
Ocular biometrics
Principles
Ocular biometrics involve the extraction and comparison of the anatomical features of
the eye. The main structures of the eye include the cornea, lens, optic nerve, retina,
pupil and iris. To date, iris recognition is the most common application; however,
increasing attention is being paid to retina recognition, particularly as it is considered to
be one of the most secure biometric modalities (Nigam, Vatsa & Singh, 2015).
Retina recognition utilises the vascular pattern of the retina. A retinal scanner is used
to illuminate a region of the retina through the pupil, capturing the vascular pattern of
the retina (Borgen, Bours & Wolthusen, 2009). Retina recognition is believed to be
the most secure form of biometrics, due to its stability, uniqueness and the fact that it is
very difficult to copy and replicate the vascular pattern of the retina (Waheed et al.,
2016). However, retina recognition has not been widely adopted to date because of
the cost of the highly specialised equipment and high levels of cooperation required of
users. Due to the high level of security it offers, retina recognition has most commonly
been implemented in military and nuclear facilities (Nigam et al., 2015).
The iris is the coloured part of the eye situated between the pupil and the sclera
(the white part of the eye) (Ives et al., 2013). It consists of a series of layers of blood
vessels that form distinct and complex patterns. The unique lattice of the iris forms
at eight months gestation, and remains stable throughout an individual’s life, with
the exception of disease or trauma. The iris is therefore more stable than other
forms of biometric modalities, such as faces and fingerprints. Iridial patterns are not
only unique for each individual, but also between each eye (Pierscionek, Crawford
& Scotney, 2008). Iris recognition involves the capture, extraction and comparison
of these patterns. As is the case with other forms of biometric identification, the
main stages of iris recognition include image acquisition, feature or pattern extraction,
template generation and comparison (Ives et al., 2013).
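The comparison stage can be illustrated with the following minimal sketch (an assumption for illustration, not drawn from the cited works), in which two binary iris codes are compared using a normalised Hamming distance over the bits that are valid in both templates; the decision threshold shown is purely illustrative.

import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of usable bits that differ between two binary iris codes."""
    usable = mask_a & mask_b                  # bits valid in both templates
    disagreements = (code_a ^ code_b) & usable
    return disagreements.sum() / max(usable.sum(), 1)

rng = np.random.default_rng(1)
enrolled = rng.integers(0, 2, 2048).astype(bool)        # stored template
probe = enrolled.copy()
probe[rng.choice(2048, size=100, replace=False)] ^= True  # a genuine but slightly noisy capture
mask = np.ones(2048, dtype=bool)

score = hamming_distance(enrolled, probe, mask, mask)
print("Match" if score < 0.32 else "No match", f"(distance = {score:.3f})")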
Iris recognition is a non-invasive procedure: multiple frames of high-resolution
grey scale images are required, illuminated with infrared or, in some cases, visible
light (Borgen et al., 2009). Ongoing development of sensor technology has enabled
more flexible iris recognition systems; however, current iris recognition technology
limits collection distances to approximately 30 centimetres (Nigam et al., 2015).
Iris recognition has seen large-scale adoption across both government and commercial sectors for security applications, particularly in the area of border control (Borgen et al.,
2009). Iris scanners are currently deployed in many major airports around the
world (Pierscionek et al., 2008). The United Arab Emirates (UAE) uses iris recog-
nition at land, sea and air border points, and maintains a database of 1.1 million iris
templates, one of the largest databases of its kind in the world. Between 2010 and
2013, the UAE conducted iris searches for 10.5 million individuals, identifying
124,435 people who were attempting to return to the UAE with forged identifi-
cation documents (Ives et al., 2013). It has also been suggested that iris recognition
could be used to reduce electoral fraud. There have been trials of iris based
recognition for voter registration systems in African countries (Bowyer, Ortiz &
Sgroi, 2015). Lecher & Brandom (2016) report that a Federal Bureau of Investi-
gation (FBI) pilot programme, beginning in 2013, has collected iris scans from over
434,000 residents of the United States. This information is obtained via information-
sharing arrangements with local law enforcement agencies, US Border Patrol and
the US military.
Iris and retina recognition have the advantage of high levels of accuracy; how-
ever, there are some questions about the stability of iris patterns over time, due, for
example, to medication, surgery, disease and ageing (Pierscionek et al., 2008). Relevant conditions include glaucoma, macular degeneration, cataracts and pathological angiogenesis
(Borgen et al., 2009). An area of potential future development in iris recognition
technology is the capture of iris images while a subject is in motion or is unco-
operative (Colores et al. 2011), and the development and use of mobile platforms
to capture iris images (Barra et al., 2015). Recent developments in this area include walk-through systems that can capture iris images without the subject stopping or
removing glasses (Ives et al., 2013). Studies have even shown some success in
capturing iris patterns while subjects were wearing sunglasses (Latman & Herb, 2013).
It is anticipated that in the future there will be greater adoption of walk-through iris
recognition in transportation, immigration and government facilities, as well as the
development of mobile or portable iris recognition systems (Ives et al., 2013).
In mid-2017, it was reported that the iris recognition system on widely used
smartphones could easily be circumvented by using the night mode of a digital
camera to take an infrared picture of the phone user's eyes from a moderate distance, printing out a life-size picture and holding it in front of the phone (Meyer, 2017).
Examples such as this highlight the importance of further investment in measures
to counter circumvention, otherwise the substantial amounts spent on research and
development may be undermined upon release of the technology.
Voice recognition
Principles
Voice recognition applies the individual characteristics of the human voice as a
biometric identifier through the extraction and comparison of voice samples
(Galka, Masior & Salasa, 2014). Unlike other forms of biometric analysis and
identification, this biometric involves a combination of both physiological and
behavioural characteristics. One of the main advantages of speaker recognition is
that there are several voice characteristics that can be analysed and compared,
enabling a high level of accuracy in identification (Morgen, 2012).
The physiological characteristics of human voices relate to anatomical differences
in the biological structure of the vocal tract. Three main regions of the vocal tract, known as the infraglottic, glottal and supraglottic areas, influence voice production. When a person speaks, air pressure, muscle tension and the elasticity of the vocal folds are modulated to create different sounds. The frequency content of the resulting sound pressure patterns is analysed for biometric identification purposes. The behavioural features of voices are influenced by how an individual
has learned to speak, including their vocabulary, accent, intonation, pitch, pro-
nunciation and conversational patterns (Mazaira-Fernandez, Alvarez-Marquina &
Gomez-Vilda, 2015).
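As a simple illustration of analysing the frequency of the sound pressure pattern, the sketch below (illustrative only, not taken from the cited sources) estimates the fundamental frequency of a short voiced frame from the first strong peak of its autocorrelation.

import numpy as np

def fundamental_frequency(frame: np.ndarray, sr: int, fmin: float = 60.0, fmax: float = 400.0) -> float:
    """Estimate F0 in Hz for one voiced frame using the autocorrelation method."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)     # plausible pitch-period range in samples
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sr / lag

# Usage: a synthetic 150 Hz voiced frame sampled at 8 kHz.
sr = 8000
t = np.arange(int(0.032 * sr)) / sr
frame = np.sin(2 * np.pi * 150 * t)
print(f"Estimated F0: {fundamental_frequency(frame, sr):.1f} Hz")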
Gait recognition
Principles
Gait recognition is situated within the broader field of human motion analysis,
involving the examination and comparison of human kinesiology (Neves et al.,
2016). Everyone has a unique and regular pattern of motion when walking, relating
to the movement of their limbs. Gait recognition involves the measurement, analysis
and comparison of human movement made by an individual when they walk
(Chaurasia et al., 2015).
Gait recognition is one of the more recent forms of biometric identification to
be developed and coincides with computer processing advancements (Nixon &
Carter, 2006). There are a number of stages involved in gait recognition: capturing a walking sequence from video input, creating a movement silhouette, and extracting static and dynamic features across a sufficient period of time. Movement silhouettes are transformed into a gait cycle,
depicting a sufficient walking period to be used for the purposes of comparison and
identification (Indumathi & Pushparani, 2016). In addition to walking patterns, gait
recognition systems can also collect and analyse the physical appearance of indivi-
duals such as the height, length of limbs, shape and size of torso (Zhang, Hu &
Wang, 2011). Identification therefore occurs through both shape (physiological
features) and motion (behavioural features) (Nixon & Carter, 2006; Choudhury &
Tjahjadi, 2013).
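The silhouette-based stages described above can be illustrated with the following minimal sketch (an illustrative assumption, not a described system): aligned binary silhouettes spanning one gait cycle are averaged into a Gait Energy Image, and two such templates are compared with a simple distance score.

import numpy as np

def gait_energy_image(silhouettes: np.ndarray) -> np.ndarray:
    """Average a stack of aligned binary silhouettes (frames x height x width)
    covering one gait cycle into a single grey-level template."""
    return silhouettes.astype(float).mean(axis=0)

def gait_distance(gei_a: np.ndarray, gei_b: np.ndarray) -> float:
    """Euclidean distance between two gait templates; smaller means more similar."""
    return float(np.linalg.norm(gei_a - gei_b))

# Usage with synthetic data: a probe sequence is matched against enrolled templates.
rng = np.random.default_rng(0)
probe = gait_energy_image(rng.integers(0, 2, size=(30, 64, 44)))
gallery = {name: gait_energy_image(rng.integers(0, 2, size=(30, 64, 44)))
           for name in ("subject_a", "subject_b")}
best = min(gallery, key=lambda name: gait_distance(probe, gallery[name]))
print("Closest enrolled subject:", best)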
Gait recognition technology has reached 90 per cent accuracy in identification,
provided there are analogous environmental conditions in the comparison footage.
However, the walking surface and clothing can influence the recognition rates
(Nixon & Carter, 2006). Different camera viewpoints can also improve the rate of
identification accuracy. For example, in a study of gait recognition published in
2016, Bouchrika and colleagues obtained an identification accuracy rate of 73 per
cent for gait features extracted from individual camera viewpoints, which could be
increased to an identification accuracy rate of 92 per cent with cross-camera
78 New forms of biometric identification
these issues; however, in general, gait recognition can provide an important addi-
tional layer of security when used in combination with other modalities, such as
face or footprint biometrics (Chaurasia et al., 2015; Katiyar et al., 2013).
Keystroke dynamics
Principles
Keystroke dynamic recognition enables authentication via the identification of
individual typing characteristics and patterns, including key press durations (Revett,
2009). Although keystroke dynamic recognition was first developed in the 1980s, it
is now being used more frequently, in line with the increased use of computers and
the expansion of the Internet (Rudrapal, Das & Debbarma, 2014). Like other
forms of behavioural biometrics, keystroke dynamics are generally considered less
reliable than physiological biometrics due to the variability of this type of human
behaviour (Revett, 2009).
At the enrolment stage of keystroke dynamic recognition, individual typing
characteristics are extracted to create a digital typing signature (Revett, 2009). At
enrolment the user is typically asked to repeatedly enter their details to extract the
typing profile (Revett, 2009). These characteristics are used to develop a profile of
an individual user that forms a reference for future verification (Revett, 2009).
However, some researchers have argued that keystroke latency and duration alone are not sufficient for authentication, and have proposed other combinations of typing char-
acteristic metrics (Rudrapal et al., 2014; Ngugi, Tarasewich & Reece, 2012). A
combination of different metrics results in higher authentication accuracy (Ngugi
et al., 2012).
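The combination of metrics discussed above can be illustrated with the following sketch (hypothetical names and thresholds, not a published implementation), which derives dwell times and flight times from key press and release timestamps, builds a mean-and-deviation profile at enrolment, and verifies a new sample by its average deviation from that profile.

import numpy as np

def features(key_down: list[float], key_up: list[float]) -> np.ndarray:
    """Dwell times followed by flight times for one typing of the passphrase."""
    down, up = np.asarray(key_down), np.asarray(key_up)
    dwell = up - down              # how long each key is held
    flight = down[1:] - up[:-1]    # gap between releasing one key and pressing the next
    return np.concatenate([dwell, flight])

def enrol(samples: list[np.ndarray]) -> tuple[np.ndarray, np.ndarray]:
    """Build a typing profile (per-feature mean and spread) from repeated samples."""
    data = np.vstack(samples)
    return data.mean(axis=0), data.std(axis=0) + 1e-6

def verify(sample: np.ndarray, mean: np.ndarray, std: np.ndarray, threshold: float = 2.0) -> bool:
    """Accept when the average absolute z-score across features is small enough."""
    return float(np.mean(np.abs((sample - mean) / std))) < threshold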
Keystroke dynamic recognition is less accurate than other forms of biometric
recognition; however, it is difficult to compare accuracy rates for keystroke
dynamic recognition across the literature, as different studies use a variety of
metrics. Reliability is directly related to the length of text typed. For example, in a
study by Bergadano, Gunetti & Picardi (2002) a false negative rate of 4 per cent
and a false positive rate of less than 0.01 per cent were obtained. However, in this
study, the participants were required to type 683 characters, a length that would be
too long for a password, and may inhibit wide-scale adoption, depending on the
context. If keystroke dynamics are used for short passwords, this raises questions
about the accuracy of the authentication (Ngugi et al., 2012).
Cognitive biometrics
Principles
Cognitive biometrics are defined as ‘methods and technologies for the recognition
of humans, based on the measurement of signals generated directly or indirectly
from their thought processes’ (Revett, Deravi & Sirlantzis, 2010, p. 71). These
systems establish authentication via biosignals that reflect the mental states of indi-
viduals, as measured by brain-computer interfaces (BCIs) (Jolfaei, Wu & Muthuk-
kumarasamy, 2013). The use of cognitive biometric systems has become the subject
of increasing attention as the technology has continued to develop in recent years
(Jolfaei et al., 2013; Armstrong et al., 2015).
Neural activity can be used as a biometric signature that reflects individual
mental activities or cognitive processes (Tsuru & Pfurtscheller, 2012). Cognitive
biometrics involve the use of an electroencephalogram (EEG) which is non-invasive
and captures electrical signals produced by the firing of neurons within the brain; it
is used in medicine to measure brain function. This can be undertaken when an
individual performs a certain cognitive task, such as visual perception, memory or
language tasks that activate specific regions of the brain and lead to specific patterns
in EEG activity (Revett et al., 2010). When electrical signals are associated with a
References
Abaza, A., Ross, A., Hebert, C., Harrison, M. & Nixon, M. (2013). A survey on ear biometrics.
ACM Computing Surveys 45(2), 1.
Ali, A. & Islam, M. (2013). A biometric based 3D ear recognition system combining local
and holistic features. International Journal of Modern Education and Computer Science 11, 36.
Altaf, M., Butko, T. & Juang, B. (2015). Acoustic gaits: Gait analysis with footstep sounds.
IEEE Transactions on Biomedical Engineering 62(8), 2001.
Anwar, A., Ghany, K. & Elmahdy, H. (2015). Human ear recognition using geometrical
features extraction. Procedia Computer Science 65, 529.
Armstrong, B., Ruiz-Blondet, M., Kahalifian, N., Kurtz, K., Jun, Z. & Laszlo, S. (2015).
Brainprint: Assessing the uniqueness, collectability, and permanence of a novel method
for ERP biometrics. Neurocomputing 166, 59.
Australian Taxation Office. (2014, 8 September). ATO launches voice authentication:
Australians can save time on the phone to the ATO. Retrieved from https://www.ato.
gov.au/media-centre/media-releases/ato-launches-voice-authentication
6
BIOMETRICS IN CRIMINAL TRIALS
Introduction
This chapter explores the ways in which criminal courts have dealt with the
emergence of biometrics as a source of evidence. The main purpose in considering
this kind of evidence in criminal trials and appeals is to establish the identity of
either the offender or the victim. As discussed in previous chapters, fingerprint and
DNA analysis have long been admitted as evidence aiding identification, and these
have been supplemented more recently by facial and body mapping, voice analysis
and other types of biometrics. However, each of these has faced challenges to
acceptance as a form of evidence, based mainly on concerns about their reliability,
regulatory control and the manner of their presentation in legal proceedings. This
chapter provides an insight into the trend of accepting biometric identification as a
source of evidence, but with some judicial reservations about the application of
particular kinds of biometrics in the criminal justice system. These concerns are
largely based on whether certain forms of identification have achieved the degree
of scientific reliability that is required for legal admissibility.
Identification evidence
Before discussing the main forms of biometrics that courts deal with, it is useful to
consider the context of such evidence.1 In criminal trials, the prosecution is
1 In this chapter, the focus is on criminal proceedings. However, biometrics can also play
a role in civil or administrative proceedings. An example is the resolution of paternity
claims in family law: see, for example, the case of Magill v Magill [2006] HCA 51; (2006)
231 ALR 277; (2006) 81 ALJR 254 (9 November 2006) in which DNA testing after
the end of a marriage revealed that two children were not biological offspring of the
father, leading to tortious claims of deceit. Biometrics are also used in migration cases to
help in establishing or verifying identity: see, for example, SZSZM v Secretary, Department of Immigration and Border Protection [2017] FCA 458 (27 April 2017).
required to prove its case against the defendant (also referred to as the accused)
beyond reasonable doubt, unless there is a guilty plea and the matter proceeds
directly to sentencing. Where the prosecution is required to prove the identity of
the person who allegedly committed the crime, there will usually be some form of
‘identification evidence’ adduced. An example of this form of evidence is the
following:2
In most cases, the assertion of identity will be made by a person who was an ‘eye
witness’ at the scene of a crime, and has made such a report to police or is able to
do so in court testimony.3 The provisions dealing with identification evidence
impose as a general pre-condition to the admissibility of such evidence that
the defendant participated in an ‘identification parade’ or, as it is also known, a
‘police line-up’.4 This requirement is due to the fact that eye witness identification
has historically been seen as unreliable and has led in some instances to wrongful
convictions, so that more controlled and supervised identification procedures are
preferred.5
2 This example is from the uniform evidence law (UEL) that operates in several Australian
jurisdictions.
3 The expression ‘visually, aurally or otherwise’ allows other senses to form the basis of a
witness identification, such as recognising a distinctive voice, accent, posture or gait. A
recent case in which a ‘voice identification parade’ was used is Miller v R [2015]
NSWCCA 206 (3 August 2015).
4 Section 113 provides that the identification evidence provisions only apply in criminal
proceedings. Sections 114 and 115 refer to the use of an ‘identification parade’ but do
not define the term. Section 114 deals with ‘visual identification evidence’ while s115
deals with ‘picture identification evidence’. The conduct of identification parades is
governed by other legislation such as the Crimes Act 1914 (Cth), ss3ZM and 3ZN.
Section 116 imposes requirements for warnings to the jury in relation to identification
evidence.
5 The High Court of Australia summarised the problems with eye witness identification
almost a century ago, and noted requirements for identification procedures that are still
followed today: Davies (and Cody) v The King [1937] HCA 27; (1937) 57 CLR 170; see also Alexander v The Queen [1981] HCA 17; (1981) 145 CLR 395.
In general terms, expert opinion evidence of biometric identification will be admitted only where it is:
a relevant in the proceeding, meaning that it has the capacity to help resolve
factual issues in the trial, such as the identity of an offender;7
b based on specialised knowledge, meaning that it must be presented by an
expert who has previous training, study or experience in the applicable field of
expertise;8
c not unfairly prejudicial, in which case it may be ruled inadmissible by the
judge.9
6 Biometric identification such as a fingerprint or DNA match is not classed as ‘identifi-
cation evidence’ because it is usually not based on what ‘the person making the assertion
saw, heard or otherwise perceived at that place and time’ but on later forensic analysis
by a person who was not a witness to the events in question: see Australian Law
Reform Commission, Uniform Evidence Law (ALRC 102), [13.25]. This means that Part
3.9 does not apply, and biometric evidence is treated as a form of circumstantial evi-
dence; for example, the judge in R v Pfennig (No. 2) [2016] SASC 171 (11 November
2016), [31] stated: ‘I point out, however, that the DNA evidence is not direct evidence
going to the guilt of the accused. I treat it as circumstantial evidence to be considered
alongside all of the other evidence in the case’.
7 Section 55(1) of the UEL legislation provides: ‘The evidence that is relevant in a pro-
ceeding is evidence that, if it were accepted, could rationally affect (directly or indir-
ectly) the assessment of the probability of the existence of a fact in issue in the
proceeding’. Relevant evidence is admissible subject to other provisions: s56.
8 Section 79(1) provides an exception from the exclusionary opinion rule in s76 as fol-
lows: ‘If a person has specialised knowledge based on the person’s training, study or
experience, the opinion rule does not apply to evidence of an opinion of that person
that is wholly or substantially based on that knowledge’. An expert may give this evi-
dence in the form of affidavit under s177, or may be called to give the evidence
through testimony.
9 In particular, s137 provides: ‘In a criminal proceeding, the court must refuse to admit
evidence adduced by the prosecutor if its probative value is outweighed by the danger
of unfair prejudice to the defendant’. Additionally, s138(1) provides: ‘Evidence that was
obtained: (a) improperly or in contravention of an Australian law; or (b) in consequence
of an impropriety or of a contravention of an Australian law; is not to be admitted
unless the desirability of admitting the evidence outweighs the undesirability of admit-
ting evidence that has been obtained in the way in which the evidence was obtained’.
Thus, investigative practices may also affect admissibility.
The main forms of biometric evidence are considered in turn below. However, as discussed in Chapter 5, new techniques are always evol-
ving and this list is not exhaustive.
Significant cases
A noteworthy early case involving this form of evidence that was widely publicised
in the United Kingdom and Australia was the murder trial arising from the dis-
appearance in the Northern Territory of British tourist Peter Falconio, whose body
was never found (Gans, 2007c). Part of the evidence was a photographic image
developed from security camera footage at a highway truck stop, which was com-
pared by a facial mapping expert called by the prosecution with images of the
defendant. This evidence was allowed by the trial judge in the case, along with
DNA evidence linking the defendant to the crime:11
10 Smith v The Queen [2001] HCA 50; (2001) 206 CLR 650, in which a High Court
majority observed: ‘Because the witness’s assertion of identity was founded on material
no different from the material available to the jury from its own observation, the wit-
ness’s assertion that he recognised the appellant is not evidence that could rationally
affect the assessment by the jury of the question we have identified. The fact that
someone else has reached a conclusion about the identity of the accused and the person
in the picture does not provide any logical basis for affecting the jury’s assessment of the
probability of the existence of that fact when the conclusion is based only on material
that is not different in any substantial way from what is available to the jury.’
11 The Queen v Murdoch [2005] NTSC 78 (15 December 2005), (Martin CJ), [207]-[208].
DNA aspects of the case are discussed in two articles by Jeremy Gans, ‘The Peter Fal-
conio Investigation: Needles, Hay and DNA’ (2007c) and ‘Catching Bradley Murdoch:
Tweezers, Pitchforks and the Limits of DNA Sampling’ (2007a).
The image of the person entering the shop at the truck stop taken from the
security film is far from clear. This is not a case of comparing clear photo-
graphs where it could be said with considerable force that the jury could reach
its own conclusion without help. In addition, there is evidence that the
accused has changed his appearance since July 2001. The comparison between
the image from the security film and photographs of the accused is far from
straightforward and, in my opinion, the jury would be assisted by the evidence
of Dr Sutisno.
Further, in my view, it is not appropriate to limit the assistance to merely
identifying the relevant characteristics. When regard is had to the nature and
detail of the characteristics and the methodology employed by Dr Sutisno, it is
readily apparent that her knowledge and expertise in the area of anatomy give
Dr Sutisno a significant advantage in the assessment of the significance of the
features of comparison both individually and in their combination. Dr Sutisno
possesses scientific knowledge, expertise and experience outside the ordinary
knowledge, expertise and experience of the jury. This is not a case in which
the jury, having been informed of the relevant features, would not be assisted
by the expert evidence of Dr Sutisno as to her opinion of the significance of
the features individually and in their combination.
The court was also prepared to accept body mapping, a more recent technique
involving superimposition of images, as an extension of facial mapping:12
Body mapping has received limited attention within the scientific community.
For that reason it may be regarded as a new technique, but as Dr Sutisno
explained it is merely an extension of the well recognised and accepted prin-
ciples of facial mapping to the remainder of the body. I am satisfied that the
technique has ‘a sufficient scientific basis to render results arrived at by that
means part of a field of knowledge which is a proper subject of expert
evidence’.
However, on appeal it was held that the facial and body mapping evidence should not have been allowed to go beyond the expert assisting the jury to ascertain physical similarities; it should not have extended to the expert reaching conclusions about identity:13
This Court has found that the technique employed by Dr Sutisno did not
have a sufficient scientific basis to render the results arrived at by that means
part of a field of knowledge which is a proper subject of expert evidence.
However the evidence given by Dr Sutisno was capable of assisting the jury in
terms of similarities between the person depicted in the truck stop footage and
the appellant. It was evidence that related to, and was admissible as, demon-
strating similarities but was not admissible as to positive identity. Dr Sutisno
was not qualified to give evidence, as she did, based on “face and body map-
ping” as to whether the two men were, indeed, the same man. Her evidence
in this regard should not have been received.
Facial and body mapping evidence can therefore be admitted in criminal proceed-
ings, but its use must be managed so as not to usurp the function of the jury as
decider of the facts. Two other cases decided at around the same time reached a
similar conclusion, though with some additional differentiation between facial and
body mapping. In the case of Tang, the Court of Criminal Appeal considered the
expert’s use of biometric methods:14
The court went on to consider similarities between these techniques and other
biometrics such as fingerprint comparison, a more established method of forensic
identification. By analogy, it was accepted that expert evidence of similarities,
derived from comparison of facial or body photographs, could also provide assistance to the jury, including where the expertise is acquired or 'ad hoc':15
14 R v Hien Puoc Tang [2006] NSWCCA 167 (24 May 2006), [19]-[20] (Spigelman CJ).
15 R v Hien Puoc Tang [2006] NSWCCA 167 (24 May 2006), [120] (Spigelman CJ). The
concept of ad hoc expertise in relation to voice identification has been applied in cases
such as Butera v Director of Public Prosecutions (Vic) [1987] HCA 58; (1987) 164 CLR 180;
R v Leung and Wong [1999] NSWCCA 287; and more recently, Morgan v R [2016]
NSWCCA 25 (26 February 2016); and Nasrallah v R; R v Nasrallah [2015] NSWCCA
188 (17 July 2015).
The process of identification and magnification of stills from the videotape was
a process that had to be conducted by Dr Sutisno out of court. Furthermore,
the quality of the photographs derived from the videotape was such that the
comparison of those stills with the photographs of the Appellant could not be
left for the jury to undertake for itself. The identification of points of similarity
by Dr Sutisno was based on her skill and training, particularly with respect to
facial anatomy. It was also based on her experience with conducting such
comparisons on a number of other occasions. Indeed, it could be supported by
the experience gained with respect to the videotape itself through the course
of multiple viewing, detailed selection, identification and magnification of
images. By this process she had become what is sometimes referred to as an
“ad hoc expert”.
However, in order for the opinions of identity offered by the expert in this case to
be admissible, compliance with the specialised knowledge requirements of evi-
dence law had to be demonstrated. The court ruled that there was an inadequate
connection between the body mapping techniques being applied and the ‘training,
study or experience’ of the expert, and thus the opinions on offer did not pass the
requirements of the relevant evidence law:16
In the case of the Appellant the relevant evidence about posture was expressed
in terms of “upright posture of the upper torso” or similar words. The only
links to any form of “training, study or experience” was the witnesses’ study of
anatomy and some experience, entirely unspecified in terms of quality or
extent, in comparing photographs for the purpose of comparing “posture”.
The evidence in this trial did not disclose, and did not permit a finding, that
Dr Sutisno’s evidence was based on a study of anatomy. That evidence barely,
if at all, rose above a subjective belief and it did not, in my opinion, manifest
anything of a “specialised” character. It was not, in my opinion, shown to be
“specialised knowledge” within the meaning of s79.
In the Jung case a month later, a judge was again required to rule on the admissi-
bility of Dr Sutisno’s facial and body mapping analysis in a murder trial. In this
case, the defence called its own expert witnesses, who cast doubt on the claims
made for the techniques, referring to the quality of the photographs used. None-
theless, the judge ruled the evidence admissible, with questions of the quality of the
analysis going to its weight rather than admissibility:17
16 R v Hien Puoc Tang [2006] NSWCCA 167 (24 May 2006), [140] (Spigelman CJ,
Simpson and Adams JJ agreeing).
17 R v Jung [2006] NSWSC 658 (29 June 2006), [62]-[64].
establish that she has failed to disclose the factual material she has utilised (the
photographic images), the nature of the methodology that she has employed
and the type of analysis described in her reports (morphological analysis). I
have carefully reviewed the reports and her evidence in order to determine
whether it may properly be said that, having regard to the specific principles
governing admissibility of expert evidence as identified by Heydon, JA in
Makita … Dr. Sutisno’s evidence complies with the requirements for
admissibility.
Insofar as she has identified the relevant factual matters that she has taken
into account (the particular photographic images) the particular facial features
which she maintains are examinable by reference to such images and the
nature of the methodology employed by her, the tests of admissibility in those
respects are satisfied. The question of the weight, including the reliability, of
the opinion is, of course, a quite different matter and it is anticipated at trial
that attention will be given to the quality of the photographic images, their
alleged deficiencies and the significance that arises from those matters.
Another case to examine the scientific reliability of the technique of body mapping
was based on a comparison of both moving and still images of an offender and the
defendant, by an expert in the field of anthropology and comparative anatomy. In
overturning the conviction in this case on appeal, the court expressed its concern
about the ‘lack of research into the validity, reliability and error rate of the pro-
cess’.18 Thus, the scientific reliability of body mapping has not been definitively
resolved.
Three years later in 2014, similar evidence was considered in the Honeysett case
(discussed in Buckland, 2014; Edmond & San Roque, 2014). The opinion of the
expert, based on body mapping analysis in an armed robbery case, identified the
appellant:19
Although most of the body of the offender is covered by clothing, head wrap
and gloves, an area of naked skin above his wrist (between the glove and the
sleeve) in images … is visible and can be compared to the skin colour of a
female hotel employee on the same images.
[The appellant] is an adult male of ectomorphic (= slim) body buil[d]. His
hips and shoulders are of approximately the same width. His stance is very
straight with well marked lumbar lordosis and pelvis shifted forward. His
skull vault is dolichocephalic when viewed from the top. Comparison of
lateral (side) and front views of his head also indicates the head … is long but
narrow. His skin is dark, darker than that of persons of European extraction,
but not ‘black’ … He is right-handed – uses his right hand to sign
documents.
The expert concluded that there was a ‘high degree of anatomical similarity’
between the offender and the appellant, and this opinion was ‘strengthened by the
fact that he was unable to discern any anatomical dissimilarity between the two
individuals’. This evidence was allowed to be heard by the jury, which convicted
the appellant. On appeal, the court held that the evidence fell within the ‘training,
study or experience’, of the expert witness.20 The appeal was dismissed and the
matter went before a High Court for further consideration.
The High Court unanimously agreed that, whatever the scientific merits of body
mapping as a reliable and validated field of study, the expert’s opinion in this case
was simply not sufficiently based on his expertise in anatomy:21
20 Honeysett v R [2013] NSWCCA 135 (5 June 2013) (Macfarlan JA, Campbell J and Barr
AJ agreeing).
21 Honeysett v The Queen [2014] HCA 29 (13 August 2014), [43]-[46] (French CJ, Kiefel,
Bell, Gageler and Keane JJ). The appellant’s conviction was ordered to be quashed and a
new trial allowed.
Facial mapping has been accepted as a form of biometric evidence, though with
some reservations about the strength of expert opinions in particular cases. Body
mapping has not been definitively accepted as scientifically reliable, and the few
cases in which it has been considered in depth have cast doubt on the reasoning
processes involved.
The courts’ treatment of facial and body mapping as fields of ‘specialised
knowledge’ has been criticised. In relation to the Honeysett case, Edmond and San
Roque (2014, p. 324) have argued:
Nonetheless, there is merit in challenging the scientific basis of new forensic tech-
niques such as facial and body mapping, in order to ensure that the best evidence is
presented before the courts. While this may not result in exclusion of expert opi-
nion evidence, it may affect the weight it is given in the overall context of criminal
proceedings.
Fingerprinting
As discussed in Chapter 2, fingerprinting has been routinely used by police in
criminal investigations since the beginning of the 1900s (Coyle, Field & Wender-
oth, 2009; Gans, 2011). Crime scene examiners may find ‘latent’ fingerprints or
palm prints on objects, which can be visualised using laboratory processes. The
prints can then be compared with those taken from a suspect or by searching for a
match against a database of prints. This can be done in an automated way, for
example, using the IDENT1 national fingerprint database that operates in the
United Kingdom.
Courts around the world have routinely admitted fingerprint evidence in crim-
inal proceedings for over a century.22 Typically, the expert witness in such cases is
an investigating police officer with specialised knowledge of fingerprinting techni-
ques, or a forensic analyst, who was involved in the fingerprint collection and
comparison process used in the investigation.23
22 In a 1912 case, it was observed: ‘Signatures have been accepted as evidence of identity
as long as they have been used. The fact of the individuality of the corrugations of the
skin on the fingers of the human hand is now so generally recognised as to require very
little, if any, evidence of it, although it seems to be still the practice to offer some expert
evidence on the point. A finger print is therefore in reality an unforgeable signature’:
Parker v R [1912] HCA 29; (1912) 14 CLR 681, Griffith CJ at 683, cited in R v Mitchell
[1997] ACTSC 93; (1997) 130 ACTR 48 (18 November 1997).
23 See, for example, the cases of R v Regan [2014] NSWDC 118 (16 June 2014); and DPP
v Watts [2016] VCC 1726 (23 November 2016).
24 Part ID of the Crimes Act 1914 (Cth). Taking a fingerprint is classified as a 'non-intimate forensic procedure' which can be carried out with consent, or by order of a senior police officer or magistrate on a person in custody where other conditions are satisfied.
Police taking fingerprints under this provision may do so with or without the
consent of the suspect. However, if there is a failure of compliance with the
requirements of this section, or others that relate to the treatment of persons in
custody and the taking of forensic samples, the defence is entitled to challenge the
admissibility of the evidence based on the manner in which it was obtained. An
example is a case involving the North Korean transport ship Pong Su, in which
fingerprints of a suspect were taken by police officers. The defence argued that the
circumstances in which the fingerprints were taken were oppressive in that the
suspect ‘had been exposed to the elements for two days prior to being taken into
custody during which time he had no access to food and limited access to water
and was found by police to be tired’. It was also submitted that the fingerprints had
been illegally obtained due to non-compliance with the above provision (s3ZJ).
The judge, however, found that the police officers had acted reasonably and in good
faith, and that ‘[a]t the most any breach was a failure to comply with a procedural
requirement’ that did not require exclusion of the evidence.26
25 Subsections dealing with persons under the age of 18 years are not reproduced here.
Note that more restrictive conditions may apply to minors: R v SA, DD and ES [2011]
NSWCCA 60 (28 March 2011); Hibble v B [2012] TASSC 59 (20 September 2012).
See also Watkins v The State of Victoria & Ors [2010] VSCA 138 (11 June 2010), which
considered whether police had used excessive force in taking fingerprints from a suspect.
26 Pong Su (No. 2) [2004] VSC 492 (6 December 2004), per Kellam J at [31].
The real strength of the Crown case lay in the fingerprint evidence. Ms Lam, a
crime scene investigator, attended the scene at about 4.45 pm. She found a
number of fingerprints, including some left on the television set, and both
photographed them and took tape lifts from them. Mr Comber, a fingerprint
expert, gave evidence that he had compared a fingerprint lifted from the
television set with a fingerprint identified as that of the accused on the
National Automated Fingerprint Identification System (‘NAFIS’). He found
that the two prints had both been made by the middle finger of the same left
hand. There was no challenge to Mr Comber’s methodology or as to the
accuracy of this conclusion. I found him to be an impressive witness and
accepted his evidence. It was not suggested that the fingerprint obtained from
NAFIS had been incorrectly attributed to the accused and I was satisfied
beyond reasonable doubt that the print had been left on the television set
when touched by the accused.
The probative value of a fingerprint or palm print match must be assessed in the
context of all other evidence in a criminal trial, and it will be of greatest sig-
nificance if there is no apparently innocent explanation for how it came to be left
at a crime scene.28 This kind of evidence therefore operates as part of a circum-
stantial case against the defendant.
The following example of a fingerprint comparison report tendered during
police testimony relates to a match of prints left during a burglary, and a young
defendant identified as ‘JP’:29
27 R v Millard [2006] ACTSC 56 (6 June 2006), [15]. See also R v Fitzgerald [2005] SADC
118 (25 August 2005).
28 An unusual case where the defence sought to have fingerprint evidence excluded
entirely was an appeal in which the defence alleged that police had forged the defen-
dant’s fingerprint on a cheque: Mickelberg v The Queen [2004] WASCA 145 (2 July
2004).
29 JP v Director of Public Prosecutions (NSW) [2015] NSWSC 1669 (11 November 2015),
[7]. The police witness had prepared a ‘Certificate of Expert Evidence’ under s 177 of
the UEL legislation, stating his qualifications as an examiner and presenting his
conclusions.
During the course of my daily duties, I carefully compared all the finger and
palm impressions appearing in the photographs bearing Forensic Case Number
2819499 with the finger and palm impressions of [JP] born … as appearing on
the fingerprint form by placing those photographs one at a time side by side
with those finger and palm impressions and referring backwards and forwards
between them. I compared pattern type and ridge flow, friction ridge char-
acteristics, their relative positions to each other and the number of intervening
ridges between those characteristics, that is the finger or palm prints appearing
in the photographs bearing Forensic Case Number 2819499 against the finger
or palm impressions of [JP] born … as appearing on the fingerprint form. The
comparison process was carried out systematically and sequentially until all
available friction ridge detail had been compared between the finger and palm
impressions appearing in the photographs bearing Forensic case Number
2819499 and the finger and palm impressions of [JP] born … as appearing on
the fingerprint form.
30 JP v Director of Public Prosecutions (NSW) [2015] NSWSC 1669 (11 November 2015),
[36] referring to ‘the case of Bennett v Police [2005] SASC 167 (4 May 2005) in which
“more than 20 characteristics … were common and identical”. In JP, the police witness
claimed to have examined 35 comparison points but did not specify how many were
considered to be a match with the defendant’s prints, as opposed to the overall con-
clusion of identity.
Fingerprint examiners are generally expected to explain the reasoning process leading to the opinions arrived at in the case.31 Where there are gaps in the explanations
offered by the prosecution’s experts, defence counsel may seek to have the opinion
evidence excluded entirely, or ask for the jury to be cautioned in giving it weight
as evidence.32 It is also possible at the appeal stage for an appellant to argue that the
fingerprint or other biometric evidence was not properly summarised by the judge
in instructing the jury.33
It may also be possible to challenge inferences drawn from physical evidence, such
as the estimated age of fingerprints. The time at which fingerprints were deposited
through contact with an object may be of importance in assessing their relevance in a
particular case. This will ordinarily involve additional forensic evidence.34
Another issue that judges must consider carefully is that a jury hearing that the
defendant’s fingerprints were matched to a crime scene using a police database may
infer that the defendant has a criminal history, which explains the inclusion on the
database. In such cases, the defence may seek to exclude evidence as unfairly pre-
judicial, or seek that the jury be discharged. A remedy is for the judge to warn the
jury against making an adverse inference of this kind.35
DNA identification
Identification using DNA is generally regarded as more discriminating than any other
biometric method. However, because it relies on the generation of a DNA profile from
a biological sample, it can be susceptible to court challenges, for example, on the basis
of sample integrity and the possibility of transference. Further complexities include the
scientific processes and statistical interpretations involved (Gans & Urbas, 2002).
Because DNA profiles are stored in increasingly large databases, new matching techni-
ques allowing ‘cold hits’ and partial match searches are now used routinely (Smith &
Mann, 2015). These issues will be discussed in turn, drawing on illustrative cases.
31 Leading authorities on specialized knowledge under UEL s79(1) are Makita (Australia)
Pty Ltd v Sprowles [2001] NSWCA 305 (14 September 2001); HG v The Queen [1999]
HCA 2; 197 CLR 414; and Honeysett v The Queen [2014] HCA 29; 253 CLR 122.
32 In JP v Director of Public Prosecutions (NSW) [2015] NSWSC 1669 (11 November 2015),
defence arguments seeking exclusion of the fingerprint comparison evidence on the
basis that the expert had insufficiently explained his reasoning process were unsuccessful.
The judge noted at [43] that ‘with fingerprint evidence it will often be the case that
“little explicit articulation or amplification” of how the stated methodology warrants the
conclusion that two fingerprints are identical will be required before it can be con-
cluded that the second condition of admissibility under s 79(1) has been satisfied’
(emphasis original), citing Dasreef Pty Ltd v Hawchar [2011] HCA 21; 243 CLR 588.
33 Ghebrat v The Queen [2011] VSCA 299 (12 October 2011).
34 See R v SMR [2002] NSWCCA 258 (1 July 2002).
35 See, for example, the defence submission in R v Ahola (No. 6) [2013] NSWSC 703 (14
May 2013), [3]: ‘The submission is that the jury would inevitably infer from the [police
officer’s testimony] that the accused is a person with a criminal record whose finger-
prints were held by the police, prior to them being taken from him with regard to this
matter. It is that inference that forms the foundation for the application of the discharge
of the whole jury’. The submission was unsuccessful in this case.
36 See, for example, Walker v Budgen [2005] NSWSC 898 (7 September 2005); and Hibble
v B [2012] TASSC 59 (20 September 2012) dealing with a DNA sample taken from a
13-year old suspect.
37 R v Lisoff [1999] NSWCCA 364 (22 November 1999). The court ruled that the matter
was one that could be put before a jury for resolution, rather than require exclusion on
the grounds of unfair prejudice to the defendant: “There is nothing so extraordinary
about the conflict in the evidence presented in this case which would justify the con-
clusion that a careful and sensible jury, properly directed as to the relevant law and as to
the relevant evidence, could not decide in a reasoned and responsible way whether or
not the Crown had demonstrated beyond reasonable doubt that the body of evidence
supporting the Crown case should be preferred to the opposed body of evidence” [64].
There is nothing in the evidence to exclude the possibility that the children
may have had some of the appellant’s DNA transferred to their sleeves or
other parts of their clothing when they hugged him at the end of a week spent
in his care, and then subsequently hugged their mother in a similar manner.
Nor, is there any reason to suppose that DNA left on their clothing after
contact with the appellant might not have been transferred to the deceased’s
pyjamas at some later stage when she had been handling that clothing.
This was the basis on which the murder conviction was quashed. However, on
appeal by the prosecution that decision was overturned and a re-hearing of the appeal was
ordered. This second appeal ordered a re-trial, at which the defendant elected to be
tried by judge alone, rather than before a jury as in the first trial, and he was
acquitted.39
In a more recent case, Fitzgerald, the transference problem was again raised by
the defence in a murder trial. On the prosecution’s case, the defendant’s DNA was
found on an object, a didgeridoo, in the house at which a fatal assault took place.
Because the possibility of secondary transfer could not be ruled out, the court
ultimately allowed an appeal and ordered that a verdict of acquittal be entered.40
In the 2013 case Maryland v King, the US Supreme Court upheld the use of
DNA sampling in the criminal justice system against a challenge under the Fourth Amendment of the
US Constitution, which prohibits unreasonable searches and seizures, and requires
warrants to be issued by a judge and supported by probable cause. King argued that
the Maryland DNA Collection legislation violated the Fourth Amendment to the
US Constitution. Although the Maryland Court of Appeals found the legislation
was unconstitutional, the US Supreme Court held that taking DNA is a legitimate
procedure to identify arrestees.
38 Hillier v R [2005] ACTCA 48 (15 December 2005), (Higgins CJ and Crispin P), [60].
39 The High Court appeal was R v Hillier [2007] HCA 13 (22 March 2007); which was
followed by re-heard appeal in Hillier v R [2008] ACTCA 3 (6 March 2008); the final
acquittal is unreported.
40 Fitzgerald v The Queen [2014] HCA 28 (13 August 2014).
The majority opinion considered that the Fourth Amendment permits police to
undertake ‘routine identification processes’ in relation to arrestees,41 including
photographing and fingerprinting arrestees as part of the associated administrative
process.42 Further, that this is part of a legitimate ‘need for law enforcement officers
in a safe and accurate way to process and identify the persons and possessions they
must take into custody’43 and that DNA sampling is an extension of these more
established methods.44 Further, it considered that the cheek swab used to collect bio-
logical material was ‘quick’, ‘painless’ and ‘no more invasive than fingerprinting’.45
According to the dissenting view in the case, the Supreme Court’s finding pro-
motes the collection of DNA by police from individuals who have not committed serious offences, or who are merely arrestees. Justice Scalia opined that the approach is a
shift towards a ‘genetic panopticon’46 and ‘[n]o matter the degree of invasiveness,
suspicionless searches are never allowed if their principal end is ordinary crime-
solving’.47
Roth (2013, p. 298) argues that running an arrestee's DNA profile against a database of DNA collected at the scenes of unsolved crimes, seeking a 'cold hit', rather than against a database of known offenders to establish his identity, 'suggests
that the state contemplates the arrest as a proxy for criminality rather than as a
means of covering all those in custody whose identification needs confirmation’.
Scientific basis
The science underlying DNA identification has been extensively assessed in crim-
inal proceedings around the world since the late 1990s. In a 2001 case that provides
a representative example, the Profiler Plus DNA matching technology based on
Polymerase Chain Reaction (PCR) analysis, that had been in use for over a decade,
was found to be sufficiently accepted within the scientific community to be a valid
means of identification in criminal trials. The judge stated:48
The evidence in the present case was clear and, in my view, overwhelming.
Whilst the Profiler Plus system is relatively new, it utilizes familiar technology
for amplification and inspection of STR loci which technology is widely,
almost universally, accepted in the relevant scientific community as reliable
and accurate. The variations fundamental to the Profiler Plus system, namely
the particular loci and the number of them, the new primer sequences if they
are new, and the use of Genotyper, have clearly been shown to have been
accepted by the relevant scientific community as accurate and reliable. … The
evidence overwhelmingly established that the Profiler Plus system is generally
accepted throughout the forensic science community as reliable and accurate
in DNA analysis for the purposes of human identification, including with low
levels of DNA.
In many countries around the world, the main criteria for admissibility of opinion evidence from experts are those found in the 'specialised knowledge' provisions of evidence law, rather than scientific reliability criteria of the kind developed in the US case Daubert v Merrell Dow Pharmaceuticals, Inc.49 The specialised knowledge standard requires that there be a field of specialised knowledge, that the witness have such knowledge based on training, study or experience, and that the opinions of the witness be wholly or substantially based on that knowledge. The forensic use of DNA in criminal inves-
tigations is now routinely accepted as a field of specialised knowledge (Gans &
Urbas, 2002; Smith & Mann, 2015).
The first stage of DNA identification involves the generation of a profile from a
crime scene and its comparison with the defendant’s profile. The legal significance
of the presence or absence of a match has been explained as follows:50
49 Honeysett v The Queen [2014] HCA 29 (13 August 2014). The Daubert case is the US
Supreme Court decision of Daubert v Merrell Dow Pharmaceuticals, Inc [1993] USSC 99;
509 U.S. 579; 113 S.Ct. 2786; 125 L.Ed.2d 469; No. 92–102 (28 June 1993).
50 Aytugrul v R [2010] NSWCCA 272 (3 December 2010), [80] (McClellan CJ at CL)
citing Sulan J in R v Carroll [2010] SASC 156 (28 May 2010), [28].
In other words, a DNA match only provides a link between the defendant and a
crime on a probabilistic basis, whereas a non-match will exclude identification
conclusively (Gans & Urbas, 2002). The significance of such a match will depend
on the context and other evidence. In particular, for a match to carry significant weight, any innocent explanation for the presence of the defendant's DNA at the crime scene, or on the body of a sexual assault victim, must be excluded (Julian & Kelty, 2012; Julian et al.,
2012). The analysis may be complicated further where the crime scene sample
contains ‘mixed profiles’ indicating that it contains the DNA of more than one
individual.51
Considerable room for interpretation remains in the significance and proper
presentation of the statistical analysis that accompanies a DNA match
(Goodman-Delahunty & Tait, 2006). This is because tools such as Profiler Plus
only use a fixed number of markers or loci from the non-coding genetic
sequences that are used in generating DNA profiles, meaning that two different
individuals could have the same profile within these parameters. This then
allows analysts to say that a match was found, such as between a crime scene
sample and one taken from the defendant during investigation, and to state the
approximate probability of this match being a result not of commonality of
origin but of a random match. This is typically expressed as a random match
probability relative to the general population or a subset of it, based on a
representative sample.
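
To make the product-rule reasoning concrete, the following sketch (in Python, using entirely hypothetical allele frequencies and the simplifying assumption that loci are independent; it is not the Profiler Plus algorithm) shows how per-locus genotype frequencies multiply into a random match probability:

# Hypothetical illustration of the product rule behind DNA match statistics.
# Assumes Hardy-Weinberg equilibrium and independence between loci.

def genotype_frequency(p, q):
    """Expected population frequency of one locus: p*p if homozygous, 2*p*q otherwise."""
    return p * p if p == q else 2 * p * q

def random_match_probability(profile):
    """Multiply per-locus genotype frequencies across independent loci."""
    rmp = 1.0
    for p, q in profile:
        rmp *= genotype_frequency(p, q)
    return rmp

# A made-up nine-locus profile, given as pairs of allele frequencies.
profile = [(0.12, 0.08), (0.20, 0.20), (0.05, 0.31), (0.15, 0.11), (0.22, 0.09),
           (0.18, 0.18), (0.07, 0.25), (0.13, 0.10), (0.30, 0.06)]

rmp = random_match_probability(profile)
print(f"Random match probability: roughly 1 in {1 / rmp:,.0f}")

The smaller this probability, the stronger the inference that a matching profile shares a common origin with the crime scene sample rather than matching by coincidence.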
The composition and size of the sample used may be significant in supporting
the inferences to be drawn. In practice, population sample databases of only a few
hundred are accepted by the courts as being sufficiently discriminating to allow
valid statistical inferences to be drawn:52
Databases have been built up by which the probability that the DNA of
another person within the general population would match the DNA of the
deceased at particular genetic markers may be estimated … It is accepted that
the precision of the figures produced from any data base is dependent upon
the size of the sample; the larger the sample, the greater the precision in the
figures produced. The database for the RFLP results was compiled from the
testing of 500 people who had donated blood at the Red Cross Blood
Bank … The statistical validity of databases compiled from as low as 100 to
150 people is supported by a number of eminent scientists and scientific
bodies.
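
The relationship between database size and precision can be illustrated with a short calculation. The sketch below (Python; the 0.10 allele frequency and the donor counts are hypothetical, chosen only to echo the figures in the passage above) uses the normal approximation to the binomial proportion to show how the confidence interval around an estimated allele frequency narrows as the number of donors grows:

import math

def frequency_confidence_interval(p_hat, n_people):
    """Approximate 95% confidence interval for an estimated allele frequency.
    Each donor contributes two alleles, so there are 2 * n_people observations."""
    n_alleles = 2 * n_people
    se = math.sqrt(p_hat * (1 - p_hat) / n_alleles)
    return max(0.0, p_hat - 1.96 * se), min(1.0, p_hat + 1.96 * se)

for donors in (100, 150, 500):
    low, high = frequency_confidence_interval(0.10, donors)
    print(f"{donors:>3} donors: estimate 0.10, 95% CI ({low:.3f}, {high:.3f})")

The interval produced from 500 donors is markedly narrower than that produced from 150, which is the sense in which ‘the larger the sample, the greater the precision in the figures produced’.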
In the Pantoja case, a question arose about the appropriateness of using a general
database of profiles taken from a multicultural society where a majority of the
51 Tuite v The Queen [2015] VSCA 148 (12 June 2015); and R v Xie (No. 18) [2015]
NSWSC 2129 (28 July 2015).
52 R v Milat (1996) 87 A Crim R 446; see also R v To [2002] NSWCCA 247 (26 June
2002).
adults were of white European ethnicity, when the defendant was a member of a
distinctive group (identified as South American Quechua Indians).53 The court
held that this did not matter, as it was the racial characteristics of the (unknown)
offender that were relevant to the appropriateness of the statistical database, rather
than the ethnicity of the defendant. However, the statistical validity of the database
still had to be established, which led to a successful appeal, a re-trial and a second
appeal in which the conviction was finally affirmed.54
Juror comprehension
A frequently discussed question is whether juries are capable of understanding
complex scientific information such as biometric identification evidence and, if they
are required to evaluate such evidence, the forms in which it should be presented
so as best to facilitate comprehension (Goodman-Delahunty & Wakabayashi, 2012). A
starting point is the view that complexity alone should not preclude scientific evidence
from being heard by a jury:55
Juries are frequently called upon to resolve conflicts between experts. They
have done so from the inception of jury trials. Expert evidence does not, as a
matter of law, fall into two categories: difficult and sophisticated expert evi-
dence giving rise to conflicts which a jury may not and should not be allowed
to resolve; and simple and unsophisticated expert evidence which they can.
Nor is it the law, that simply because there is a conflict in respect of difficult
and sophisticated expert evidence, even with respect to an important, indeed
critical matter, its resolution should for that reason alone be regarded by an
appellate court as having been beyond the capacity of the jury to resolve.
In one case, the appellate court overturned a conviction on the basis that the judge
had committed the prosecutor’s fallacy in summing up the evidence to the jury.56
With regard to DNA match probabilities, it has also been argued that mathe-
matically equivalent ways of expressing the same information can have different
levels of persuasiveness to a jury. In the case of Aytugrul, the following evidence
was at issue (Urbas, 2012):57
The defence sought to argue that there was unfair prejudice in putting the per-
centage before the jury, as this was overly persuasive and invited a subconscious
‘rounding up’ to 100 per cent certainty. However, this was not accepted by the
relevant court given the expert’s explanations:58
The unfair prejudice said to arise in this case was alleged to flow from the use
of a percentage figure, which carried a “residual risk of unfairness deriving
from the subliminal impact of the raw percentage figures” by way of rounding
up the percentage figure to 100. If the exclusion percentage were to be
examined in isolation, the appellant’s arguments appear to take on some force.
But to carry out the relevant inquiry in that way would be erroneous. In this
case, both the frequency ratio and the manner in which the exclusion per-
centage had been derived from the frequency ratio were to be explained in
evidence to the jury. The risk of unfair prejudice – described by the appellant
as the jury giving the exclusion percentage “more weight … than it
deserved” – was all but eliminated by the explanation.
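
The equivalence the Court refers to is purely arithmetic: a frequency ratio and an exclusion percentage are two presentations of the same number. A minimal illustration (Python; the 1-in-1,000 figure is invented for this example and is not the figure from Aytugrul):

def exclusion_percentage(one_in_n):
    """Convert 'expected in 1 in N people' into 'percentage of people excluded'."""
    return (1 - 1 / one_in_n) * 100

n = 1000
print(f"A profile expected in 1 in {n:,} people excludes "
      f"{exclusion_percentage(n):.2f}% of the population.")
# Prints: A profile expected in 1 in 1,000 people excludes 99.90% of the population.

The legal question was therefore not which figure is correct, but whether one presentation of the same statistic carries more subliminal weight with a jury than the other.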
56 R v Keir [2002] NSWCCA 30 (28 February 2002). The defendant was convicted on the
re-trial, and an appeal against that conviction was unsuccessful: Keir v R [2007]
NSWCCA 149 (6 June 2007). The prosecutor’s fallacy was also discussed in Aytugrul v
R [2010] NSWCCA 272 (3 December 2010).
57 Aytugrul v The Queen [2012] HCA 15 (18 April 2012), (French CJ, Hayne, Crennan and
Bell JJ), [2] (note omitted after the words ‘frequency ratio’, as follows: ‘Sometimes called
a “random occurrence ratio” or a “frequency estimate”’). Heydon J agreed with the
majority in a separate judgment.
58 Aytugrul v The Queen [2012] HCA 15 (18 April 2012), (French CJ, Hayne, Crennan and
Bell JJ) [30] (note omitted).
The concurring judgment in Aytugrul suggested that the jury could be trusted to
work out the statistical issues, even though they were difficult:59
In a 2015 case, the question arose whether the scientific reliability of a new statistical
technique applied to the analysis of small amounts of DNA used in matching was a
matter going to the admissibility of expert evidence. The court held that it does
not, but rather affects the probative value of the evidence and the potential pre-
judicial effect that its presentation to the jury may have. In this case, the appeal
judges agreed with the conclusions reached by the trial judge in relation to both
aspects of the evidence, holding that the probative value of the evidence was not
outweighed by the alleged prejudicial effect:60
59 Aytugrul v The Queen [2012] HCA 15 (18 April 2012), [75] (Heydon J).
60 Tuite v The Queen [2015] VSCA 148 (12 June 2015), [122]-[124].
Moreover, one of the dangers associated with DNA evidence, is what has
come to be known as the ‘CSI effect’. The ‘CSI effect’ is a reference to the
atmosphere of scientific confidence evoked in the imagination of the average
juror by descriptions of DNA findings. As we have explained, as a matter of
pure logic, the DNA evidence has little or no probative value. By virtue of its
scientific pedigree, however, a jury will likely regard it as being cloaked in an
unwarranted mantle of legitimacy – no matter the directions of a trial
judge – and give it weight that it simply does not deserve. The danger of
unfair prejudice is thus marked, and any legitimate probative value is, at best,
small.
DNA databases
As increasing numbers of DNA profiles have been collected during criminal
investigations, these have been stored on police databases, leading to the possibility
of repeated use including by searching for a ‘cold hit’ between a crime scene
sample and a profile already added to the database.64 Forensic databases have
61 The ‘CSI effect’ has also been referred to as the ‘white coat’ effect: Morgan v R [2011]
NSWCCA 257 (1 December 2011), [145], cited in R v MK [2012] NSWCCA 110 (4
June 2012).
62 DPP v Wise (a pseudonym) [2016] VSCA 173 (21 July 2016), [70]; DPP v Massey (a
pseudonym) [2017] VSCA 38 (6 March 2017), (Weinberg JA), [24].
63 R v Drummond (No. 2) [2015] SASCFC 82 (5 June 2015).
64 See, for example, Sleiman v Murray [2009] ACTSC 82 (15 July 2009); and R v Smith [No
1] [2011] NSWSC 725 (26 May 2011).
become a powerful investigative tool (Smith, 2016). Not surprisingly, then, the
conditions under which a defendant’s profile may have been obtained, stored
and retained on a DNA database have become the subject of scrutiny in
criminal trials.
Forensic procedures legislation governs how forensic samples stored in DNA
databases may be used. The legislation distinguishes between volunteers, arrested
persons and convicted persons, with different requirements applying to each class.
Permissible matching is regulated by matching tables in the legislation. Finally,
requirements relating to use and removal of profiles from forensic databases are set
out in detail. The failure of police or other officials to comply with the require-
ments of forensic procedures legislation can readily lead to exclusion of evidence
obtained from a stored DNA profile.65
The unlawful retention of DNA profiles on databases was at issue in the landmark
Marper case in the United Kingdom.66 Two individuals, one of them a 12-year-
old, whose profiles had been entered on the database when they were arrested for a
reportable offence, sought to have them removed when they were not convicted.
The House of Lords found in favour of the police, holding that the retention was
lawful under the applicable legislation, but the European Court of Human Rights
ruled otherwise, holding that the ‘blanket and indiscriminate nature’ of the retention
regime under the legislation did not strike a proper balance between public and
private interests (Smith, 2016).
A further consideration regarding the use of forensic DNA databases is the possibi-
lity of searching for partial matches, also known as ‘familial searching’. This involves
recording and investigating matches that nearly but do not fully coincide, and so
cannot be from the same individual. However, it may be that the crime scene
sample came from a close relative of someone who is on the DNA database, which
provides an investigative lead even when the actual offender’s profile is not on the
database. This significantly extends the scope of ‘cold hit’ matching processes, and
has been used to solve serious crimes in other countries. In many jurisdictions,
forensic procedures legislation does not specifically regulate partial matching, but
appears to allow its use (Smith & Urbas, 2012).
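
A crude sense of how partial matching works can be given with a short sketch (Python; the loci, allele labels and the simple shared-allele count below are illustrative only, since operational familial searching relies on likelihood ratios and legislative safeguards):

def shared_allele_count(profile_a, profile_b):
    """Count alleles shared between two profiles, locus by locus."""
    shared = 0
    for locus, alleles_a in profile_a.items():
        remaining = list(profile_b.get(locus, ()))
        for allele in alleles_a:
            if allele in remaining:
                remaining.remove(allele)  # count each database allele only once
                shared += 1
    return shared

crime_scene  = {"D3": ("14", "16"), "vWA": ("17", "18"), "FGA": ("21", "24")}
database_hit = {"D3": ("14", "15"), "vWA": ("17", "18"), "FGA": ("22", "24")}

score = shared_allele_count(crime_scene, database_hit)
total = 2 * len(crime_scene)
print(f"{score} of {total} alleles shared: not the same person, "
      "but possibly a close relative worth investigating.")

A full match would share every allele; a near-miss such as this cannot come from the same individual, but the high proportion of shared alleles is what generates the investigative lead described above.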
References
Buckland, P. (2014). Honeysett v The Queen (2014): Opinion evidence and reliability: A
sticking point. Adelaide Law Review 35(2), 449.
Cashman, K. & Henning, T. (2012). Lawyers and DNA: Issues in understanding and chal-
lenging the evidence. Current Issues in Criminal Justice 24, 69.
65 Examples include Hibble v B [2012] TASSC 59 (20 September 2012); R v Dean [2006]
SADC 54 (25 May 2006).
66 R v Marper and S (2002) EWCA Civ 1275 (Court of Appeal); R v Marper and S (2004)
UKHL 39 (House of Lords); Case of S and Marper and the United Kingdom [2008] ECHR
1581.
Coyle, I., Field, D. & Wenderoth, P. (2009). Pattern recognition and forensic identification:
The presumption of scientific accuracy and other falsehoods. Criminal Law Journal 33, 214.
Edmond, G. (2015). What lawyers should know about the forensic “sciences”. Adelaide Law
Review 37, 33.
Edmond, G. & San Roque, M. (2012). The cool crucible: Forensic science and the frailty of
the criminal trial. Current Issues in Criminal Justice 24(1), 51.
Edmond, G. & San Roque, M. (2014). Honeysett v The Queen: Forensic science, ‘specialised
knowledge’ and the uniform evidence law. Sydney Law Review 36(2), 323.
Edwards, K. (2006). Cold hit complacency: The dangers of DNA databases re-examined.
Current Issues in Criminal Justice 18(1), 92.
Findlay, M. & Grix, J. (2003). Challenging forensic evidence? Observations on the use of
DNA in certain criminal trials. Current Issues in Criminal Justice 14(3), 269.
Gans, J. (2005). DNA identification and rape victims. University of New South Wales Law
Journal 28(1), 272.
Gans, J. (2007a). Much repented: Consent to DNA sampling. University of New South Wales
Law Journal 30(3), 579.
Gans, J. (2007b). Catching Bradley Murdoch: Tweezers, pitchforks and the limits of DNA
sampling. Current Issues in Criminal Justice 19, 34.
Gans, J. (2007c). The Peter Falconio investigation: Needles, hay and DNA. Current Issues in
Criminal Justice 18(3), 415.
Gans, J. (2011). A tale of two High Court forensic cases. Sydney Law Review 33(3), 515.
Gans, J. & Urbas, G. (2002). DNA evidence in the criminal justice system. Trends and
Issues in Crime and Criminal Justice No. 226. Canberra: Australian Institute of
Criminology.
Goodman-Delahunty, J. & Tait, D. (2006). DNA and the changing face of justice. Australian
Journal of Forensic Sciences 38, 97.
Goodman-Delahunty, J. & Wakabayashi, K. (2012). Adversarial forensic science experts: An
empirical study of jury deliberation. Current Issues in Criminal Justice 24(1), 85.
Haesler, A. (2006). DNA in court. Journal of the Judicial Commission of New South Wales
8(1), 121.
Julian, R. & Kelty, S. (2012). Forensic science and justice: From crime scene to court and
beyond. Current Issues in Criminal Justice 24, 1.
Julian, R., Kelty, S. & Robertson, J. (2012). Get it right the first time: Critical issues at the
crime scene. Current Issues in Criminal Justice 24(1), 25.
Krone, T. (2012). Raising the alarm? Role definition for prosecutors in criminal cases. Australian
Journal of Forensic Sciences 44(1), 15.
Meyers, L. (2007). The problem with DNA. Monitor on Psychology 38, 52.
Rayment, K. (2010). Faith in DNA: The Vincent Report. Journal of Law, Information and
Science 20(1), 238.
Roth, A. (2013). Maryland v. King and the wonderful, horrible DNA revolution in law
enforcement. Ohio State Journal of Criminal Law 11, 295.
Roux, C., Crispino, F. & Ribaux, O. (2012). From forensics to forensic science. Current
Issues in Criminal Justice 24(1), 7.
Smith, M. (2016). DNA Evidence in the Australian Legal System. Chatswood, NSW:
LexisNexis Butterworths.
Smith, M. & Mann, M. (2015). Recent developments in DNA evidence. Trends and Issues
in Crime and Criminal Justice No. 506. Canberra: Australian Institute of Criminology.
Smith, M. & Urbas, G. (2012). Regulating new forms of forensic DNA profiling under
Australian legislation: Familial matching and DNA phenotyping. Australian Journal of Forensic
Sciences 44, 63.
Urbas, G. (2012). The High Court and the admissibility of DNA evidence: Aytugrul v The
Queen [2012] HCA 15. Canberra Law Review 11(1), 89.
Whiley, D. & Hocking, B. (2003). DNA: Crime, law and public policy. University of Notre
Dame Australia Law Review 5, 37.
Wise, J. (2010). Providing the CSI treatment: Criminal justice practitioners and the CSI
effect. Current Issues in Criminal Justice 21(3), 383.
7
BIOMETRICS IN CRIMINAL APPEALS
AND POST-CONVICTION REVIEWS
Introduction
This chapter discusses the ways in which biometric identification has featured in
criminal appeals and other reviews of criminal convictions. Appellate review allows
the criminal justice system to recognise and correct errors, including wrongful
convictions and other miscarriages of justice. However, as appeal rights are limited,
other forms of review also play an important role. Discussed in this chapter are
innocence projects, judicial inquiries and review commissions that have used bio-
metrics in their efforts to uncover the truth about past crimes.
Criminal appeals
The first avenue of redress for most convicted offenders who claim that they have
been the subject of a miscarriage of justice is to lodge an appeal. Depending on
conditions imposed on such applications, including time limits and leave require-
ments, this may result in a conviction being overturned. In general, appellate courts
can then either order a re-trial or substitute a different verdict. In those rare cases
where actual innocence can be established, the only appropriate outcome is
quashing of the conviction and its replacement with a verdict of acquittal.1
Criminal appeals took on a distinctive form in the early twentieth century with
the establishment in the United Kingdom of the Court of Criminal Appeal in 1907
(Corns & Urbas, 2008). Many jurisdictions empower a court of appeal to overturn
1 Actual innocence need not be established in order for an appeal to succeed, nor is this
often possible. Rather, it is sufficient that enough doubt is cast on the conviction that it
must be regarded as ‘unsafe and unsatisfactory’: M v R (1994) 181 CLR 487; see also
Gipp v R (1998) 194 CLR 106 and Chidiac v R (1991) 171 CLR 432.
2 Subject to the ‘proviso’ that the conviction may be allowed to stand if the court is of
the opinion that notwithstanding that the appellant has made out one or more of these
grounds, no substantial miscarriage of justice has occurred (Penhallurick, 2003): see, for
example, Criminal Appeal Act 1912 (NSW), s6(1).
3 Button v The Queen [2002] WASCA 35 (25 February 2002), discussed in Goldingham
(2002).
4 See for example, Criminal Appeal Act 1912 (NSW), s12. Appeals against sentence are not
discussed here, but note that questions of finality and double jeopardy also arise in
relation to re-sentencing (Urbas, 2012).
5 An example is the Queensland case involving Frank Button discussed later in this
chapter.
receive new evidence, including DNA evidence (Hamer, 2015; Milne, 2015; see
also Urbas, 2002).6
Legislation enacted in South Australia and Tasmania now permits a second or subsequent
appeal where there is ‘fresh and compelling’ evidence.7 Under this legislation, evidence
relating to an offence is:
(a) “fresh” if –
(i) it was not adduced at the trial of the offence; and
(ii) it could not, even with the exercise of reasonable diligence, have been adduced at the trial; and
(b) “compelling” if –
(i) it is reliable; and
(ii) it is substantial; and
(iii) it is highly probative in the context of the issues in dispute at the trial of the offence.
The form that such evidence might take includes fresh and compelling biometric
analysis. For example, a crime scene sample collected before the trial may not have
been tested, or testing may not have yielded results, due to the limitations of for-
ensic analysis at the time. With advances in techniques, such as testing using small
or degraded biological samples (as discussed in Chapter 3), testing may become
possible years afterwards. This could show that the convicted person is not the
offender. By this time, the convicted person may have already appealed unsuc-
cessfully. The new legislation allows a second or subsequent appeal using the
exonerating biometric evidence (Sangha & Moles, 2015). This is the model used
by some innocence projects, discussed later in this chapter.
6 Mickelberg v The Queen (1989) 167 CLR 259; Eastman v The Queen (2000) 203 CLR 1;
Re Sinanovic’s Application [2001] HCA 40; (2001) 180 ALR 448.
7 Criminal Law Consolidation Act 1935 (SA), s353A inserted by the Statutes Amendment
(Appeals) Act 2013 (SA); and Criminal Code Amendment (Second or Subsequent Appeal for
Fresh and Compelling Evidence) Act 2015 (Tas).
A long-standing principle of the criminal law is that a person should not be subjected
to prosecution or punishment more than once in relation to the same crime. This is
encapsulated in the rule against double jeopardy (MCCOC, 2003; Burton, 2004; Cowdery,
2005; Griffith & Roth, 2006), operating through the pleas of autrefois convict and autrefois acquit.8
Historically, the impetus for reform of double jeopardy laws has arisen from
specific high profile cases. In Australia, for example, these arose largely in response
to a child murder case in Queensland. Convicted of the murder of a 17-month-old
baby in 1985, partly on the basis that his distinctive teeth were matched to a bite
mark on the body of the victim, Raymond Carroll appealed successfully, so that
the Queensland Court of Appeal quashed the conviction and entered a verdict of
acquittal. This meant that a second prosecution for murder was precluded by
double jeopardy rules. However, Carroll had given evidence at his trial denying
involvement in the abduction and killing of the child, and on the basis of improved
forensic odontological methods, the prosecution brought a charge of perjury. He
was convicted on that second charge in 2000, on a jury verdict, and again appealed
successfully, with the Court of Appeal accepting that the perjury conviction was in
essence a re-trial of the murder case under a different charge. The High Court
agreed, meaning that Carroll could never be re-convicted.9 Public dissatisfaction
with this outcome together with some academic and political support for a change
in the law led to the enactment of legislation allowing appeals against acquittals in
limited circumstances (Corns, 2003; Burton, 2004).
In Queensland, the provision applies only to a re-trial for murder where there is
fresh and compelling evidence against the acquitted person and it is in the interests
of justice to overturn the acquittal and order a re-trial.10 In New South Wales, an
application may be made in relation to any life sentence offence, including murder
and certain drug and sexual offences. However, there have been no murder
re-convictions following an overturned acquittal to date.11
Several jurisdictions have adopted similar double jeopardy reforms, preceded by
changes to double jeopardy laws in the United Kingdom (MCCOC, 2003),
allowing re-trials after Crown appeals against acquittal.12 The basis for these
reforms was explained as follows by Lord Justice Auld who conducted a review of
that country’s court system (Auld, 2001) and posed the following questions:
Similar to the laws allowing second and subsequent appeals against convictions in
some jurisdictions, appeals against acquittal are generally limited to those cases in
which there is fresh and compelling evidence of guilt, which could be in the form
of new or improved biometric identification. For example, the evidence in an
initial prosecution case may be insufficient to identify the accused as the offender
beyond reasonable doubt. Later biometric analysis might provide a more conclusive
link, which together with the other available evidence, might then be sufficient to
safely convict the accused.13
However, post-conviction or post-acquittal testing depends on the preservation
of evidence that can be tested (Urbas, 2002; Weathered, 2003; Weathered &
Blewer, 2009; Hamer, 2014). As noted later in relation to the Chamberlain case, the
destruction of forensic samples during or after laboratory testing can deny access to
post-trial testing. This has led to calls for legislative requirements for sample pre-
servation (ALRC, 2003). Despite the possibility of appeals based on fresh and
compelling evidence, either against a conviction or an acquittal, these legal
mechanisms appear to be rarely exercised in practice (Hamer, 2014).
Post-conviction reviews
Innocence projects
The potential for remedying wrongful convictions with the help of biometrics such
as DNA identification has been the impetus for the establishment of many inno-
cence projects, which are usually based in universities, as discussed in Chapter 3
(Hamer, 2014):14
13 Though note that there is some doubt about whether DNA evidence on its own could
ever be sufficient for a conviction: see Ligertwood (2011) and the case of Forbes v The
Queen [2010] HCATrans 120 (18 May 2010).
14 Notes omitted. See also Christian (2001); De Foore (2002); Urbas (2002) and Weath-
ered (2004).
The original innocence project was established in the United States. Based at the
Cardozo Law School in New York, it has secured over 350 exonerations using DNA
evidence. In addition, the work of this and similar bodies has been instrumental in
identifying and addressing the causes of wrongful convictions, with inaccurate
eyewitness identification (discussed in Chapter 6) being the leading cause:15
The University of Bristol hosted the leading innocence project in the United
Kingdom, operating a specialist pro bono clinic from 2005 to 2015. Since that
time, over 30 other innocence projects have been established at universities
throughout England and Wales.16 In Australia, similar bodies have been set up at
Griffith University, Edith Cowan University and the University of Technology
Sydney.17 Most of these follow the emphasis on post-conviction DNA testing shown
to have been successful in overseas jurisdictions such as the United States.
The model of the innocence project has been followed in some cases by the
establishment of an administrative body by governments to review claimed mis-
carriages of justice. The following provides an example of the functions of one
such body, set out in legislation:18
(a) to consider any application under this Division from an eligible convicted person
and to assess whether the person’s claim of innocence will be affected by DNA
information obtained from biological material specified in the application,
(b) to arrange, if appropriate, searches for that biological material and the DNA
testing of that biological material,
A convicted person is eligible to make an application to the Panel if, and only
if, the person’s claim of innocence may be affected by DNA information
obtained from biological material specified in the application.
This application was to be assessed by the six-member panel, which included a
former judicial officer, a representative of the Attorney-General’s Department, a
victims’ representative, a police representative and prosecution and defence law-
yers. This review panel ceased operations in 2014, apparently not having referred
any cases to a court of criminal appeal for review (Hamer & Edmond, 2013).
19 Crimes (Appeal and Review) Amendment (DNA Review Panel) Act 2006 (NSW), inserting
s89 (since repealed) into the Crimes (Local Courts Appeal and Review) Act 2001 (NSW).
Chapter 3 (Weathered & Blewer, 2009; Hamer, 2014). This has been discussed as
follows (Weathered, 2013, p. 450):
20 Further literature on miscarriages of justice in the United Kingdom and the United
States is discussed by Roach (2015).
21 Ross v The King (1922) 30 CLR 246 (Knox C.J., Gavan Duffy and Starke JJ, with
Higgins J concurring). Isaacs J delivered a dissenting judgment.
In the present case, the nude body of a young girl, twelve years of age,
was found lying dead in an alley off Little Collins Street, Melbourne.
Medical evidence disclosed that the cause of death was strangulation from
throttling, that there were bruises and abrasions which indicated violence,
and that there was a recent tear at the lower border of the hymen which
passed completely through the hymen into the tissue of the vaginal wall.
Evidence was also adduced by the Crown from which a jury might infer
that this child had gone into an arcade known as the Eastern Arcade, in
which the prisoner had a wine saloon, that she was there enticed by the
prisoner into his wine saloon and was carnally known and killed by him.
The prisoner, who gave evidence on his own behalf, did not suggest that
he had killed the child in circumstances that might reduce the act from
one of murder to one of manslaughter. He admitted that he had noticed a
young girl, similar in appearance to the dead child, in the Arcade; but he
denied that he had spoken to her or that she had been in his wine saloon,
and he denied that he had anything to do directly or indirectly with the
death of the murdered child. The jury found the prisoner guilty of the
murder of the child.
On appeal, reference was made to ‘evidence which went to identify the hair of the
dead child with that found on certain blankets’, but this was not pivotal in the
Court’s decision. Rather, the majority accepted that the trial judge had given cor-
rect directions to the jury on issues including an alleged confession by the accused.
Special leave to appeal was therefore rejected by the High Court, and the sentence
of execution was carried out a few weeks later.22
However, that was not the last of the legal proceedings arising from the case. A
researcher in the 1990s made the surprising discovery that the hair samples
collected at the time of the girl’s death were still in the police archives, and
re-testing was done by both the Victorian Institute of Forensic Medicine and the
Australian Federal Police laboratory. This confirmed that the hair found on
blankets in the defendant’s home did not match the scalp sample of the dead girl.
The Victorian Attorney-General referred the matter to the Supreme Court in
2007, which concluded unanimously that the conviction could not stand.23
Crucial to this finding was a report by Dr James Robertson, then Director of
Forensic Services at the Australian Federal Police, and an expert in forensic hair
comparisons, whose analysis concluded that ‘the hairs recovered from the brown-
grey blanket could not have come from the deceased, Tirtschke’.24 Relatives of
both the prisoner and the victim signed a petition for mercy, and the Governor of
Victoria granted Ross a posthumous pardon in 2008.25
22 The High Court decision is dated 5 April 1922, and Ross was hanged on 24 April 1922.
23 Re Colin Campbell Ross [2007] VSC 572 (20 December 2007) (Teague, Cummins and
Coldrey JJ).
24 The forensic report of Dr James Robertson is included in full in the Supreme Court’s
judgment (at [80]), in recognition of its importance in resolving the case.
The Chamberlains
The Chamberlain case has been highly influential on the role of forensics in criminal
proceedings. After two coronial inquiries into the 1980 disappearance of their baby
daughter, Azaria, from a camping ground near Uluru in central Australia, Lindy
and Michael Chamberlain were committed to stand trial in the Supreme Court of
the Northern Territory. The prosecution case was that Lindy had killed Azaria in
the family car and the couple had disposed of the body. The defence argued at trial
that Azaria had been taken by a dingo. The prosecution case relied heavily on
forensics, and after a highly publicised jury trial, Lindy was convicted of murder
with Michael convicted as an accessory.
Appeals to higher courts were unsuccessful.26 Continuing public disquiet with
the convictions led to the establishment of a Royal Commission in 1987, which
found profound flaws in the forensic evidence adduced by the prosecution. This
included an alleged bloody handprint on Azaria’s clothing, an expert’s purported
identification of damage to the clothing as caused by scissors rather than dingo
teeth and, most critically, the identification of supposed foetal blood under the
dashboard of the car. This was systematically discredited by the Commissioner,
who observed that:27 ‘evidence was given at trial by experts who did not have the
experience, facilities or resources necessary to enable them to express reliable
opinions on some of the novel and complex scientific issues which arose for
consideration’.
The Northern Territory Supreme Court, sitting as a Court of Criminal Appeal
and acting on recommendations of the Morling Commission, quashed the con-
victions in 1988.28 However, the cause of death was not officially determined to be
due to a dingo taking Azaria until a fourth coronial inquest was completed in
2012.29 The legacy of the Chamberlain saga is arguably that courts have become
more willing to scrutinise forensic evidence, that forensic experts have improved
25 J. Silvester. (2008). Ross cleared of murder nearly 90 years ago. The Age. Retrieved
from http://www.theage.com.au/news/national/bcrimeb-man-cleared-of-murder-86-
years-after-he-was-executed/2008/05/26/1211653938453.html
26 Re Alice Lynne Chamberlain and Michael Leigh Chamberlain v R (1983) 72 FLR 1; Cham-
berlain v R (No. 2) (1984) 153 CLR 521. The High Court appeal failed with a 3:2
majority upholding the conviction.
27 Report of the Commissioner the Hon. Mr. Justice T.R. Morling / Royal Commission of Inquiry
into Chamberlain Convictions (1987), 340–1, cited by Warren (2009).
28 Reference under s.433A of the Criminal Code by the Attorney-General for the Northern Terri-
tory of Australia of Convictions of Alice Lynne Chamberlain and Michael Leigh Chamberlain
[1988] NTSC 64 (15 September 1988). Both Lindy and Michael Chamberlain were
pardoned in 1987, though this did not legally overturn the convictions.
29 Inquest into the death of Azaria Chantel Loren Chamberlain [2012] NTMC 020. The third
inquest, held in 1995, had returned an open finding.
their processes and now accept that they must act impartially in assisting the court
rather than the prosecution, and that both the judicial system and extra-judicial
means of review are more willing to re-examine past cases to identify and
correct miscarriages of justice.
Edward Splatt
Edward Splatt was convicted of murder in 1978 and spent six and a half years in
prison before being released on the recommendation of a Royal Commission,
which was followed by an ex gratia payment of $300,000. The case against him was
circumstantial and largely based on scientific analysis of paint, wood, birdseed and
biscuit particles collected at the crime scene. Upon reviewing the case, the Royal
Commissioner concluded that it would be ‘unjust and dangerous for the verdict to
stand’ (ALRC, 1985; Dioso-Villa, 2014). The main reasons for this conclusion
were that the investigation and forensic analysis were conducted by the same police
officers, so that there was a lack of scientific objectivity and a reluctance to consider
exculpatory rather than incriminating interpretations of the evidence.30 Following
this case, and the Chamberlain case in which South Australian forensic technicians
were also involved, forensic procedures were significantly reviewed and reformed.
In particular, expert guidelines now emphasise that:31
The role of the expert witness is to provide relevant and impartial evidence in
his or her area of expertise. An expert should never mislead the Court or
become an advocate for the cause of the party that has retained the expert.
Alexander McLeod-Lindsay
Alexander McLeod-Lindsay came home from his work one day in 1964 to find his
wife and son severely beaten. Both survived, and the wife described the attacker.
However, police suspected McLeod-Lindsay, and developed a theory that he had
slipped away from the hotel and returned there unnoticed after attacking his
family. Blood on his jacket was said to be ‘impact splatter’ that was deposited
during the attack. McLeod-Lindsay was convicted of attempted murder and served
almost ten years in prison before being released. Although he appealed to higher courts
for review, the convictions stood, despite expert scientists arguing that the blood
on the jacket displayed clotting and was therefore most likely deposited when
30 B. Littley. (2012). Someone got away with murder. Adelaide Advertiser, 27 January.
31 See, for example, Federal Court of Australia, Expert Evidence Practice Note (GPN-EXPT),
25 October 2016.
McLeod-Lindsay held his wife in his arms upon coming home to the horrific
scene.32 It was not until a further inquiry in 1990 that a final exoneration and
compensation were awarded by the state.33
Frank Button
The leading example in Australia of a DNA-based exoneration is the case of Frank
Button, in which the Queensland Court of Appeal quashed the defendant’s rape
conviction when presented with post-trial DNA analysis indicating that someone
other than Button was the rapist. The lead judgment stated:34
As I said in the course of argument, today is a black day in the history of the
administration of criminal justice in Queensland. The appellant was convicted
of rape by a jury and has spent approximately 10 months in custody in
consequence of that conviction. DNA testing carried out at the insistence of
his lawyers after that jury verdict has now established that he was not the
perpetrator of the crime in question, and indeed the recent DNA testing
would appear to have identified some other person as the perpetrator of that
crime. What is of major concern to this Court is the fact that that evidence
was not available at the trial.
What is disturbing is that the investigating authorities had also taken pos-
session of bedding from the bed on which the offence occurred, and delivered
those exhibits to the John Tonge Centre. No testing of that bedding was
carried out prior to trial. The explanation given was that it would not be of
material assistance in identifying the appellant as the perpetrator of the crime.
32 Report of the Inquiry held under Section 475 of the Crimes Act 1900 into the Conviction of
Alexander McLeod-Lindsay, 1969.
33 M. Brown, ‘Exonerated 26 years after his conviction’ (Sydney Morning Herald, 21 Sep-
tember 2009), written on the death of Alexander McLeod-Lindsay two days earlier.
34 R v Button [2001] QCA 133 (10 April 2001), (Williams JA, White and Holmes JJ con-
curring), perhaps Australia’s only DNA-based exoneration appeal (Roach 2015). The
judge’s words were adopted by an Australian Broadcasting Corporation documentary
about the case, ‘A Black Day for Justice’ (see discussion in Chapter 3 and in Smith
2015).
This case illustrates that forensic science, including biometrics, can only be of
consistent and reliable assistance in criminal proceedings if analysis is conducted in a
comprehensive and scientifically robust manner. The risk otherwise is that mis-
carriages of justice will occur, and they may not always be amenable to remedial
justice through criminal appeals or other forms of post-conviction review.35
Validation of scientific techniques: While some new areas such as DNA identifi-
cation have been reasonably well validated through court cases assessing their
scientific basis, this is less true for newer techniques such as facial or body
mapping;
Standard protocols: Not all types of biometric identification operate according to
clear and agreed processes for the collection and analysis of material; for example,
voice identification may be based on ad hoc expertise rather than a standard approach
across cases;
Inaccuracy and bias: Conclusions that appear to be based on scientific analysis
may not disclose matters affecting their accuracy, the language used may be
highly technical without adding to the accuracy of the analysis, and sample
biases may not be disclosed where they are known.
35 Not discussed here are other noteworthy miscarriage of justice cases involving forensics,
such as those involving John Button and Andrew Mallard in Western Australia, and
Gordon Wood in New South Wales, as these cases did not rely on biometrics as the
principal means of identification relied on by the prosecution.
First, the admissibility rules relating to relevance, opinion and discretion are
open to interpretations permitting the rigorous consideration of forensic evi-
dence, to ensure that it is based on theoretical and/or empirical grounds and
that it is expressed transparently in a way that enables the trier of fact, with
appropriate directions from the trial judge, to take it rationally into account
when considering the criminal standard of proof.
Secondly, standards governing appellate review (including post-conviction
review) are open to interpretations that could ensure that forensic evidence is
carefully scrutinised on appeal, not only to determine its admissibility and use
but also in determining whether the criminal standard of proof has been
satisfied. Thirdly, the adversary process may be limited by time and resources
but it undoubtedly has the potential to provide a powerful scrutiny of forensic
evidence.
And finally, as far as the common lack of scientific expertise among the
judges and lawyers who must try to comprehend and evaluate forensic evi-
dence is concerned, one might argue that in many cases it is not necessary for
laypersons (judges and juries) to follow all the technicalities of a forensic pro-
cess and it is enough to appreciate the possibilities of error in determining
admissibility and proof. It is only where the very basis of scientific evidence is
being disputed that persons with a background in that area of science may be
required to adjudicate the dispute.
This suggests that it is within the capacity of the legal system, assisted by the for-
ensic sciences, to make the best use of biometrics in the courtroom, in criminal
trials and appeals, and in other forms of post-conviction review.
References
Auld, R. (2001). Review of the Criminal Courts of England and Wales. London: UK Stationery
Office.
Australian Capital Territory Department of Justice and Community Safety (ACT JACS).
(2015). Double Jeopardy Information Paper. Retrieved from http://www.justice.act.gov.au/
review/view/38/title/double-jeopardy-information-paper
Australian Law Reform Commission (ALRC). (2003). Essentially Yours: The Protection of
Human Genetic Information in Australia Report 96. Retrieved from http://www.austlii.edu.
au/au/other/lawreform/ALRC/2003/96.html
Australian Law Reform Commission (ALRC). (1985). Compensation for imprisonment.
Australian Law Reform Commission Reform Journal 39, 105.